Science.gov

Sample records for practical analysis method

  1. Analysing clinical practice guidelines. A method of documentary analysis.

    PubMed

    Appleton, J V; Cowley, S

    1997-05-01

    This paper will describe a method of documentary analysis used in a study examining the validity of clinical guidelines issued to health visitors to assist them in identifying families requiring increased health visitor support. This forms the preliminary work for a wider study examining how health visitors decide to increase support to vulnerable families. Although a number of published research texts discuss the value of records and documents as important data sources for health service researchers, there is relatively little information available about the processes of documentary analysis. This paper offers one method for analysing clinical practice guidelines: it describes the development of a critique and analysis tool and explores the strengths and weaknesses of this particular analysis instrument.

  2. A practical method for the analysis of meteor spectra

    NASA Astrophysics Data System (ADS)

    Dubs, Martin; Schlatter, Peter

    2015-08-01

    The analysis of meteor spectra (photographic, CCD or video recording) is complicated by the fact that spectra obtained with objective gratings are curved and have a nonlinear dispersion. In this paper it is shown that with a simple image transformation the spectra can be linearized in such a way that individual spectra over the whole image plane are parallel and have a constant, linear dispersion. This simplifies the identification and measurement of meteor spectral lines. A practical method is given to determine the required image transformation.
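    The linearization described here amounts to resampling each image row onto a common, equally spaced wavelength grid. A minimal sketch follows; the dispersion model, array sizes, and names are illustrative assumptions, not the paper's actual transformation:

```python
import numpy as np

def linearize_spectrum(image, wavelength_of_pixel, wl_grid):
    """Resample each row of a spectral image onto one linear wavelength grid.

    image: 2-D array (rows x columns) of recorded intensities.
    wavelength_of_pixel: callable (row, x) -> per-pixel wavelengths, standing
        in for the curved, nonlinear grating dispersion.
    wl_grid: 1-D array of equally spaced target wavelengths.
    """
    rows, cols = image.shape
    out = np.empty((rows, wl_grid.size))
    x = np.arange(cols)
    for r in range(rows):
        wl = wavelength_of_pixel(r, x)             # nonlinear dispersion, this row
        out[r] = np.interp(wl_grid, wl, image[r])  # resample to linear dispersion
    return out

# Toy dispersion whose slope drifts slightly with row index (assumed model).
def toy_dispersion(row, x):
    return 400.0 + 0.5 * x + 1e-4 * row * x        # wavelength in nm

img = np.random.default_rng(0).random((8, 100))
grid = np.linspace(400.0, 449.0, 99)
lin = linearize_spectrum(img, toy_dispersion, grid)  # rows now share one dispersion
```

    After such a remapping, a given spectral line falls in the same column in every row, which is what makes line identification and measurement straightforward.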

  3. A Practical Method of Policy Analysis by Simulating Policy Options

    ERIC Educational Resources Information Center

    Phelps, James L.

    2011-01-01

    This article focuses on a method of policy analysis that has evolved from the previous articles in this issue. The first section, "Toward a Theory of Educational Production," identifies concepts from science and achievement production to be incorporated into this policy analysis method. Building on Kuhn's (1970) discussion regarding paradigms, the…

  4. Practical Use of Computationally Frugal Model Analysis Methods

    SciTech Connect

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require tens of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.
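    A minimal sketch of one such frugal method is one-at-a-time local sensitivity analysis, which needs only len(params) + 1 model runs, all parallelizable. The scaling below is a simplified assumption, not the exact composite scaled sensitivity of the paper:

```python
import numpy as np

def scaled_sensitivities(model, params, rel_step=0.01):
    """Local sensitivity analysis with len(params) + 1 parallelizable runs.

    model: callable mapping a parameter vector to simulated observations.
    Returns dss[i, j] ~ (dy_i/dp_j) * p_j (dimensionless scaled sensitivity)
    and a per-parameter composite summary css[j] (simplified scaling).
    """
    params = np.asarray(params, float)
    y0 = np.asarray(model(params), float)
    dss = np.empty((y0.size, params.size))
    for j in range(params.size):
        p = params.copy()
        dp = rel_step * p[j] if p[j] != 0.0 else rel_step
        p[j] += dp
        dss[:, j] = (np.asarray(model(p), float) - y0) / dp * params[j]
    css = np.sqrt((dss ** 2).mean(axis=0))  # which parameters the data inform
    return dss, css
```

    A css near zero flags a parameter the available observations cannot constrain, a cheap diagnostic to run before committing to a 10,000-run global analysis.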

  5. Practical Use of Computationally Frugal Model Analysis Methods

    DOE PAGESBeta

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require tens of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.

  6. Correcting working postures in industry: A practical method for analysis.

    PubMed

    Karhu, O; Kansi, P; Kuorinka, I

    1977-12-01

    A practical method for identifying and evaluating poor working postures, i.e., the Ovako Working Posture Analysing System (OWAS), is presented. The method consists of two parts. The first is an observational technique for evaluating working postures. It can be used by work-study engineers in their daily routine and it gives reliable results after a short training period. The second part of the method is a set of criteria for the redesign of working methods and places. The criteria are based on evaluations made by experienced workers and ergonomics experts. They take into consideration factors such as health and safety, but the main emphasis is placed on the discomfort caused by the working postures. The method has been extensively used in the steel company that participated in its development. Complete production lines have already been redesigned on the basis of information gathered from OWAS, the result being more comfortable workplaces as well as a positive effect on production quality.
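    In code, the OWAS idea of mapping an observed posture code to a corrective-action category might look like the sketch below; the scoring rule is an illustrative heuristic, not the official OWAS lookup tables:

```python
# Illustrative OWAS-style classifier. Real OWAS assigns one of four action
# categories from published lookup tables; this heuristic only mimics that
# structure with a normalized strain score.
def action_category(back, arms, legs, load):
    """Map a posture code to an action category 1 (no action) .. 4 (immediate).

    back 1..4, arms 1..3, legs 1..7, load 1..3, as in OWAS-style coding.
    """
    if not (1 <= back <= 4 and 1 <= arms <= 3 and 1 <= legs <= 7 and 1 <= load <= 3):
        raise ValueError("posture code out of range")
    strain = (back - 1) / 3 + (arms - 1) / 2 + min(legs - 1, 3) / 3 + (load - 1) / 2
    return 1 + min(3, int(strain))
```

    An observer logs one such four-part code per work-sampling observation; the distribution of action categories then points to the tasks whose workplaces should be redesigned first.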

  7. A Practical Method of Policy Analysis by Estimating Effect Size

    ERIC Educational Resources Information Center

    Phelps, James L.

    2011-01-01

    The previous articles on class size and other productivity research paint a complex and confusing picture of the relationship between policy variables and student achievement. Missing is a conceptual scheme capable of combining the seemingly unrelated research and dissimilar estimates of effect size into a unified structure for policy analysis and…

  8. Methods and practices used in incident analysis in the Finnish nuclear power industry.

    PubMed

    Suksi, Seija

    2004-07-26

    A study of the incident analysis methods and practices used by the Finnish nuclear power plant operators Teollisuuden Voima Oy (TVO) and Fortum Power and Heat Oy (Fortum) was carried out by the Technical Research Centre of Finland (VTT) at the request of STUK at the end of the 1990s. The study aimed at providing a broad overview and suggestions for improvement of the whole organisational framework to support event investigation practices at the regulatory body and at the utilities. The main objective of the research was to evaluate the adequacy and reliability of event investigation analysis methods and practices in the Finnish nuclear power industry and, based on the results, to develop them further. The results and suggestions of the research are reviewed in the paper, and the corrective actions implemented in event investigation and operating experience procedures both at STUK and at the utilities are discussed as well. STUK has developed its own procedure for the risk-informed analysis of nuclear power plant events. The PSA-based event analysis method is used to assess the safety significance and importance measures associated with the unavailability of components and systems subject to Technical Specifications. The insights from recently performed PSA-based analyses are also briefly discussed in the paper.

  9. Practical Methods for the Analysis of Voltage Collapse in Electric Power Systems: a Stationary Bifurcations Viewpoint.

    NASA Astrophysics Data System (ADS)

    Jean-Jumeau, Rene

    1993-03-01

    Voltage collapse (VC) is generally caused by either of two types of system disturbances: load variations and contingencies. In this thesis, we study VC resulting from load variations; this is termed static voltage collapse. The thesis approaches this type of voltage collapse in electric power systems from a stationary bifurcations viewpoint, associating it with the occurrence of saddle-node bifurcations (SNB) in the system. Approximate models are generically used in most VC analyses. We consider the validity of these models for the study of SNB and, thus, of voltage collapse. We justify the use of the saddle-node bifurcation as a model for VC in power systems. In particular, we prove that this leads to the definition of a model and, since load demand is used as a parameter for that model, of a mode of parameterization of that model in order to represent actual power demand variations within the power system network. Ill-conditioning of the set of nonlinear equations defining a dynamical system is a generic occurrence near the SNB point. We suggest a reparameterization of the set of nonlinear equations that allows this problem to be avoided. A new indicator of the proximity of voltage collapse, the voltage collapse index (VCI), is developed. A new (n + 1)-dimensional set of characteristic equations for the computation of the exact SNB point, replacing the standard (2n + 1)-dimensional one, is presented for general parameter-dependent nonlinear dynamical systems. These results are then applied to electric power systems for the analysis and prediction of voltage collapse. The new methods offer the potential of faster computation and greater flexibility. For reasons of theoretical development and clarity, the preceding methodologies are developed under the assumption of the absence of constraints on the system parameters and states, and the full differentiability of the functions defining the power system model. In the latter part of this thesis, we relax these
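    The core numerical idea, that an equilibrium approaching a saddle-node bifurcation makes the system Jacobian singular, can be sketched with the smallest singular value as a crude proximity indicator. This is illustrative only; the thesis's VCI is a related but distinct quantity:

```python
import numpy as np

def sn_proximity(jacobian):
    """Smallest singular value of the system Jacobian at an equilibrium.

    It tends to zero as the equilibrium approaches a saddle-node
    bifurcation, so it can serve as a simple collapse-proximity indicator
    (a crude stand-in, not the thesis's voltage collapse index).
    """
    return np.linalg.svd(np.atleast_2d(jacobian), compute_uv=False)[-1]

# Toy system x' = mu - x**2: stable equilibrium at x = sqrt(mu);
# the two equilibria merge in a saddle-node bifurcation at mu = 0.
indices = []
for mu in (1.0, 0.25, 0.01):
    x_eq = np.sqrt(mu)
    J = np.array([[-2.0 * x_eq]])    # d/dx (mu - x**2) at the equilibrium
    indices.append(sn_proximity(J))  # shrinks toward 0 as mu -> 0
```

    In a power system, mu plays the role of the load-demand parameter: as demand grows toward the collapse point, the indicator falls toward zero.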

  10. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory--Determination of Trihalomethane Formation Potential, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Fram, Miranda S.; Bush, Noel

    2004-01-01

    An analytical method for the determination of the trihalomethane formation potential of water samples has been developed. The trihalomethane formation potential is measured by dosing samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine, and then analyzing the resulting trihalomethanes by purge and trap/gas chromatography equipped with an electron capture detector. Detailed explanations of the method and quality-control practices are provided. Method validation experiments showed that the trihalomethane formation potential varies as a function of time between sample collection and analysis, residual-free chlorine concentration, method of sample dilution, and the concentration of bromide in the sample.

  11. ISO 14 001 at the farm level: analysis of five methods for evaluating the environmental impact of agricultural practices.

    PubMed

    Galan, M B; Peschard, D; Boizard, H

    2007-02-01

    Faced with society's increasing expectations, the Common Agricultural Policy (CAP) review considers environmental management to be an ever more critical criterion in the allocation of farm subsidies. With the goal of evaluating the environmental friendliness of farm practices, France's agricultural research and extension services have built a range of agricultural/environmental diagnostic tools over recent years. The objective of the present paper is to compare the five tools most frequently used in France: IDEA, DIAGE, DIALECTE, DIALOGUE and INDIGO. All the tools have the same purpose: evaluation of the impact of farm practices on the environment via indicators and monitoring of farm management practices. When tested on a sample of large-scale farms in Picardie, the five tools sometimes produced completely different results: for a given farm, the supposedly most significant environmental impacts depend on the tool used. These results lead to differing environmental management plans and raise the question of the methods' pertinence. An analysis grid of diagnostic tools, aimed at specifying their fields of validity, limits and relevance, was drawn up. The resulting comparative analysis makes it possible to define each tool's domain of validity and suggests lines of thought for developing more relevant tools for (i) evaluating a farm's environmental performance and (ii) helping farmers to develop a plan for improving practices within the framework of an environmental management system.

  12. Animal Disease Import Risk Analysis--a Review of Current Methods and Practice.

    PubMed

    Peeler, E J; Reese, R A; Thrush, M A

    2015-10-01

    The application of risk analysis to the spread of disease with international trade in animals and their products, that is, import risk analysis (IRA), has been largely driven by the Sanitary and Phytosanitary (SPS) agreement of the World Trade Organization (WTO). The degree to which the IRA standard established by the World Organization for Animal Health (OIE), and associated guidance, meets the needs of the SPS agreement is discussed. The use of scenario trees is the core modelling approach used to represent the steps necessary for the hazard to occur. There is scope to elaborate scenario trees for commodity IRA so that the quantity of hazard at each step is assessed, which is crucial to the likelihood of establishment. The dependence between exposure and establishment suggests that they should fall within the same subcomponent. IRA undertaken for trade reasons must include an assessment of consequences to meet SPS criteria, but guidance is sparse. The integration of epidemiological and economic modelling may open a path for better methods. Matrices have been used in qualitative IRA to combine estimates of entry and exposure, and consequences with likelihood, but this approach has flaws and better methods are needed. OIE IRA standards and guidance indicate that the volume of trade should be taken into account, but offer no detail. Some published qualitative IRAs have assumed current levels and patterns of trade without specifying the volume of trade, which constrains the use of IRA to determine mitigation measures (to reduce risk to an acceptable level) and whether the principle of equivalence, fundamental to the SPS agreement, has been observed. It is questionable whether qualitative IRA can meet all the criteria set out in the SPS agreement. Nevertheless, scope exists to elaborate the current standards and guidance, so they better serve the principle of science-based decision-making.
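    The scenario-tree calculation at the heart of quantitative IRA reduces to multiplying step probabilities along a pathway and aggregating over trade volume. A deliberately minimal sketch with invented numbers; a real IRA would also model quantities of hazard, dependence between steps, and consequences:

```python
# Minimal scenario-tree arithmetic for import risk analysis (illustrative).
def path_probability(steps):
    """Probability that every step on one release/exposure pathway occurs."""
    p = 1.0
    for step in steps:
        p *= step
    return p

def annual_risk(p_per_unit, n_units):
    """Probability of at least one introduction across n independent units."""
    return 1.0 - (1.0 - p_per_unit) ** n_units

# e.g. unit is infected, hazard survives processing, a susceptible host is exposed
p_unit = path_probability([0.02, 0.5, 0.1])
risk = annual_risk(p_unit, 10_000)  # assumed 10,000 units imported per year
```

    The second function also shows why the OIE guidance on trade volume matters: the same per-unit probability gives a very different annual risk at 100 units than at 10,000.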

  13. Statistical Methods for the Analysis of Discrete Choice Experiments: A Report of the ISPOR Conjoint Analysis Good Research Practices Task Force.

    PubMed

    Hauber, A Brett; González, Juan Marcos; Groothuis-Oudshoorn, Catharina G M; Prior, Thomas; Marshall, Deborah A; Cunningham, Charles; IJzerman, Maarten J; Bridges, John F P

    2016-06-01

    Conjoint analysis is a stated-preference survey method that can be used to elicit responses that reveal preferences, priorities, and the relative importance of individual features associated with health care interventions or services. Conjoint analysis methods, particularly discrete choice experiments (DCEs), have been increasingly used to quantify preferences of patients, caregivers, physicians, and other stakeholders. Recent consensus-based guidance on good research practices, including two recent task force reports from the International Society for Pharmacoeconomics and Outcomes Research, has aided in improving the quality of conjoint analyses and DCEs in outcomes research. Nevertheless, uncertainty regarding good research practices for the statistical analysis of data from DCEs persists. There are multiple methods for analyzing DCE data. Understanding the characteristics and appropriate use of different analysis methods is critical to conducting a well-designed DCE study. This report will assist researchers in evaluating and selecting among alternative approaches to conducting statistical analysis of DCE data. We first present a simplistic DCE example and a simple method for using the resulting data. We then present a pedagogical example of a DCE and one of the most common approaches to analyzing data from such a question format: conditional logit. We then describe some common alternative methods for analyzing these data and the strengths and weaknesses of each alternative. We present the ESTIMATE checklist, which includes a list of questions to consider when justifying the choice of analysis method, describing the analysis, and interpreting the results.
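    The conditional-logit analysis the report treats as the baseline can be sketched as a log-likelihood over choice sets; the names and array shapes below are assumptions for illustration:

```python
import numpy as np

def conditional_logit_loglik(beta, X, chosen):
    """Log-likelihood of the conditional (McFadden) logit model for DCE data.

    X: array (choice_sets, alternatives, attributes) of attribute levels.
    chosen: index of the alternative picked in each choice set.
    Within each set, P(j) = exp(x_j . beta) / sum_k exp(x_k . beta).
    """
    v = X @ beta                          # utilities, shape (sets, alts)
    v = v - v.max(axis=1, keepdims=True)  # stabilize the exponentials
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return logp[np.arange(len(chosen)), chosen].sum()
```

    Maximizing this over beta (for example, by minimizing its negative with a general-purpose optimizer) yields the attribute weights; the report's alternatives, such as mixed logit and latent-class models, relax the independence assumptions this baseline imposes.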

  15. Empowering Discourse: Discourse Analysis as Method and Practice in the Sociology Classroom

    ERIC Educational Resources Information Center

    Hjelm, Titus

    2013-01-01

    Collaborative learning and critical pedagogy are widely recognized as "empowering" pedagogies for higher education. Yet, the practical implementation of both has a mixed record. The question, then, is: How could collaborative and critical pedagogies be empowered themselves? This paper makes a primarily theoretical case for discourse…

  16. APA's Learning Objectives for Research Methods and Statistics in Practice: A Multimethod Analysis

    ERIC Educational Resources Information Center

    Tomcho, Thomas J.; Rice, Diana; Foels, Rob; Folmsbee, Leah; Vladescu, Jason; Lissman, Rachel; Matulewicz, Ryan; Bopp, Kara

    2009-01-01

    Research methods and statistics courses constitute a core undergraduate psychology requirement. We analyzed course syllabi and faculty self-reported coverage of both research methods and statistics course learning objectives to assess the concordance with APA's learning objectives (American Psychological Association, 2007). We obtained a sample of…

  17. Practical Thermal Evaluation Methods For HAC Fire Analysis In Type B Radioactive Material (RAM) Packages

    SciTech Connect

    Abramczyk, Glenn; Hensel, Stephen J; Gupta, Narendra K.

    2013-03-28

    Title 10 of the United States Code of Federal Regulations Part 71 for the Nuclear Regulatory Commission (10 CFR 71.73) requires that Type B radioactive material (RAM) packages satisfy certain Hypothetical Accident Conditions (HAC) thermal design requirements to ensure package safety during accidental fire conditions. Compliance with thermal design requirements can be met by prototype tests, by analyses only, or by a combination of tests and analyses. Normally, it is impractical to meet all the HAC requirements using tests only, and purely analytical methods are too complex due to the multi-physics, non-linear nature of the fire event. Therefore, a combination of tests and thermal analyses using commercial heat transfer software is used to meet the necessary design requirements. The authors, along with their colleagues at Savannah River National Laboratory in Aiken, SC, USA, have successfully used this 'tests and analyses' approach in the design and certification of several United States DOE/NNSA certified packages, e.g. 9975, 9977, 9978, 9979, H1700, and the Bulk Tritium Shipping Package (BTSP). This paper describes these methods in the hope that Type B RAM package designers and analysts can use them in their own applications.
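    As a rough feel for the screening end of such thermal analyses, a lumped-capacitance heat-up under the regulatory fire (800 degrees C for 30 minutes per 10 CFR 71.73) can be sketched. All parameter values below are assumptions; a licensing analysis needs radiation exchange, internal gradients, and validated software:

```python
def fire_transient(T0, T_fire, h, area, mass, cp, t_end, dt=1.0):
    """Explicit-Euler lumped-capacitance package temperature during a fire.

    Integrates dT/dt = h*A*(T_fire - T) / (m*cp). A crude screening model:
    one effective heat-transfer coefficient h stands in for convection plus
    radiation, and internal temperature gradients are ignored.
    """
    T, t = T0, 0.0
    while t < t_end:
        T += dt * h * area * (T_fire - T) / (mass * cp)
        t += dt
    return T

# Assumed numbers: 500 kg package, cp = 500 J/(kg K), 2 m^2 surface,
# h = 10 W/(m^2 K), in the 800 C / 30 min regulatory fire.
T_end = fire_transient(20.0, 800.0, 10.0, 2.0, 500.0, 500.0, 1800.0)
```

    A screening model like this only bounds the problem; it is the kind of hand check an analyst might run before setting up the full heat-transfer simulation.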

  18. Methods of cognitive analysis to support the design and evaluation of biomedical systems: the case of clinical practice guidelines.

    PubMed

    Patel, V L; Arocha, J F; Diermeier, M; Greenes, R A; Shortliffe, E H

    2001-02-01

    This article provides a theoretical and methodological framework for the use of cognitive analysis to support the representation of biomedical knowledge and the design of clinical systems, using clinical-practice guidelines (CPGs) as an example. We propose that propositional and semantic analyses, when used as part of the system-development process, can improve the validity, usability, and comprehension of the resulting biomedical applications. The framework we propose is based on a large body of research on the study of how people mentally represent information and subsequently use it for problem solving. This research encompasses many areas of psychology, but the more important ones are the study of memory and the study of comprehension. Of particular relevance is research devoted to investigating the comprehension and memory of language, expressed verbally or in text. In addition, research on how contextual variables affect performance is informative because these psychological processes are influenced by situational variables (e.g., setting, culture). One important factor limiting the acceptance and use of clinical-practice guidelines (CPGs) may be the mismatch between a guideline's recommended actions and the physician-user's mental models of what seems appropriate in a given case. Furthermore, CPGs can be semantically complex, often composed of elaborate collections of prescribed procedures with logical gaps or contradictions that can promote ambiguity and hence frustration on the part of those who attempt to use them. An improved understanding of the semantics and structure of CPGs may help to improve such matching, and ultimately the comprehensibility and usability of CPGs. Cognitive methods of analysis can help guideline designers and system builders throughout the development process, from the conceptual design of a computer-based system to its implementation phases. 
By studying how guideline creators and developers represent guidelines, both mentally and

  19. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: practical guidelines for the crystalline sponge method

    SciTech Connect

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng

    2015-01-01

    This report describes complete practical guidelines and insights for the crystalline sponge method, which have been derived through the first use of synchrotron radiation on these systems, and includes a procedure for faster synthesis of the sponges. These guidelines will be applicable to crystal sponge data collected at synchrotrons or in-house facilities, and will allow researchers to obtain reliable high-quality data and construct chemically and physically sensible models for guest structural determination. A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. 
In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine

  20. Psychiatrists' follow-up of identified metabolic risk: a mixed-method analysis of outcomes and influences on practice

    PubMed Central

    Patterson, Sue; Freshwater, Kathleen; Goulter, Nicole; Ewing, Julie; Leamon, Boyd; Choudhary, Anand; Moudgil, Vikas; Emmerson, Brett

    2016-01-01

    Aims and method: To describe and explain psychiatrists' responses to metabolic abnormalities identified during screening. We carried out an audit of clinical records to assess rates of monitoring and follow-up practice. Semi-structured interviews with 36 psychiatrists followed by descriptive and thematic analyses were conducted. Results: Metabolic abnormalities were identified in 76% of eligible patients screened. Follow-up, recorded for 59%, was variable but more likely with four or more abnormalities. Psychiatrists endorse guidelines, but ambivalence about responsibility, professional norms, resource constraints and skills deficits, as well as patient factors, influences practice. Therapeutic optimism and a desire to be a ‘good doctor’ supported comprehensive follow-up. Clinical implications: Psychiatrists are willing to attend to physical healthcare, and obstacles to recommended practice are surmountable. Psychiatrists seek consensus among stakeholders about responsibilities and a systemic approach addressing the social determinants of health inequities. Understanding patients' expectations is critical to promoting best practice. PMID:27752343

  1. Qualitative Approaches to Mixed Methods Practice

    ERIC Educational Resources Information Center

    Hesse-Biber, Sharlene

    2010-01-01

    This article discusses how methodological practices can shape and limit how mixed methods is practiced and makes visible the current methodological assumptions embedded in mixed methods practice that can shut down a range of social inquiry. The article argues that there is a "methodological orthodoxy" in how mixed methods is practiced that…

  2. BOOK REVIEW: Vortex Methods: Theory and Practice

    NASA Astrophysics Data System (ADS)

    Cottet, G.-H.; Koumoutsakos, P. D.

    2001-03-01

    The book Vortex Methods: Theory and Practice presents a comprehensive account of the numerical technique for solving fluid flow problems. It provides a very nice balance between the theoretical development and analysis of the various techniques and their practical implementation. In fact, the presentation of the rigorous mathematical analysis of these methods instills confidence in their implementation. The book goes into some detail on the more recent developments that attempt to account for viscous effects, in particular the presence of viscous boundary layers in some flows of interest. The presentation is very readable, with most points illustrated with well-chosen examples, some quite sophisticated. It is a very worthy reference book that should appeal to a large body of readers, from those interested in the mathematical analysis of the methods to practitioners of computational fluid dynamics. The use of the book as a text is compromised by its lack of exercises for students, but it could form the basis of a graduate special topics course. Juan Lopez
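    For readers new to the subject, the kernel evaluation at the heart of the vortex methods the book covers can be sketched as a regularized 2-D Biot-Savart sum over vortex blobs. This is a minimal N-body version under assumed parameters, not the book's full scheme:

```python
import numpy as np

def biot_savart_2d(positions, strengths, delta=0.05):
    """Velocities induced on 2-D regularized point vortices (vortex blobs).

    u_i = sum_j Gamma_j/(2*pi) * (-r_y, r_x) / (|r|^2 + delta^2),
    with r = x_i - x_j. The delta^2 term regularizes the kernel (the i = j
    numerator is zero, so self-interaction vanishes).
    """
    pos = np.asarray(positions, float)
    gam = np.asarray(strengths, float)
    r = pos[:, None, :] - pos[None, :, :]  # pairwise separations
    r2 = (r ** 2).sum(axis=-1) + delta ** 2
    kernel = np.stack([-r[..., 1], r[..., 0]], axis=-1) / r2[..., None]
    return (gam[None, :, None] * kernel).sum(axis=1) / (2.0 * np.pi)

# Two equal co-rotating vortices spin counterclockwise about their midpoint.
u = biot_savart_2d([(-1.0, 0.0), (1.0, 0.0)], [1.0, 1.0])
```

    A full vortex method advances the particle positions with these velocities and adds diffusion and boundary-layer treatments, which is where much of the book's analysis lies.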

  3. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory--Determination of Dissolved Organic Carbon in Water by High Temperature Catalytic Oxidation, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Bird, Susan M.; Fram, Miranda S.; Crepeau, Kathryn L.

    2003-01-01

    An analytical method has been developed for the determination of dissolved organic carbon concentration in water samples. This report includes the results of the tests used to validate the method and describes the quality-control practices used for dissolved organic carbon analysis. Prior to analysis, water samples are filtered to remove suspended particulate matter. A Shimadzu TOC-5000A Total Organic Carbon Analyzer in the nonpurgeable organic carbon mode is used to analyze the samples by high temperature catalytic oxidation. The analysis usually is completed within 48 hours of sample collection. The laboratory reporting level is 0.22 milligrams per liter.

  4. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: practical guidelines for the crystalline sponge method.

    PubMed

    Ramadhar, Timothy R; Zheng, Shao Liang; Chen, Yu Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method, based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation, is reported. The procedure for the synthesis of the zinc-based metal-organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, two of which proved unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation with data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation, creating a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process was developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities and will facilitate the collection and analysis of reliable, high-quality data.

  5. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: Practical guidelines for the crystalline sponge method

    DOE PAGES Beta

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method, based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation, is reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, two of which proved unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation with data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation, creating a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process was developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities and will facilitate the collection and analysis of reliable, high-quality data.

  6. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: Practical guidelines for the crystalline sponge method

    SciTech Connect

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method, based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation, is reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, two of which proved unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation with data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation, creating a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process was developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities and will facilitate the collection and analysis of reliable, high-quality data.

  7. Analysis of release kinetics of ocular therapeutics from drug releasing contact lenses: Best methods and practices to advance the field.

    PubMed

    Tieppo, Arianna; Boggs, Aarika C; Pourjavad, Payam; Byrne, Mark E

    2014-08-01

    Several methods have been proposed to achieve an extended and controlled release of ocular therapeutics via contact lenses; however, the experimental conditions used to study the drug release vary greatly and significantly influence the release kinetics. In this paper, we examine variations in the release conditions and their effect on the release of both hydrophilic and hydrophobic drugs (ketotifen fumarate, diclofenac sodium, timolol maleate and dexamethasone) from conventional hydrogel and silicone hydrogel lenses. Drug release was studied under different conditions, varying volume, mixing rates, and temperature. Volume had the biggest effect on the release profile, yet ironically it is the least consistent variable throughout the literature. When a small volume (2-30 mL) was used with no forced mixing and solvent exchange every 24 h, equilibrium was reached well before each solvent exchange, significantly damping the drug release rate and artificially extending the release duration, leading to false conclusions. Using a large volume (200-400 mL) with a 30 rpm mixing rate and no solvent exchange, the release rate and total mass released were significantly increased. In general, the release performed in small volumes with no forced mixing exhibited cumulative mass release amounts 3-12 times less than the cumulative release amounts in large volumes with mixing. Increases in mixing rate and temperature resulted in relatively small increases of 1.4 and 1.2 times, respectively, in fractional mass released. These results strongly demonstrate the necessity of proper and thorough analysis of release data to assure that equilibrium is not affecting release kinetics. This is paramount for comparison of various controlled drug release methods of therapeutic contact lenses, validation of the potential of lenses as an efficient and effective means of drug delivery, as well as increasing the likelihood of only the most promising methods reaching in vivo studies.
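The volume effect reported above can be illustrated with a simple finite-volume mass-balance sketch. This is not a model from the paper: the rate constant `k_per_h` and the partition constant `keq_ml` are invented parameters chosen only to show the qualitative behavior, namely that a lens releasing into a small, well-mixed volume stalls at an equilibrium fraction set by that volume, while a large volume approximates sink conditions.

```python
# Illustrative mass-balance sketch of the volume effect described above.
# NOT from the paper: k_per_h (first-order rate constant, 1/h) and keq_ml
# (lens/medium partition constant, mL) are invented parameters.
import math

def fraction_released(t_h, volume_ml, k_per_h=0.5, keq_ml=50.0):
    """Fraction of the loaded drug released into a well-mixed medium by time t_h.

    A lens releasing into a finite volume V approaches the equilibrium
    fraction V / (V + keq_ml), so a small receiving volume caps cumulative
    release far below 100% even though the kinetics look 'extended'.
    """
    f_eq = volume_ml / (volume_ml + keq_ml)
    return f_eq * (1.0 - math.exp(-k_per_h * t_h))

small = fraction_released(24, 3.0)    # small vial, no forced mixing
large = fraction_released(24, 300.0)  # large volume, near sink conditions
```

With these invented numbers the small-volume run plateaus near 6% released while the large-volume run approaches 86%, qualitatively matching the several-fold differences in cumulative release reported above.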

  8. Methods of Genomic Competency Integration in Practice

    PubMed Central

    Jenkins, Jean; Calzone, Kathleen A.; Caskey, Sarah; Culp, Stacey; Weiner, Marsha; Badzek, Laurie

    2015-01-01

    Purpose Genomics is increasingly relevant to health care, necessitating support for nurses to incorporate genomic competencies into practice. The primary aim of this project was to develop, implement, and evaluate a year-long genomic education intervention that trained, supported, and supervised institutional administrator and educator champion dyads to increase nursing capacity to integrate genomics through assessments of program satisfaction and institutional achieved outcomes. Design Longitudinal study of 23 Magnet Recognition Program® Hospitals (21 intervention, 2 controls) participating in a 1-year new competency integration effort aimed at increasing genomic nursing competency and overcoming barriers to genomics integration in practice. Methods Champion dyads underwent genomic training consisting of one in-person kick-off training meeting followed by monthly education webinars. Champion dyads designed institution-specific action plans detailing objectives, methods or strategies used to engage and educate nursing staff, timeline for implementation, and outcomes achieved. Action plans focused on a minimum of seven genomic priority areas: champion dyad personal development; practice assessment; policy content assessment; staff knowledge needs assessment; staff development; plans for integration; and anticipated obstacles and challenges. Action plans were updated quarterly, outlining progress made as well as inclusion of new methods or strategies. Progress was validated through virtual site visits with the champion dyads and chief nursing officers. Descriptive data were collected on all strategies or methods utilized, and timeline for achievement. Descriptive data were analyzed using content analysis. Findings The complexity of the competency content and the uniqueness of social systems and infrastructure resulted in a significant variation of champion dyad interventions. Conclusions Nursing champions can facilitate change in genomic nursing capacity through

  9. [Practical method for morphometric reliefs].

    PubMed

    Loffredo Sampaolo, C; Sampaolo, G; Gagliardi, P E; Alfano, A

    1981-07-30

    This morphometric method consists of: photographing the organ rudiment explanted in vitro at constant magnification throughout the culture period; overlaying on each printed image a transparent squared reticle of known unit size; and computing the size of the imaged structure by counting the number of reticle squares it covers. PMID:7295411
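The grid-counting arithmetic above can be sketched as follows; only the counting principle comes from the abstract, while the function name and the example numbers are hypothetical.

```python
# Hypothetical sketch of the reticle-counting morphometry described above.
def estimated_area(n_squares, square_side_mm, magnification):
    """Object-scale area from a reticle count.

    n_squares      -- reticle squares covered by the imaged rudiment
    square_side_mm -- side of one reticle square on the print (known unit size)
    magnification  -- constant photographic magnification (print/object ratio)
    """
    printed_area_mm2 = n_squares * square_side_mm ** 2
    # Lengths scale by the magnification, so areas scale by its square.
    return printed_area_mm2 / magnification ** 2

# e.g. 42 squares of 1 mm side photographed at 10x -> 0.42 mm^2 at object scale
area_mm2 = estimated_area(42, 1.0, 10.0)
```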

  10. Intermittent hypoxia training as non-pharmacologic therapy for cardiovascular diseases: Practical analysis on methods and equipment.

    PubMed

    Serebrovskaya, Tatiana V; Xi, Lei

    2016-09-01

    Global industrialization has brought profound lifestyle changes and environmental pollution leading to higher risks of cardiovascular diseases. Such tremendous challenges outweigh the benefits of major advances in pharmacotherapies (such as statins, antihypertensive, antithrombotic drugs) and exacerbate the public healthcare burden. One of the promising complementary non-pharmacologic therapies is so-called intermittent hypoxia training (IHT), which activates the human body's own natural defense through adaptation to intermittent hypoxia. This review article primarily focuses on the practical questions concerning the utilization of IHT as a non-pharmacologic therapy against cardiovascular diseases in humans. Evidence accumulated in the past five decades of research in healthy men and patients has suggested that short-term daily sessions consisting of 3-4 bouts of 5-7 min exposures to 12-10% O2, alternating with normoxic intervals, for 2-3 weeks can result in remarkable beneficial effects in the treatment of cardiovascular diseases such as hypertension, coronary heart disease, and heart failure. Special attention is paid to the therapeutic effects of different IHT models, along with an introduction of the variety of specialized facilities and equipment available for IHT, including hypobaric chambers, hypoxic gas-mixture delivery equipment (rooms, tents, face masks), and portable rebreathing devices. Further clinical trials and thorough evaluations of the risks versus benefits of IHT are much needed to develop a series of standardized and practical guidelines for IHT. Taken together, we can envisage a bright future for IHT to play a more significant role in preventive and complementary medicine against cardiovascular diseases. PMID:27407098

  11. Development and application to clinical practice of a validated HPLC method for the analysis of β-glucocerebrosidase in Gaucher disease.

    PubMed

    Colomer, E Gras; Gómez, M A Martínez; Alvarez, A González; Martí, M Climente; Moreno, P León; Zarzoso, M Fernández; Jiménez-Torres, N V

    2014-03-01

    The main objective of our study is to develop a simple, fast and reliable method for measuring β-glucocerebrosidase activity in the leukocytes of Gaucher patients in clinical practice. This measurement may be a useful marker to drive dose selection and early clinical decision making for enzyme replacement therapy. We measure the enzyme activity by high-performance liquid chromatography with ultraviolet detection and 4-nitrophenyl-β-d-glucopyranoside as substrate. A cohort of eight Gaucher patients treated with enzyme replacement therapy and ten healthy controls were tested; the median enzyme activity was 20.57 mU/ml (interquartile range 19.92-21.53 mU/ml) in patients and the mean was 24.73 mU/ml (24.12-25.34 mU/ml) in the reference group, which allowed the establishment of the normal range of β-glucocerebrosidase activity. The proposed method for measuring leukocyte glucocerebrosidase activity is fast, easy to use, inexpensive and reliable. Furthermore, significant differences between the two populations were observed (p=0.008). This suggests that discerning between patients and healthy individuals and providing an approach to enzyme dosage optimization is feasible. This method could be considered as a decision support tool for clinical monitoring. Our study is a first approach to in-depth analysis of enzyme replacement therapy and optimization of dosing therapies. PMID:24447963
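The reported group comparison (p = 0.008 with n = 8 patients vs. n = 10 controls) is consistent with a rank-based test such as Mann-Whitney, though the abstract does not state which test was used. The sketch below computes the U statistic in pure Python on invented activity values.

```python
# Illustrative only: the activity values below are invented, and the abstract
# does not state which nonparametric test produced p = 0.008.

def mann_whitney_u(xs, ys):
    """Mann-Whitney U statistic for sample xs versus sample ys (ties count 0.5)."""
    u = 0.0
    for x in xs:
        for y in ys:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

patients = [20.1, 20.6, 20.9, 21.2, 19.9, 20.4, 21.5, 20.7]              # mU/ml, hypothetical
controls = [24.3, 24.9, 25.1, 24.6, 25.3, 24.1, 24.8, 25.0, 24.5, 24.7]  # mU/ml, hypothetical

u = mann_whitney_u(patients, controls)  # 0.0: the two invented samples do not overlap
```

Complete separation of the two invented samples gives U = 0; a p-value would then be read from the exact U distribution for the given sample sizes.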

  12. Assessment of management in general practice: validation of a practice visit method.

    PubMed Central

    van den Hombergh, P; Grol, R; van den Hoogen, H J; van den Bosch, W J

    1998-01-01

    BACKGROUND: Practice management (PM) in general practice is as yet ill-defined; a systematic description of its domain, as well as a valid method to assess it, are necessary for research and assessment. AIM: To develop and validate a method to assess PM of general practitioners (GPs) and practices. METHOD: Relevant and potentially discriminating indicators were selected from a systematic framework of 2410 elements of PM to be used in an assessment method (VIP = visit instrument PM). The method was first tested in a pilot study and, after revision, was evaluated in order to select discriminating indicators and to determine validity of dimensions (factor and reliability analysis, linear regression). RESULTS: One hundred and ten GPs were assessed with the practice visit method using 249 indicators; 208 of these discriminated sufficiently at practice level or at GP level. Factor analysis resulted in 34 dimensions and in a taxonomy of PM. Dimensions and indicators showed marked variation between GPs and practices. Training practices scored higher on five dimensions; single-handed and dispensing practices scored lower on delegated tasks, but higher on accessibility and availability. CONCLUSION: A visit method to assess PM has been developed and its validity studied systematically. The taxonomy and dimensions of PM were in line with other classifications. Selection of a balanced number of useful and relevant indicators was nevertheless difficult. The dimensions could discriminate between groups of GPs and practices, establishing the value of the method for assessment. The VIP method could be an important contribution to the introduction of continuous quality improvement in the profession. PMID:10198481

  13. Statistical log analysis made practical

    SciTech Connect

    Mitchell, W.K.; Nelson, R.J.

    1991-06-01

    This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.
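A minimal sketch of the inverse approach the abstract alludes to: model each log reading as a linear mix of end-member responses and recover volume fractions by least squares. All end-member responses and tool readings below are invented for illustration; real statistical log analysis fits many more parameters with nonlinear, error-weighted models.

```python
# Invented two-mineral example of inverse-method log analysis.
def solve_two_mineral(readings, endpoints):
    """Least-squares volume fractions (f1, f2 = 1 - f1) for a two-component rock.

    readings[i]  -- measured value of log tool i
    endpoints[i] -- (response of mineral 1, response of mineral 2) for tool i
    With the unity constraint f2 = 1 - f1, each tool gives
    reading - r2 = f1 * (r1 - r2), a one-parameter least-squares problem.
    """
    num = sum((y - r2) * (r1 - r2) for y, (r1, r2) in zip(readings, endpoints))
    den = sum((r1 - r2) ** 2 for _, (r1, r2) in zip(readings, endpoints))
    f1 = num / den
    return f1, 1.0 - f1

# Hypothetical bulk-density (g/cc) and neutron-porosity readings and endpoints
f1, f2 = solve_two_mineral(
    readings=[2.68, 0.01],
    endpoints=[(2.65, 2.71), (-0.02, 0.00)],
)
```

The one-dimensional case only shows the shape of the inversion; adding tools over-determines the system, which is what makes the statistical (least-squares) treatment attractive.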

  14. Exergy analysis: Principles and practice

    NASA Astrophysics Data System (ADS)

    Moran, M. J.; Sciubba, E.

    1994-04-01

    The importance of the goal of developing systems that effectively use nonrenewable energy resources such as oil, natural gas, and coal is apparent. The method of exergy analysis is well suited for furthering this goal, for it enables the location, type and true magnitude of waste and loss to be determined. Such information can be used to design new systems and to reduce the inefficiency of existing systems. This paper provides a brief survey of both exergy principles and the current literature of exergy analysis with emphasis on areas of application.
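For reference, the quantity such a survey is built on is the standard specific flow exergy relative to a dead state at T_0, p_0. These are textbook definitions, not notation taken from this particular paper:

```latex
% Specific physical flow exergy (kinetic and potential terms neglected):
\[ e = (h - h_0) - T_0\,(s - s_0) \]

% Gouy-Stodola theorem: exergy destroyed equals the dead-state temperature
% times the entropy generated, which is what lets an exergy balance locate
% and size the true losses in a system.
\[ \dot{E}_{\mathrm{d}} = T_0\,\dot{S}_{\mathrm{gen}} \]
```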

  15. Clinical practice is not applied scientific method.

    PubMed

    Cox, K

    1995-08-01

    Practice is often described as applied science, but real life is far too complex and interactive to be handled by analytical scientific methods. The limitations of usefulness of scientific method in clinical practice result from many factors. (1) The complexity of the large number of ill-defined variables at many levels of the problem: scientific method focuses on one variable at a time across a hundred identical animals to extract a single, generalizable 'proof' or piece of 'truth', whereas clinical practice deals with a hundred variables at one time within one animal, from among a clientele of non-identical animals, in order to optimize a mix of outcomes intended to satisfy that particular animal's current needs and desires. (2) Interdependence among the variables: most factors in the illness, the disease, the patient and the setting are interdependent, and cannot be sufficiently isolated to allow their separate study; practice as a human transaction involving at least two people is too complex to be analysed one factor at a time when the interaction stimulates unpredictable responses. (3) Ambiguous data: words have many usages, and people not only assign different interpretations to the same words, they assign different 'meanings', especially according to the threat or hope they may imply; the perceptual data gleaned from physical examination may be difficult to specify exactly or to confirm objectively, and the accuracy and precision of investigational data and their reporting can be low, and are frequently unknown. (4) Differing goals between science and practice: science strives for exact points of propositional knowledge, verifiable by logical argument using objective data and repetition of the experiment. (ABSTRACT TRUNCATED AT 250 WORDS) PMID:7661793

  16. Evaluation of agricultural best-management practices in the Conestoga River headwaters, Pennsylvania; methods of data collection and analysis and description of study areas

    USGS Publications Warehouse

    Chichester, Douglas C.

    1988-01-01

    The U.S. Geological Survey is conducting a water quality study as part of the nationally implemented Rural Clean Water Program in the headwaters of the Conestoga River, Pennsylvania. The study, which began in 1982, was designed to determine the effect of agricultural best management practices on surface- and groundwater quality. The study was concentrated in four areas within the intensively farmed, carbonate rock terrane located predominately in Lancaster County, Pennsylvania. These areas were divided into three monitoring components: (1) a Regional study area (188 sq mi); (2) a Small Watershed study area (5.82 sq mi); and (3) two field-site study areas, Field-Site 1 (22.1 acres) and Field-Site 2 (47.5 acres). The types of water quality data and the methods of data collection and analysis are presented. The monitoring strategy and description of the study areas are discussed. The locations and descriptions for all data collection locations at the four study areas are provided. (USGS)

  17. The Sherlock Holmes method in clinical practice.

    PubMed

    Sopeña, B

    2014-04-01

    This article lists the integral elements of the Sherlock Holmes method, which is based on the intelligent collection of information through detailed observation, careful listening and thorough examination. The information thus obtained is analyzed to develop the main and alternative hypotheses, which are shaped during the deductive process until the key leading to the solution is revealed. The Holmes investigative method applied to clinical practice highlights the advisability of having physicians reason through and seek out the causes of the disease with the data obtained from acute observation, a detailed review of the medical history and careful physical examination.

  18. Selecting Needs Analysis Methods.

    ERIC Educational Resources Information Center

    Newstrom, John W.; Lilyquist, John M.

    1979-01-01

    Presents a contingency model for decision making with regard to needs analysis methods. Focus is on 12 methods with brief discussion of their defining characteristics and some operational guidelines for their use. (JOW)

  19. [Practice marketing. Data analysis of a urological group practice].

    PubMed

    Schneider, T; Schneider, B; Eisenhardt, A; Sperling, H

    2009-07-01

    The urological practice setting in Germany has changed tremendously over the last years. Group practices with two or more urologists working together are becoming more and more popular. At the same time, marketing has become essential even for urologists. To evaluate the patient flow to our group practice, we asked all new patients to fill out a questionnaire (n=2112). We also evaluated the efficacy of our recall system. The analysis showed that patients were 18-93 years old (mean 57 years), 68% being male and 32% female. The largest age group consisted of 41-50-year-olds. The most important reasons for choosing our practice were recommendations by general practitioners in 38%, recommendations by specialists in 11%, and recommendations by friends and relatives in 27%. Five percent of the patients chose the practice because of the Internet home page and 10% because of entries in various phone books. Three percent of the patients came because of newspaper articles about the practice owners, and <1% had attended patient presentations. The Internet was used mainly by 31-40-year-old patients. Our recall system showed an efficacy of 59%. In summary, a good reputation in the medical society as well as in the neighbourhood is still the best advertising for a urological practice. Phone books are increasingly becoming less important, and the Internet is increasingly attractive to the younger population. Recall systems can also be useful for urological practices. PMID:19387608

  20. The casuistic method of practical ethics.

    PubMed

    Spielthenner, Georg

    2016-10-01

    This essay concerns itself with the methodology of practical ethics. There are a variety of methods employed in ethics. Although none have been firmly established as dominant, it is generally agreed that casuistry, or the case-based method, is one important strategy commonly used for resolving ethical issues. Casuists compare the case under consideration to a relevantly similar (analogous) precedent case in which judgements have already been made, and they use these earlier judgements to determine the proper resolution of the present case. In this article, I try to provide a better understanding of the nature of contemporary casuistry. To accomplish this, I explain (in the first section) the basic features of casuistic reasoning. The second section focuses on the logic of casuistry. By assessing casuistic reasoning from the logical viewpoint, casuistry can be reconstructed as a logically correct method of reasoning. The third section looks at casuistry from the epistemic point of view and investigates the justificatory force of casuistic reasoning. Finally, in the fourth section, I show the usefulness of formal casuistry. PMID:27639870

  1. Practical Considerations for Using Exploratory Factor Analysis in Educational Research

    ERIC Educational Resources Information Center

    Beavers, Amy S.; Lounsbury, John W.; Richards, Jennifer K.; Huck, Schuyler W.; Skolits, Gary J.; Esquivel, Shelley L.

    2013-01-01

    The uses and methodology of factor analysis are widely debated and discussed, especially the issues of rotational use, methods of confirmatory factor analysis, and adequate sample size. The variety of perspectives and often conflicting opinions can lead to confusion among researchers about best practices for using factor analysis. The focus of the…

  2. Guidance for using mixed methods design in nursing practice research.

    PubMed

    Chiang-Hanisko, Lenny; Newman, David; Dyess, Susan; Piyakong, Duangporn; Liehr, Patricia

    2016-08-01

    The mixed methods approach purposefully combines both quantitative and qualitative techniques, enabling a multi-faceted understanding of nursing phenomena. The purpose of this article is to introduce three mixed methods designs (parallel; sequential; conversion) and highlight interpretive processes that occur with the synthesis of qualitative and quantitative findings. Real world examples of research studies conducted by the authors will demonstrate the processes leading to the merger of data. The examples include: research questions; data collection procedures and analysis with a focus on synthesizing findings. Based on experience with mixed methods studies, the authors introduce two synthesis patterns (complementary; contrasting), considering application for practice and implications for research. PMID:27397810

  3. An Online Forum As a Qualitative Research Method: Practical Issues

    PubMed Central

    Im, Eun-Ok; Chee, Wonshik

    2008-01-01

    Background Despite positive aspects of online forums as a qualitative research method, very little is known about practical issues involved in using online forums for data collection, especially for a qualitative research project. Objectives The purpose of this paper is to describe the practical issues that the researchers encountered in implementing an online forum as a qualitative component of a larger study on cancer pain experience. Method Throughout the study process, the research staff recorded issues ranging from minor technical problems to serious ethical dilemmas as they arose and wrote memos about them. The memos and written records of discussions were reviewed and analyzed using the content analysis suggested by Weber. Results Two practical issues related to credibility were identified: a high response and retention rate and automatic transcripts. An issue related to dependability was the participants’ easy forgetfulness. The issues related to confirmability were difficulties in theoretical saturation and unstandardized computer and Internet jargon. A security issue related to hacking attempts was noted as well. Discussion The analysis of these issues suggests several implications for future researchers who want to use online forums as a qualitative data collection method. PMID:16849979

  4. Comprehensive rotorcraft analysis methods

    NASA Technical Reports Server (NTRS)

    Stephens, Wendell B.; Austin, Edward E.

    1988-01-01

    The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustic, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).

  5. Spiritual Assessment in Counseling: Methods and Practice

    ERIC Educational Resources Information Center

    Oakes, K. Elizabeth; Raphel, Mary M.

    2008-01-01

    Given the widely expanding professional and empirical support for integrating spirituality into counseling, the authors present a practical discussion for raising counselors' general awareness and skill in the critical area of spiritual assessment. A discussion of rationale, measurement, and clinical practice is provided along with case examples.…

  6. Achieving Integration in Mixed Methods Designs—Principles and Practices

    PubMed Central

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-01-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

  7. System based practice: a concept analysis

    PubMed Central

    YAZDANI, SHAHRAM; HOSSEINI, FAKHROLSADAT; AHMADY, SOLEIMAN

    2016-01-01

    Introduction Systems-Based Practice (SBP) is one of the six competencies introduced by the ACGME for physicians to provide high quality of care and also the most challenging of them in performance, training, and evaluation of medical students. This concept analysis clarifies the concept of SBP by identifying its components to make it possible to differentiate it from other similar concepts. For proper training of SBP and to ensure these competencies in physicians, it is necessary to have an operational definition, and SBP’s components must be precisely defined in order to provide valid and reliable assessment tools. Methods Walker & Avant’s approach to concept analysis was performed in eight stages: choosing a concept, determining the purpose of analysis, identifying all uses of the concept, defining attributes, identifying a model case, identifying borderline, related, and contrary cases, identifying antecedents and consequences, and defining empirical referents. Results Based on the analysis undertaken, the attributes of SBP include knowledge of the system, balanced decisions between patients’ needs and system goals, effective role playing in the interprofessional health care team, system-level health advocacy, and acting for system improvement. System thinking and a functional system are antecedents, and system goals are consequences. A model case, as well as borderline and contrary cases of SBP, has been introduced. Conclusion The identification of SBP attributes in this study contributes to the body of knowledge in SBP and reduces the ambiguity of this concept, making it possible to apply it in training of different medical specialties. Also, it would be possible to develop and use more precise tools to evaluate SBP competency by using the empirical referents of the analysis. PMID:27104198

  8. A collection of research reporting, theoretical analysis, and practical applications in science education: Examining qualitative research methods, action research, educator-researcher partnerships, and constructivist learning theory

    NASA Astrophysics Data System (ADS)

    Hartle, R. Todd

    2007-12-01

    Educator-researcher partnerships are increasingly being used to improve the teaching of science. Chapter 1 provides a summary of the literature concerning partnerships, and examines the justification of qualitative methods in studying these relationships. It also justifies the use of Participatory Action Research (PAR). Empirically-based studies of educator-researcher partnership relationships are rare despite investments in their implementation by the National Science Foundation (NSF) and others. Chapter 2 describes a qualitative research project in which participants in an NSF GK-12 fellowship program were studied using informal observations, focus groups, personal interviews, and journals to identify and characterize the cultural factors that influenced the relationships between the educators and researchers. These factors were organized into ten critical axes encompassing a range of attitudes, behaviors, or values defined by two stereotypical extremes. These axes were: (1) Task Dictates Context vs. Context Dictates Task; (2) Introspection vs. Extroversion; (3) Internal vs. External Source of Success; (4) Prior Planning vs. Implementation Flexibility; (5) Flexible vs. Rigid Time Sense; (6) Focused Time vs. Multi-tasking; (7) Specific Details vs. General Ideas; (8) Critical Feedback vs. Encouragement; (9) Short Procedural vs. Long Content Repetition; and (10) Methods vs. Outcomes are Well Defined. Another ten important stereotypical characteristics, which did not fit the structure of an axis, were identified and characterized. The educator stereotypes were: (1) Rapport/Empathy; (2) Like Kids; (3) People Management; (4) Communication Skills; and (5) Entertaining. The researcher stereotypes were: (1) Community Collaboration; (2) Focus Intensity; (3) Persistent; (4) Pattern Seekers; and (5) Curiosity/Skeptical. Chapter 3 summarizes the research presented in chapter 2 into a practical guide for participants and administrators of educator-researcher partnerships

  9. Science Teaching Methods: A Rationale for Practices

    ERIC Educational Resources Information Center

    Osborne, Jonathan

    2011-01-01

    This article is a version of the talk given by Jonathan Osborne as the Association for Science Education (ASE) invited lecturer at the National Science Teachers' Association Annual Convention in San Francisco, USA, in April 2011. The article provides an explanatory justification for teaching about the practices of science in school science that…

  10. Method of analysis at the U.S. Geological Survey California Water Science Center, Sacramento Laboratory - determination of haloacetic acid formation potential, method validation, and quality-control practices

    USGS Publications Warehouse

    Zazzi, Barbara C.; Crepeau, Kathryn L.; Fram, Miranda S.; Bergamaschi, Brian A.

    2005-01-01

    An analytical method for the determination of haloacetic acid formation potential of water samples has been developed by the U.S. Geological Survey California Water Science Center Sacramento Laboratory. The haloacetic acid formation potential is measured by dosing water samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine. The haloacetic acids formed are bromochloroacetic acid, bromodichloroacetic acid, dibromochloroacetic acid, dibromoacetic acid, dichloroacetic acid, monobromoacetic acid, monochloroacetic acid, tribromoacetic acid, and trichloroacetic acid. They are extracted, methylated, and then analyzed using a gas chromatograph equipped with an electron capture detector. Method validation experiments were performed to determine the method accuracy, precision, and detection limit for each of the compounds. Method detection limits for these nine haloacetic acids ranged from 0.11 to 0.45 microgram per liter. Quality-control practices include the use of blanks, quality-control samples, calibration verification standards, surrogate recovery, internal standard, matrix spikes, and duplicates.

  11. Practical methods to improve fee collections.

    PubMed

    Aluise, J J

    1990-01-01

    A medical practice's financial policies should be frankly discussed with patients and explained at the initial visit. At that time, health insurance coverage should also be discussed. This candid conversation not only opens the door to further communication regarding a once-sensitive subject but also facilitates collection of fees--especially important when high-priced procedures are contemplated. The author also discusses third-party payers (including Medicare) and the pros and cons of prepaid health plans.

  12. A new chemical diagnostic method for inborn errors of metabolism by mass spectrometry-rapid, practical, and simultaneous urinary metabolites analysis.

    PubMed

    Matsumoto, I; Kuhara, T

    1996-01-01

    In most developed countries, neonatal mass screening programs for the early diagnosis of inborn errors of metabolism (IEM) have been implemented and have been found to be effective for the prevention or significant reduction of clinical symptoms such as mental retardation. These programs rely primarily on simple bacterial inhibition assays (the "Guthrie tests"). We developed a new method for screening IEM using GC/MS, which enables accurate chemical diagnoses through urinary analyses with a simple practical procedure. The urine sample preparation for GC/MS takes one hour for one sample or three hours for a batch of 30 samples (will be fully automated shortly), and the following GC/MS measurement is completed within 15 min per sample. This method allows the simultaneous analyses of amino acids, organic acids, sugars, sugar alcohols, sugar acids, and nucleic acid bases. Therefore, a large number of metabolic disorders can be simultaneously tested by this chemical diagnostic procedure. This method is quite comprehensive and different from conventional GC/MS organic acidemia screening procedures, which are not well-suited to detect metabolic disorders except organic acidurias. Sample preparation includes urease treatment, deproteinization, and derivatization. The method has also been applied to neonate urine specimens that are absorbed into filter paper. The air-dried samples were mailed to the analytical laboratory and eluted with water. The eluate (0.1 mL) was incubated with urease, followed by deproteinization with alcohol, evaporation to dryness of the supernatant, and trimethylsilylation; the samples were applied to GC/MS. A pilot study of the application of this diagnostic procedure to the neonatal mass screening of 22 disorders was started in Japan on February 1, 1995 in cooperation with four medical institutes. This program is supported by the Japanese Society for Biomedical Mass Spectrometry and the Japanese Mass Screening Society. The initial twenty

  13. The 5-Step Method: Principles and Practice

    ERIC Educational Resources Information Center

    Copello, Alex; Templeton, Lorna; Orford, Jim; Velleman, Richard

    2010-01-01

    This article includes a description of the 5-Step Method. First, the origins and theoretical basis of the method are briefly described. This is followed by a discussion of the general principles that guide the delivery of the method. Each step is then described in more detail, including the content and focus of each of the five steps that include:…

  14. Method of analysis and quality-assurance practices by the U. S. Geological Survey Organic Geochemistry Research Group; determination of four selected mosquito insecticides and a synergist in water using liquid-liquid extraction and gas chrom

    USGS Publications Warehouse

    Zimmerman, L.R.; Strahan, A.P.; Thurman, E.M.

    2001-01-01

    A method of analysis and quality-assurance practices were developed for the determination of four mosquito insecticides (malathion, methoprene, phenothrin, and resmethrin) and one synergist (piperonyl butoxide) in water. The analytical method uses liquid-liquid extraction (LLE) and gas chromatography/mass spectrometry (GC/MS). Good precision and accuracy were demonstrated in reagent water, urban surface water, and ground water. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 50 nanograms per liter ranged from 68 to 171 percent, with standard deviations in concentrations of 27 nanograms per liter or less. The method detection limit for all compounds was 5.9 nanograms per liter or less for 247-milliliter samples. This method is valuable for acquiring information about the fate and transport of these mosquito insecticides and one synergist in water.
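
Method detection limits like the ones quoted above are conventionally computed from replicate low-level spike measurements as MDL = s · t(n−1, 0.99), the sample standard deviation times a one-tailed Student-t value (the EPA-style procedure). A minimal sketch, with hypothetical replicate results:

```python
from statistics import stdev

# One-tailed Student-t values at the 99% confidence level for n-1 degrees
# of freedom, as used in the EPA-style MDL procedure; n = 7 replicates
# gives t = 3.143.
T_99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821}

def method_detection_limit(replicates):
    """MDL = s * t(n-1, 0.99) from replicate low-level spike measurements."""
    return stdev(replicates) * T_99[len(replicates) - 1]

# Hypothetical replicate results (nanograms per liter) for one compound
# spiked at 10 ng/L; the real study's replicate data are not given here.
measured = [9.1, 10.4, 8.7, 9.8, 10.9, 9.5, 10.2]
mdl = method_detection_limit(measured)
```

For these illustrative replicates the MDL comes out near 2.4 ng/L; tighter replicate agreement drives the MDL down.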

  15. Scenistic Methods for Training: Applications and Practice

    ERIC Educational Resources Information Center

    Lyons, Paul R.

    2011-01-01

    Purpose: This paper aims to complement an earlier article (2010) in "Journal of European Industrial Training" in which the description and theory bases of scenistic methods were presented. This paper also offers a description of scenistic methods and information on theory bases. However, the main thrust of this paper is to describe, give suggested…

  16. The "Anchor" Method: Principle and Practice.

    ERIC Educational Resources Information Center

    Selgin, Paul

    This report discusses the "anchor" language learning method that is based upon derivation rather than construction, using Italian as an example of a language to be learned. This method borrows from the natural process of language learning as it asks the student to remember whole expressions that serve as vehicles for learning both words and rules,…

  17. A practical method for sensor absolute calibration.

    PubMed

    Meisenholder, G W

    1966-04-01

    This paper describes a method of performing sensor calibrations using an NBS standard of spectral irradiance. The method shown, among others, was used for calibration of the Mariner IV Canopus sensor. Agreement of inflight response to preflight calibrations performed by this technique has been found to be well within 10%. PMID:20048890

  18. Using Developmental Evaluation Methods with Communities of Practice

    ERIC Educational Resources Information Center

    van Winkelen, Christine

    2016-01-01

    Purpose: This paper aims to explore the use of developmental evaluation methods with community of practice programmes experiencing change or transition to better understand how to target support resources. Design/methodology/approach: The practical use of a number of developmental evaluation methods was explored in three organizations over a…

  19. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  20. Exploratory and Confirmatory Analysis of the Trauma Practices Questionnaire

    ERIC Educational Resources Information Center

    Craig, Carlton D.; Sprang, Ginny

    2009-01-01

    Objective: The present study provides psychometric data for the Trauma Practices Questionnaire (TPQ). Method: A nationally randomized sample of 2,400 surveys was sent to self-identified trauma treatment specialists, and 711 (29.6%) were returned. Results: An exploratory factor analysis (N = 319) conducted on a randomly split sample (RSS) revealed…

  1. Mark Making: Methodologies and methods (innovative practice).

    PubMed

    Zeilig, Hannah

    2016-09-01

    Mark Making is a recently completed AHRC-funded review exploring the role of the participative arts for people with dementia in the UK. Key concerns underlying Mark Making were both how to privilege the views and feelings of people with a dementia and also how best to understand the value of the arts for people with a dementia. These issues were tackled using a variety of qualitative methods. Methods included a rigorous literature review, the development of a unique web-based map locating many participative arts projects and above all working with people with a dementia to ascertain their views. This brief article will concentrate on some of the innovative methods that the Mark Making team used, with particular reference to comics as a mode of engagement as used in the Descartes project. The article will provide an insight into some of the methodological challenges confronted by Mark Making as well as the inspirations and successes that were enjoyed.

  2. Practice-Focused Ethnographies of Higher Education: Method/ological Corollaries of a Social Practice Perspective

    ERIC Educational Resources Information Center

    Trowler, Paul Richard

    2014-01-01

    Social practice theory addresses both theoretical and method/ological agendas. To date priority has been given to the former, with writing on the latter tending often to be an afterthought to theoretical expositions or fieldwork accounts. This article gives sustained attention to the method/ological corollaries of a social practice perspective. It…

  3. Council on Certification Professional Practice Analysis.

    PubMed

    Zaglaniczny, K L

    1993-06-01

    The CCNA has completed a PPA and will begin implementing its recommendations with the December 1993 certification examination. The results of the PPA provide content validation for the CCNA certification examination. The certification examination is reflective of the knowledge and skill required for entry-level practice. Assessment of this knowledge is accomplished through the use of questions that are based on the areas represented in the content outline. Analysis of the PPA has resulted in changes in the examination content outline and percentages of questions in each area to reflect current entry-level nurse anesthesia practice. The new outline is based on the major domains of knowledge required for nurse anesthesia practice. These changes are justified by the consistency in the responses of the practitioners surveyed. There was overall agreement as to the knowledge and skills related to patient conditions, procedures, agents, techniques, and equipment that an entry-level CRNA must have to practice. Members of the CCNA and Examination Committee will use the revised outline to develop questions for the certification examination. The questions will be focused on the areas identified as requiring high levels of expertise and those that appeared higher in frequency. The PPA survey will be used as a basis for subsequent content validation studies. It will be revised to reflect new knowledge, technology, and techniques related to nurse anesthesia practice. The CCNA has demonstrated its commitment to the certification process through completion of the PPA and implementation of changes in the structure of the examination.

  4. ALFRED: A Practical Method for Alignment-Free Distance Computation.

    PubMed

    Thankachan, Sharma V; Chockalingam, Sriram P; Liu, Yongchao; Apostolico, Alberto; Aluru, Srinivas

    2016-06-01

    Alignment-free approaches are gaining persistent interest in many sequence analysis applications such as phylogenetic inference and metagenomic classification/clustering, especially for large-scale sequence datasets. Besides the widely used k-mer methods, the average common substring (ACS) approach has emerged to be one of the well-known alignment-free approaches. Two recent works further generalize this ACS approach by allowing a bounded number k of mismatches in the common substrings, relying on approximation (linear time) and exact computation, respectively. Albeit having a good worst-case time complexity [Formula: see text], the exact approach is complex and unlikely to be efficient in practice. Herein, we present ALFRED, an alignment-free distance computation method, which solves the generalized common substring search problem via exact computation. Compared to the theoretical approach, our algorithm is easier to implement and more practical to use, while still providing highly competitive theoretical performances with an expected run-time of [Formula: see text]. By applying our program to phylogenetic inference as a case study, we find that it can exactly reconstruct the topology of the reference phylogenetic tree for a set of 27 primate mitochondrial genomes at acceptable speed. ALFRED is implemented in C++ programming language and the source code is freely available online. PMID:27138275
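
The ACS statistic that ALFRED generalizes can be sketched naively in a few lines. This quadratic-time version and the normalization shown are illustrative only: ALFRED itself uses far more efficient suffix-based algorithms, and the k-mismatch generalization is not reproduced here.

```python
import math

def avg_common_substring(x, y):
    """ACS statistic: the mean, over positions i of x, of the length of the
    longest substring starting at x[i] that occurs anywhere in y.
    Naive version for illustration only."""
    total = 0
    for i in range(len(x)):
        lo, hi = 0, len(x) - i
        while lo < hi:  # binary search the longest matching prefix of x[i:]
            mid = (lo + hi + 1) // 2
            if x[i:i + mid] in y:
                lo = mid
            else:
                hi = mid - 1
        total += lo
    return total / len(x)

def acs_distance(x, y):
    """Symmetrized ACS dissimilarity with an Ulitsky-style correction term
    subtracting the expected self-match contribution."""
    dxy = math.log(len(y)) / avg_common_substring(x, y) - 2 * math.log(len(x)) / len(x)
    dyx = math.log(len(x)) / avg_common_substring(y, x) - 2 * math.log(len(y)) / len(y)
    return (dxy + dyx) / 2
```

Closely related sequences share long common substrings, so their ACS is large and the resulting dissimilarity small; unrelated sequences score markedly higher.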

  5. Airphoto analysis of erosion control practices

    NASA Technical Reports Server (NTRS)

    Morgan, K. M.; Morris-Jones, D. R.; Lee, G. B.; Kiefer, R. W.

    1980-01-01

    The Universal Soil Loss Equation (USLE) is a widely accepted tool for erosion prediction and conservation planning. In this study, airphoto analysis of color and color infrared 70 mm photography at a scale of 1:60,000 was used to determine the erosion control practice factor in the USLE. Information about contour tillage, contour strip cropping, and grass waterways was obtained from aerial photography for Pheasant Branch Creek watershed in Dane County, Wisconsin.
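
The USLE itself is a simple product of six factors, A = R·K·LS·C·P, and the support practice factor P is the one the airphoto interpretation supplies. A minimal sketch with hypothetical factor values:

```python
def usle_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation: A = R * K * LS * C * P.
    R: rainfall erosivity, K: soil erodibility, LS: slope length/steepness,
    C: cover management, P: support practice (1.0 = no practice)."""
    return R * K * LS * C * P

# Hypothetical factors; P ~ 0.5 might correspond to contour tillage on a
# moderate slope, as identified from the aerial photography.
loss_contoured = usle_soil_loss(R=150, K=0.3, LS=1.2, C=0.25, P=0.5)
loss_unmanaged = usle_soil_loss(R=150, K=0.3, LS=1.2, C=0.25, P=1.0)
```

Because the equation is multiplicative, halving P halves the predicted annual soil loss, which is why mapping erosion control practices from airphotos feeds directly into the prediction.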

  6. Pragmatism in practice: mixed methods research for physiotherapy.

    PubMed

    Shaw, James A; Connelly, Denise M; Zecevic, Aleksandra A

    2010-11-01

    The purpose of this paper is to provide an argument for the place of mixed methods research across practice settings as an effective means of supporting evidence-based practice in physiotherapy. Physiotherapy practitioners use both qualitative and quantitative methods throughout the process of patient care-from history taking, assessment, and intervention to evaluation of outcomes. Research on practice paradigms demonstrates the importance of mixing qualitative and quantitative methods to achieve 'expert practice' that is concerned with optimizing outcomes and incorporating patient beliefs and values. Research paradigms that relate to this model of practice would integrate qualitative and quantitative types of knowledge and inquiry, while maintaining a prioritized focus on patient outcomes. Pragmatism is an emerging research paradigm where practical consequences and the effects of concepts and behaviors are vital components of meaning and truth. This research paradigm supports the simultaneous use of qualitative and quantitative methods of inquiry to generate evidence to support best practice. This paper demonstrates that mixed methods research with a pragmatist view provides evidence that embraces and addresses the multiple practice concerns of practitioners better than either qualitative or quantitative research approaches in isolation.

  7. Practical aspects of spatially high accurate methods

    NASA Technical Reports Server (NTRS)

    Godfrey, Andrew G.; Mitchell, Curtis R.; Walters, Robert W.

    1992-01-01

    The computational qualities of high order spatially accurate methods for the finite volume solution of the Euler equations are presented. Two dimensional essentially non-oscillatory (ENO), k-exact, and 'dimension by dimension' ENO reconstruction operators are discussed and compared in terms of reconstruction and solution accuracy, computational cost and oscillatory behavior in supersonic flows with shocks. Inherent steady state convergence difficulties are demonstrated for adaptive stencil algorithms. An exact solution to the heat equation is used to determine reconstruction error, and the computational intensity is reflected in operation counts. Standard MUSCL differencing is included for comparison. Numerical experiments presented include the Ringleb flow for numerical accuracy and a shock reflection problem. A vortex-shock interaction demonstrates the ability of the ENO scheme to excel in simulating unsteady high-frequency flow physics.
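
The adaptive stencil selection behind ENO schemes (the source of the convergence difficulties noted above) can be illustrated with a toy 1D version: starting from a single cell, the stencil grows toward whichever side has the smaller-magnitude Newton divided difference, steering it away from discontinuities. The function names and the unit-spaced point-value grid are simplifications for illustration, not the paper's formulation:

```python
def eno_stencil(f, i, k):
    """Pick a k-point ENO stencil around index i of point values f on a
    unit-spaced grid. Returns (lo, hi) with hi - lo == k - 1."""
    def dd(lo, hi):
        # Newton divided difference f[lo, ..., hi]; unit spacing assumed,
        # so the denominator is simply the index span.
        if lo == hi:
            return f[lo]
        return (dd(lo + 1, hi) - dd(lo, hi - 1)) / (hi - lo)

    lo = hi = i
    for _ in range(k - 1):
        can_left = lo - 1 >= 0
        can_right = hi + 1 < len(f)
        if can_left and (not can_right or abs(dd(lo - 1, hi)) < abs(dd(lo, hi + 1))):
            lo -= 1  # data are smoother to the left
        else:
            hi += 1  # extend right (or the left boundary was reached)
    return lo, hi
```

For a step profile the selected stencil stays entirely on one side of the jump, which is exactly the non-oscillatory property; it is also why the stencil can flip between iterations and stall steady-state convergence.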

  8. Practice and Evaluation of Blended Learning with Cross-Cultural Distance Learning in a Foreign Language Class: Using Mix Methods Data Analysis

    ERIC Educational Resources Information Center

    Sugie, Satoko; Mitsugi, Makoto

    2014-01-01

    The Information and Communication Technology (ICT) utilization in Chinese as a "second" foreign language has mainly been focused on Learning Management System (LMS), digital material development, and quantitative analysis of learners' grammatical knowledge. There has been little research that has analyzed the effectiveness of…

  9. Practice-Near and Practice-Distant Methods in Human Services Research

    ERIC Educational Resources Information Center

    Froggett, Lynn; Briggs, Stephen

    2012-01-01

    This article discusses practice-near research in human services, a cluster of methodologies that may include thick description, intensive reflexivity, and the study of emotional and relational processes. Such methods aim to get as near as possible to experiences at the relational interface between institutions and the practice field.…

  10. Reflections on Experiential Teaching Methods: Linking the Classroom to Practice

    ERIC Educational Resources Information Center

    Wehbi, Samantha

    2011-01-01

    This article explores the use of experiential teaching methods in social work education. The literature demonstrates that relying on experiential teaching methods in the classroom can have overwhelmingly positive learning outcomes; however, not much is known about the possible effect of these classroom methods on practice. On the basis of…

  11. Communication Network Analysis Methods.

    ERIC Educational Resources Information Center

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  12. Development of a method to analyze orthopaedic practice expenses.

    PubMed

    Brinker, M R; Pierce, P; Siegel, G

    2000-03-01

    The purpose of the current investigation was to present a standard method by which an orthopaedic practice can analyze its practice expenses. To accomplish this, a five-step process was developed to analyze practice expenses using a modified version of activity-based costing. In this method, general ledger expenses were assigned to 17 activities that encompass all the tasks and processes typically performed in an orthopaedic practice. These 17 activities were identified in a practice expense study conducted for the American Academy of Orthopaedic Surgeons. To calculate the cost of each activity, financial data were used from a group of 19 orthopaedic surgeons in Houston, Texas. The activities that consumed the largest portion of the employee work force (person hours) were service patients in office (25.0% of all person hours), maintain medical records (13.6% of all person hours), and resolve collection disputes and rebill charges (12.3% of all person hours). The activities that comprised the largest portion of the total expenses were maintain facility (21.4%), service patients in office (16.0%), and sustain business by managing and coordinating practice (13.8%). The five-step process of analyzing practice expenses was relatively easy to perform and it may be used reliably by most orthopaedic practices. PMID:10738440
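
The core assignment step of such a five-step process can be sketched as a mapping from general-ledger lines to activities, followed by a percentage-of-total report. The ledger amounts and abbreviated activity names below are hypothetical, not the study's figures:

```python
from collections import defaultdict

# (ledger account, activity it is assigned to, annual amount in dollars)
ledger = [
    ("rent and utilities", "maintain facility",              21000),
    ("clinical salaries",  "service patients in office",     16000),
    ("records clerk",      "maintain medical records",        9000),
    ("billing staff",      "resolve collection disputes",     7000),
    ("administrator",      "manage and coordinate practice", 12000),
]

# Roll ledger expenses up into activity costs.
by_activity = defaultdict(float)
for account, activity, amount in ledger:
    by_activity[activity] += amount

# Express each activity as a share of total practice expense.
total = sum(by_activity.values())
share = {a: round(100 * cost / total, 1) for a, cost in by_activity.items()}
```

Real activity-based costing additionally allocates shared accounts (e.g. rent) across several activities by drivers such as person hours, which is where the survey data described above come in.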

  14. Complementing Gender Analysis Methods.

    PubMed

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start with a premise that men and women are equal and should be treated equally. These frameworks give emphasis on equal distribution of resources between men and women and believe that this will bring equality which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives such as the power to control resources within social relationships, and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks believe that patriarchy as an institution plays an important role in women's oppression, exploitation, and it is a barrier in their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on proposed equality principle which puts men and women in competing roles. Thus, the real equality will never be achieved. Contrary to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach toward gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but suggests to incorporate the concept and role of social capital, equity, and doing gender in gender analysis which is based on perceived equity principle, putting men and women in complementing roles that may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued based on existing literature and an anecdote of observations made by the author. While criticizing the equality theory, the author offers equity theory in resolving the gender conflict by using the concept of social and psychological capital. PMID:25941756

  16. Methods of analysis and quality-assurance practices of the U.S. Geological Survey organic laboratory, Sacramento, California; determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Domagalski, Joseph L.; Kuivila, Kathryn M.

    1994-01-01

    Analytical method and quality-assurance practices were developed for a study of the fate and transport of pesticides in the Sacramento-San Joaquin Delta and the Sacramento and San Joaquin River. Water samples were filtered to remove suspended particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide, and the pesticides were eluted with three 2-milliliter aliquots of hexane:diethyl ether (1:1). The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for analytes determined per 1,500-milliliter samples ranged from 0.006 to 0.047 microgram per liter. Recoveries ranged from 47 to 89 percent for 12 pesticides in organic-free, Sacramento River and San Joaquin River water samples fortified at 0.05 and 0.26 microgram per liter. The method was modified to improve the pesticide recovery by reducing the sample volume to 1,000 milliliters. Internal standards were added to improve quantitative precision and accuracy. The analysis also was expanded to include a total of 21 pesticides. The method detection limits for 1,000-milliliter samples ranged from 0.022 to 0.129 microgram per liter. Recoveries ranged from 38 to 128 percent for 21 pesticides in organic-free, Sacramento River and San Joaquin River water samples fortified at 0.10 and 0.75 microgram per liter.
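
Recovery percentages like those quoted are simply the measured concentration relative to the fortification level, after subtracting any background. A one-function sketch with hypothetical values:

```python
def percent_recovery(measured, spiked, background=0.0):
    """Matrix-spike recovery: (measured - background) / spiked * 100."""
    return 100.0 * (measured - background) / spiked

# Hypothetical fortified river-water sample: spiked at 0.10 microgram per
# liter, with 0.082 microgram per liter measured back.
r = percent_recovery(measured=0.082, spiked=0.10)
```

Recoveries well below 100 percent flag extraction losses, while values above 100 percent usually indicate matrix interference or background contributions.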

  17. Occupational health nursing 2004 practice analysis report.

    PubMed

    Strasser, Patricia B; Maher, Helen K; Knuth, Georgia; Fabrey, Lawrence J

    2006-01-01

    As a certifying body for occupational health nurses in the United States and Canada, the American Board for Occupational Health Nurses, Inc. (ABOHN) must ensure its certification examinations validly reflect current occupational health nurse practice. This report presents information from the ABOHN 2004 practice analysis. The study's primary purpose was to analyze areas of knowledge, skill, and ability for occupational health nurses as reflected by the tasks they perform, to guide refinement of ABOHN's certification examinations. A valid and reliable survey instrument, containing demographic and job-related questions and 172 task statements, was developed. A total of 5,586 surveys (4,921 Web-based and 665 paper) were made available to occupational health nurses throughout the United States and Canada. The usable response rate was 23.5% (N = 1,223). Decision rules were used to determine which survey tasks were appropriate for inclusion in Certified Occupational Health Nurse (COHN) and Certified Occupational Health Nurse Specialist (COHN-S) certification examination blueprints. The revised blueprints were used to develop new examinations. Study data also validated the existing ABOHN Case Management (CM) specialty examination blueprint, and verified occupational health nurse roles and responsibilities related to safety programs. Based on analysis of the safety-related items, ABOHN, in collaboration with the Board of Certified Safety Professionals, has created a safety management credential (SM) and associated examination that certified occupational health nurses may use to verify their safety role proficiency.

  18. [Hydrogen (H2) exhalation tests--methods for general practice].

    PubMed

    Bornschein, W

    1988-04-01

    According to the literature, as well as our own experience, hydrogen breath tests are well suited to a gastroenterologist's practice because of their practicability (they are noninvasive and inexpensive) and their diagnostic relevance (sensitivity, specificity). Although the hydrogen breath test with lactose is now the best way to diagnose lactose intolerance, the hydrogen breath test with glucose as a means of investigating small bowel bacterial overgrowth is still under discussion. The lactulose hydrogen breath test for estimating small bowel transit time is of minor importance in a gastroenterologist's practice, but it may still be relevant for particular questions (e.g., suspected functional diarrhea).

  19. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    PubMed

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, and so to prevent the most harmful hazards first. The Hazard Analysis Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished-product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales and combined into a Risk Priority Number (RPN). Risk analysis was performed with this technique on the HSC manipulation process at our blood center. The data analysis showed that the hazards with the highest RPN values, and thus the greatest impact on the process, are loss of dose and loss of tracking; the technical skills of operators and the manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while the other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was complemented by a labeling system with forms designed to comply with the standards in force and by the initial implementation of a cryopreservation management module.
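    The Risk Priority Number used above is simply the product of three ordinal scores. A minimal sketch of how such a ranking might be computed (the hazard names and the 1-10 scales below are illustrative assumptions, not data from the study):

```python
# Failure Mode, Effects, and Criticality Analysis (FMECA) ranking sketch.
# RPN = severity x occurrence x detectability, each scored on a 1-10 scale
# (10 = worst).  Hazards with the highest RPN are addressed first.

def rpn(severity, occurrence, detectability):
    """Risk Priority Number for one failure mode."""
    for score in (severity, occurrence, detectability):
        if not 1 <= score <= 10:
            raise ValueError("scores must lie on the 1-10 scale")
    return severity * occurrence * detectability

# Illustrative failure modes; names and scores are hypothetical examples.
hazards = {
    "loss of dose":               rpn(9, 3, 6),
    "loss of tracking":           rpn(8, 4, 5),
    "manual transcription error": rpn(6, 5, 4),
}

# Rank failure modes from highest to lowest risk priority.
ranked = sorted(hazards.items(), key=lambda kv: kv[1], reverse=True)
```

    In a real FMECA the scales are defined in advance by the team (e.g., per IEC 60812) so that scores are comparable across failure modes.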

  20. Practical Issues in Component Aging Analysis

    SciTech Connect

    Dana L. Kelly; Andrei Rodionov; Jens Uwe-Klugel

    2008-09-01

    This paper examines practical issues in the statistical analysis of component aging data. These issues center on the stochastic process chosen to model component failures. The two stochastic processes examined are repair same as new, leading to a renewal process, and repair same as old, leading to a nonhomogeneous Poisson process. Under the first assumption, times between failures can be treated as statistically independent observations from a stationary process. The common distribution of the times between failures is called the renewal distribution. Under the second assumption, the times between failures will not be independently and identically distributed, and one cannot simply fit a renewal distribution to the cumulative failure times or the times between failures. The paper illustrates how the assumption made regarding the repair process is crucial to the analysis. Besides the choice of stochastic process, other issues that are discussed include qualitative graphical analysis and simple nonparametric hypothesis tests to help judge which process appears more appropriate. Numerical examples are presented to illustrate the issues discussed in the paper.
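    One simple check of the kind the paper mentions is a trend test on the failure times. A common choice is the Laplace test, sketched below for time-truncated data (the failure times are made up for illustration, and this is only one of several possible trend tests): under a homogeneous Poisson process the statistic is approximately standard normal, so a large |U| suggests a time trend, in which case fitting a single renewal distribution to the times between failures would be inappropriate.

```python
import math

def laplace_trend_statistic(failure_times, t_end):
    """Laplace trend test for event times observed on (0, t_end].

    Under a homogeneous Poisson process (no aging trend), U is
    approximately standard normal.  U >> 0 suggests reliability
    deterioration (failures clustering late in the window);
    U << 0 suggests improvement.
    """
    n = len(failure_times)
    mean_time = sum(failure_times) / n
    return (mean_time - t_end / 2) / (t_end * math.sqrt(1 / (12 * n)))

# Hypothetical failure times (hours) for one component, observed to t = 10.
u = laplace_trend_statistic([1.0, 2.0, 3.0, 4.0, 5.0], t_end=10.0)
```

    Here the failures cluster early in the observation window, so U comes out negative.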

  1. Farmers' Preferences for Methods of Receiving Information on New or Innovative Farming Practices.

    ERIC Educational Resources Information Center

    Riesenberg, Lou E.; Gor, Christopher Obel

    1989-01-01

    Survey of 386 Idaho farmers (response rate 58 percent) identified preferred methods of receiving information on new or innovative farming practices. Analysis revealed preference for interpersonal methods (demonstrations, tours, and field trips) over mass media such as computer-assisted instruction (CAI) and home study, although younger farmers,…

  2. Traditional Methods for Mineral Analysis

    NASA Astrophysics Data System (ADS)

    Ward, Robert E.; Carpenter, Charles E.

    This chapter describes traditional methods for analysis of minerals involving titrimetric and colorimetric procedures, and the use of ion-selective electrodes. Other traditional methods of mineral analysis include gravimetric titration (i.e., insoluble forms of minerals are precipitated, rinsed, dried, and weighed) and redox reactions (i.e., the mineral is part of an oxidation-reduction reaction, and the product is quantitated). However, these latter two methods will not be covered because they are currently little used in the food industry. The traditional methods that will be described have maintained widespread usage in the food industry despite the development of more modern instrumentation such as atomic absorption spectroscopy and inductively coupled plasma-atomic emission spectroscopy (Chap. 24). Traditional methods generally require chemicals and equipment that are routinely available in an analytical laboratory and are within the experience of most laboratory technicians. Additionally, traditional methods often form the basis for rapid analysis kits (e.g., Quantab® for salt determination) that are increasingly in demand. Procedures for analysis of minerals of major nutritional or food-processing concern are used for illustrative purposes. For additional examples of traditional methods refer to references (1-6). Slight modifications of these traditional methods are often needed for specific foodstuffs to minimize interferences or to stay within the range of analytical performance. For analytical requirements for specific foods see the Official Methods of Analysis of AOAC International (5) and related official methods (6).
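    As a concrete instance of the titrimetric procedures this chapter covers, the arithmetic behind a Mohr titration for salt can be sketched as follows. The sample mass and titrant figures are invented for illustration; the 1:1 AgNO3:NaCl stoichiometry and the molar mass of NaCl are standard chemistry.

```python
# Mohr titration arithmetic: chloride is titrated with silver nitrate,
# and each mole of AgNO3 consumed corresponds to one mole of NaCl.

NACL_MOLAR_MASS = 58.44  # g/mol

def percent_nacl(titrant_ml, titrant_molarity, sample_g):
    """Percent NaCl (w/w) from an AgNO3 titration endpoint volume."""
    moles_agno3 = titrant_ml / 1000.0 * titrant_molarity
    grams_nacl = moles_agno3 * NACL_MOLAR_MASS  # 1:1 stoichiometry
    return grams_nacl / sample_g * 100.0

# Hypothetical run: 15.0 mL of 0.1 M AgNO3 against a 5.00 g sample.
salt_pct = percent_nacl(15.0, 0.1, 5.00)
```

    The same pattern (titrant volume, titrant molarity, stoichiometric factor, sample mass) underlies most of the titrimetric calculations in the chapter.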

  3. Multi-criteria decision analysis: Limitations, pitfalls, and practical difficulties

    SciTech Connect

    Kujawski, Edouard

    2003-02-01

    The 2002 Winter Olympics women's figure skating competition is used as a case study to illustrate some of the limitations, pitfalls, and practical difficulties of Multi-Criteria Decision Analysis (MCDA). The paper compares several widely used models for synthesizing the multiple attributes into a single aggregate value. The various MCDA models can provide conflicting rankings of the alternatives for a common set of information even under states of certainty. Analysts involved in MCDA need to deal with the following challenging tasks: (1) selecting an appropriate analysis method, and (2) properly interpreting the results. An additional trap is the availability of software tools that implement specific MCDA models that can beguile the user with quantitative scores. These conclusions are independent of the decision domain and they should help foster better MCDA practices in many fields including systems engineering trade studies.
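    The conflicting-rankings point is easy to reproduce with a toy example: the same scores and weights can rank alternatives differently under an additive (weighted-sum) model and a multiplicative (weighted-product) model. The alternatives and numbers below are invented purely to exhibit the reversal; they are not from the figure-skating case study.

```python
# Two alternatives scored on two criteria (0-1 scale), equal weights.
scores = {"A": (0.90, 0.10), "B": (0.45, 0.50)}
weights = (0.5, 0.5)

def weighted_sum(vals):
    """Additive aggregation: sum of weight * score."""
    return sum(w * v for w, v in zip(weights, vals))

def weighted_product(vals):
    """Multiplicative aggregation: product of score ** weight."""
    prod = 1.0
    for w, v in zip(weights, vals):
        prod *= v ** w
    return prod

best_additive = max(scores, key=lambda a: weighted_sum(scores[a]))
best_multiplicative = max(scores, key=lambda a: weighted_product(scores[a]))
# The two aggregation models disagree on which alternative is "best":
# A wins the weighted sum (0.500 vs 0.475), B wins the weighted product.
```

    The multiplicative model penalizes A's very poor second-criterion score much more heavily than the additive model does, which is exactly the kind of model-dependence the paper warns analysts to interpret carefully.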

  4. Retrieval practice can eliminate list method directed forgetting.

    PubMed

    Abel, Magdalena; Bäuml, Karl-Heinz T

    2016-01-01

    It has recently been shown that retrieval practice can reduce memories' susceptibility to interference, like retroactive and proactive interference. In this study, we therefore examined whether retrieval practice can also reduce list method directed forgetting, a form of intentional forgetting that presupposes interference. In each of two experiments, subjects successively studied two lists of items. After studying each single list, subjects restudied the list items to enhance learning, or they were asked to recall the items. Following restudy or retrieval practice of list 1 items, subjects were cued to either forget the list or remember it for an upcoming final test. Experiment 1 employed a free-recall and Experiment 2 a cued-recall procedure on the final memory test. In both experiments, directed forgetting was present in the restudy condition but was absent in the retrieval-practice condition, indicating that retrieval practice can reduce or even eliminate this form of forgetting. The results are consistent with the view that retrieval practice enhances list segregation processes. Such processes may reduce interference between lists and thus reduce directed forgetting. PMID:26286882

  6. Practicing the practice: Learning to guide elementary science discussions in a practice-oriented science methods course

    NASA Astrophysics Data System (ADS)

    Shah, Ashima Mathur

    University methods courses are often criticized for telling pre-service teachers, or interns, about the theories behind teaching instead of preparing them to actually enact teaching. Shifting teacher education to be more "practice-oriented," or to focus more explicitly on the work of teaching, is a current trend for re-designing the way we prepare teachers. This dissertation addresses the current need for research that unpacks the shift to more practice-oriented approaches by studying the content and pedagogical approaches in a practice-oriented, masters-level elementary science methods course (n=42 interns). The course focused on preparing interns to guide science classroom discussions. Qualitative data, such as video records of course activities and interns' written reflections, were collected across eight course sessions. Codes were applied at the sentence and paragraph level and then grouped into themes. Five content themes were identified: foregrounding student ideas and questions, steering discussion toward intended learning goals, supporting students to do the cognitive work, enacting teacher role of facilitator, and creating a classroom culture for science discussions. Three pedagogical approach themes were identified. First, the teacher educators created images of science discussions by modeling and showing videos of this practice. They also provided focused teaching experiences by helping interns practice the interactive aspects of teaching both in the methods classroom and with smaller groups of elementary students in schools. Finally, they structured the planning and debriefing phases of teaching so interns could learn from their teaching experiences and prepare well for future experiences. The findings were analyzed through the lens of Grossman and colleagues' framework for teaching practice (2009) to reveal how the pedagogical approaches decomposed, represented, and approximated practice throughout course activities. Also, the teacher educators

  7. A Practical Method of Monitoring the Results of Health Care

    PubMed Central

    Daugharty, G. D.

    1979-01-01

    To meet our goal of improving health care through more productive use of the data we are collecting about the delivery of health care, we need to define our concepts of health and quality. The WHO definition of health allows the design of useful functional outcome criteria which give us measurable standards for the outcome of health care. By recording, retrieving, and reviewing pertinent information from the structure and the process of health care for a valid comparison with its outcome, the most effective and efficient health care is identified. A practical system is presented which identifies the better methods of management and produces the motivation for change that results in improved care. The successful use of this system in a private practice supports its universal adaptability for health care providers. The initial encouraging results suggest that future trials in other types of practices will be even more encouraging.

  8. Research in dental practice: a 'SWOT' analysis.

    PubMed

    Burke, F J T; Crisp, R J; McCord, J F

    2002-03-01

    Most dental treatment, in most countries, is carried out in general dental practice. There is therefore a potential wealth of research material, although clinical evaluations have generally been carried out on hospital-based patients. Many types of research, such as clinical evaluations and assessments of new materials, may be appropriate to dental practice. Principal problems are that dental practices are established to treat patients efficiently and to provide an income for the staff of the practice. Time spent on research therefore cannot be used for patient treatment, so there are cost implications. Critics of practice-based research have commented on the lack of calibration of operative diagnoses and other variables; however, this variability is the stuff of dental practice, the real-world situation. Many of the difficulties in carrying out research in dental practice may be overcome. For the enlightened, it may be possible to turn observations based on the volume of treatment carried out in practice into robust, clinically related and relevant research projects based in the real world of dental practice. PMID:11928346

  9. A practical method of estimating energy expenditure during tennis play.

    PubMed

    Novas, A M P; Rowbottom, D G; Jenkins, D G

    2003-03-01

    This study aimed to develop a practical method of estimating energy expenditure (EE) during tennis. Twenty-four elite female tennis players first completed a tennis-specific graded test in which five different intensity levels were applied randomly. Each intensity level was intended to simulate a "game" of singles tennis and comprised six 14-s periods of activity alternated with 20 s of active rest. Oxygen consumption (VO2) and heart rate (HR) were measured continuously, and each player's rating of perceived exertion (RPE) was recorded at the end of each intensity level. The rate of energy expenditure (EE(VO2)) during the test was calculated using the sum of VO2 during play and the 'O2 debt' during recovery, divided by the duration of the activity. There were significant individual linear relationships between EE(VO2) and RPE, and between EE(VO2) and HR (r ≥ 0.89 and r ≥ 0.93; p < 0.05). On a second occasion, six players completed a 60-min singles tennis match during which VO2, HR, and RPE were recorded; EE(VO2) was compared with EE predicted from the previously derived RPE and HR regression equations. Analysis found that EE(VO2) was overestimated by EE(RPE) (92 ± 76 kJ·h⁻¹) and EE(HR) (435 ± 678 kJ·h⁻¹), but the error of estimation for EE(RPE) (t = -3.01; p = 0.03) was less than 5%, whereas for EE(HR) the error was 20.7%. The results of the study show that RPE can be used to estimate the energetic cost of playing tennis.
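    The estimation step rests on an ordinary least-squares line fit of EE against RPE for each player. A minimal sketch of that fit (the RPE and energy-expenditure numbers are fabricated for illustration, not taken from the study):

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y ~ a + b*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical calibration data from the graded test: RPE vs EE (kJ/h).
rpe = [10, 12, 14, 16]
ee = [1000, 1200, 1400, 1600]
a, b = fit_line(rpe, ee)

# Estimate the energetic cost of a match played at a reported RPE of 13.
predicted_ee = a + b * 13
```

    In the study each player gets an individual regression line, which is why the relationships only need to be significant within players.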

  10. Genre Analysis, ESP and Professional Practice

    ERIC Educational Resources Information Center

    Bhatia, Vijay K.

    2008-01-01

    Studies of professional genres and professional practices are invariably seen as complementing each other, in that they not only influence each other but are often co-constructed in specific professional contexts. However, professional genres have often been analyzed in isolation, leaving the study of professional practice almost completely out,…

  11. Cost analysis can help a group practice increase revenues.

    PubMed

    Migliore, Sherry

    2002-02-01

    Undertaking a cost analysis to determine the cost of providing specific services can help group practices negotiate increased payment and identify areas for cost reduction. An OB/GYN practice in Pennsylvania undertook a cost analysis using the resource-based relative value system. Using data from the cost analysis, the practice was able to negotiate increased payment for some of its services. The practice also was able to target some of its fixed costs for reduction. Another result of the analysis was that the practice was able to focus marketing efforts on some of its most profitable, elective services, thereby increasing revenues. In addition, the practice was able to reduce the provision of unprofitable services.

  12. Practical methods to improve the development of computational software

    SciTech Connect

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-07-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best-practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)

  13. Methods for genetic linkage analysis using trisomies

    SciTech Connect

    Feingold, E.; Lamb, N.E.; Sherman, S.L.

    1995-02-01

    Certain genetic disorders are rare in the general population, but more common in individuals with specific trisomies. Examples of this include leukemia and duodenal atresia in trisomy 21. This paper presents a linkage analysis method for using trisomic individuals to map genes for such traits. It is based on a very general gene-specific dosage model that posits that the trait is caused by specific effects of different alleles at one or a few loci and that duplicate copies of "susceptibility" alleles inherited from the nondisjoining parent give increased likelihood of having the trait. Our mapping method is similar to identity-by-descent-based mapping methods using affected relative pairs and also to methods for mapping recessive traits using inbred individuals by looking for markers with greater than expected homozygosity by descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected homozygosity in the chromosomes inherited from the nondisjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the trait gene, a confidence interval for that distance, and methods for computing power and sample sizes. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers and how to test candidate genes. 20 refs., 5 figs., 1 tab.
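    The core statistical idea, looking for greater-than-expected homozygosity in the chromosomes inherited from the nondisjoining parent, can be illustrated with an exact one-sided binomial test. The marker counts and the null homozygosity probability below are hypothetical, and the paper's actual methods are more general (they also handle partially informative markers):

```python
from math import comb

def binomial_sf(k, n, p):
    """Exact one-sided tail probability P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical data: among 10 informative trisomic individuals, 8 are
# homozygous at the marker for alleles from the nondisjoining parent,
# against a null expectation of 0.5 (no linkage to a trait locus).
p_value = binomial_sf(8, 10, 0.5)
```

    A small p-value would indicate more homozygosity than expected by chance, the signature of linkage the method searches for.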

  14. Between practice and theory: Melanie Klein, Anna Freud and the development of child analysis.

    PubMed

    Donaldson, G

    1996-04-01

    An examination of the early history of child analysis in the writings of Melanie Klein and Anna Freud reveals how two different and opposing approaches to child analysis arose at the same time. The two methods of child analysis are rooted in a differential emphasis on psychoanalytic theory and practice. The Kleinian method derives from the application of technique while the Anna Freudian method is driven by theory. Furthermore, by holding to the Freudian theory of child development Anna Freud was forced to limit the scope of child analysis, while Klein's application of Freudian practice has led to new discoveries about the development of the infant psyche.

  16. Method of multivariate spectral analysis

    DOEpatents

    Keenan, Michael R.; Kotula, Paul G.

    2004-01-06

    A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = CS^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
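    The factoring step D = CS^T by constrained alternating least squares can be sketched in miniature for a single component with a nonnegativity constraint. This is a toy, pure-Python illustration of the alternating scheme only; the patented method operates on weighted, multi-component spectral-image matrices.

```python
def als_rank1(D, n_iter=50):
    """Alternating least squares for D ~ c * s^T with c, s >= 0.

    D is a list of rows (pixels x channels).  Each half-step is the
    closed-form least-squares update for one factor with the other
    held fixed, followed by clipping to enforce nonnegativity.
    """
    n_rows, n_cols = len(D), len(D[0])
    s = [1.0] * n_cols          # initial guess for the spectral shape
    c = [0.0] * n_rows
    for _ in range(n_iter):
        ss = sum(v * v for v in s)
        c = [max(0.0, sum(D[i][j] * s[j] for j in range(n_cols)) / ss)
             for i in range(n_rows)]
        cc = sum(v * v for v in c)
        s = [max(0.0, sum(D[i][j] * c[i] for i in range(n_rows)) / cc)
             for j in range(n_cols)]
    return c, s

# Synthetic rank-1 data built from concentrations [1, 2, 3] and
# a two-channel spectral shape [4, 5].
D = [[4, 5], [8, 10], [12, 15]]
c, s = als_rank1(D)
reconstruction = [[ci * sj for sj in s] for ci in c]
```

    Note that c and s are individually determined only up to a scale factor (a constant can move between them), so it is the reconstruction c*s^T, not the factors themselves, that should match D.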

  17. A systematic approach to initial data analysis is good research practice.

    PubMed

    Huebner, Marianne; Vach, Werner; le Cessie, Saskia

    2016-01-01

    Initial data analysis is conducted independently of the analysis needed to address the research questions. Shortcomings in these first steps may result in inappropriate statistical methods or incorrect conclusions. We outline a framework for initial data analysis and illustrate the impact of initial data analysis on research studies. Examples of reporting of initial data analysis in publications are given. A systematic and careful approach to initial data analysis is needed as good research practice.
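    In practice, the first pass of such an initial data analysis is often a mechanical screen for missing values and implausible entries before any substantive model is fitted. A minimal sketch of that screening step (the variable, values, and plausibility limits are invented for illustration):

```python
def screen_variable(values, low, high):
    """First-pass data screen: counts of missing and out-of-range values.

    `values` may contain None for missing entries; (low, high) are
    plausibility limits agreed on before the substantive analysis.
    """
    missing = sum(1 for v in values if v is None)
    present = [v for v in values if v is not None]
    out_of_range = sum(1 for v in present if not low <= v <= high)
    return {"n": len(values), "missing": missing,
            "out_of_range": out_of_range,
            "min": min(present), "max": max(present)}

# Hypothetical systolic blood pressure column with one missing and one
# implausible entry (plausibility limits 60-250 mmHg).
report = screen_variable([120, 135, None, 980, 110], low=60, high=250)
```

    The point of running such checks systematically, and documenting them, is exactly the framework's: problems like the implausible 980 are caught and handled before they silently distort the main analysis.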

  18. A practical method to evaluate radiofrequency exposure of mast workers.

    PubMed

    Alanko, Tommi; Hietanen, Maila

    2008-01-01

    Assessment of occupational exposure to radiofrequency (RF) fields in telecommunication transmitter masts is a challenging task. For conventional field strength measurements using manually operated instruments, it is difficult to document the locations of measurements while climbing up a mast. Logging RF dosemeters worn by the workers, on the other hand, do not give any information about the location of the exposure. In this study, a practical method was developed and applied to assess mast workers' exposure to RF fields and the corresponding location. This method uses a logging dosemeter for personal RF exposure evaluation and two logging barometers to determine the corresponding height of the worker's position on the mast. The procedure is not intended to be used for compliance assessments, but to indicate locations where stricter assessments are needed. The applicability of the method is demonstrated by making measurements in a TV and radio transmitting mast.

  20. Method of photon spectral analysis

    DOEpatents

    Gehrke, Robert J.; Putnam, Marie H.; Killian, E. Wayne; Helmer, Richard G.; Kynaston, Ronnie L.; Goodwin, Scott G.; Johnson, Larry O.

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and gamma-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV) as well as high-energy gamma rays (>1 MeV). An 8192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The gamma-ray portion of each spectrum is analyzed by a standard Ge gamma-ray analysis program. This method can be applied to any analysis involving x- and gamma-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the gamma-ray analysis and accommodated during the x-ray analysis.
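    The linear-least-squares spectral fitting step amounts to expressing the measured region as a linear combination of known component shapes and solving the normal equations for the amplitudes. A two-component toy version (the reference shapes and channel counts are fabricated; the patented method fits full L x-ray line shapes with many components):

```python
def fit_two_components(y, ref1, ref2):
    """Least-squares amplitudes (a, b) so that y ~ a*ref1 + b*ref2.

    Solves the 2x2 normal equations directly via Cramer's rule.
    """
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    g11, g12, g22 = dot(ref1, ref1), dot(ref1, ref2), dot(ref2, ref2)
    b1, b2 = dot(ref1, y), dot(ref2, y)
    det = g11 * g22 - g12 * g12
    a = (g22 * b1 - g12 * b2) / det
    b = (g11 * b2 - g12 * b1) / det
    return a, b

# Fabricated channel counts: a spectrum built from two reference shapes.
ref1 = [1.0, 0.0, 1.0]
ref2 = [0.0, 1.0, 1.0]
spectrum = [2.0, 3.0, 5.0]   # exactly 2*ref1 + 3*ref2
a, b = fit_two_components(spectrum, ref1, ref2)
```

    With noisy counts the fitted amplitudes are no longer exact, but the same normal-equations solve gives the least-squares estimate of each component's contribution.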

  1. Method of photon spectral analysis

    DOEpatents

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and gamma-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) as well as high-energy gamma rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The gamma-ray portion of each spectrum is analyzed by a standard Ge gamma-ray analysis program. This method can be applied to any analysis involving x- and gamma-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the gamma-ray analysis and accommodated during the x-ray analysis.

  2. Practical Nursing. Ohio's Competency Analysis Profile.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Vocational Instructional Materials Lab.

    Developed through a modified DACUM (Developing a Curriculum) process involving business, industry, labor, and community agency representatives in Ohio, this document is a comprehensive and verified employer competency profile for practical nursing. The list contains units (with and without subunits), competencies, and competency builders that…

  3. Development of a practical costing method for hospitals.

    PubMed

    Cao, Pengyu; Toyabe, Shin-Ichi; Akazawa, Kouhei

    2006-03-01

    To realize effective cost control, a practical and accurate cost accounting system is indispensable in hospitals. In traditional cost accounting systems, volume-based costing (VBC) is the most popular cost accounting method. In this method, the indirect costs are allocated to each cost object (services or units of a hospital) using a single indicator named a cost driver (e.g., labor hours, revenues, or the number of patients). However, this method often produces rough and inaccurate results. The activity-based costing (ABC) method introduced in the mid 1990s can provide more accurate results. With the ABC method, all events or transactions that cause costs are recognized as "activities", and a specific cost driver is prepared for each activity. Finally, the costs of activities are allocated to cost objects by the corresponding cost driver. However, it is much more complex and costly than other traditional cost accounting methods because the data collection for cost drivers is not always easy. In this study, we developed a simplified ABC (S-ABC) costing method to reduce the workload of ABC costing by reducing the number of cost drivers used in the ABC method. Using the S-ABC method, we estimated the cost of laboratory tests, and similarly accurate results were obtained with the ABC method (the largest difference was 2.64%). At the same time, the new method reduces the seven cost drivers used in the ABC method to four. Moreover, we performed an evaluation using other sample data from the physiological laboratory department to verify the effectiveness of this new method. In conclusion, the S-ABC method provides two advantages in comparison to the VBC and ABC methods: (1) it can obtain accurate results, and (2) it is simpler to perform. Once we reduce the number of cost drivers by applying the proposed S-ABC method to the data for the ABC method, we can easily perform the cost accounting using few cost drivers after the second round of costing.
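    The allocation mechanics of the ABC method described here can be sketched with two activities and two cost objects: each activity's cost pool is divided by its total driver volume to give a rate, and each object is charged for the driver units it consumes. All names and figures below are invented for illustration.

```python
# Activity-based costing sketch: activity cost pools, one driver each.
activity_costs = {"setup": 1000.0, "instrument_run": 3000.0}

# Driver units consumed by each cost object (e.g., two laboratory tests):
# "setup" is driven by batches, "instrument_run" by machine-hours.
driver_usage = {
    "test_A": {"setup": 4, "instrument_run": 20},
    "test_B": {"setup": 6, "instrument_run": 40},
}

def allocate(activity_costs, driver_usage):
    """Charge each cost object activity cost in proportion to driver use."""
    totals = {act: sum(u[act] for u in driver_usage.values())
              for act in activity_costs}
    rates = {act: activity_costs[act] / totals[act] for act in activity_costs}
    return {obj: sum(rates[act] * use for act, use in usage.items())
            for obj, usage in driver_usage.items()}

allocated = allocate(activity_costs, driver_usage)
```

    The S-ABC simplification keeps this same mechanism but merges activities so that fewer distinct drivers have to be measured.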


  5. Measurement practices: methods for developing content-valid student examinations.

    PubMed

    Bridge, Patrick D; Musial, Joseph; Frank, Robert; Roe, Thomas; Sawilowsky, Shlomo

    2003-07-01

    Measurement experts generally agree that a systematic approach to test construction will probably result in an instrument with sound psychometric properties. One fundamental method is called the blueprint approach to test construction. A test blueprint is a tool used in the process of generating content-valid exams by linking the subject matter delivered during instruction and the items appearing on the test. Unfortunately, this procedure, like other educational measurement practices, is often overlooked. A survey of curriculum administrators at 144 United States and international medical schools was conducted to assess the importance and prevalence of test blueprinting at their schools. Although most found test blueprinting to be very important, few require the practice. The purpose of this paper is to review the fundamental principles associated with achieving a high level of content validity when developing tests for students. The short-term efforts necessary to develop and integrate measurement theory into practice will lead to long-term gains for students, faculty and academic institutions.

  6. Methods for genetic linkage analysis using trisomies

    SciTech Connect

    Feingold, E.; Lamb, N.E.; Sherman, S.L.

    1994-09-01

    Certain genetic disorders (e.g. congenital cataracts, duodenal atresia) are rare in the general population, but more common in people with Down's syndrome. We present a method for using individuals with trisomy 21 to map genes for such traits. Our methods are analogous to methods for mapping autosomal dominant traits using affected relative pairs by looking for markers with greater than expected identity-by-descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected reduction to homozygosity in the chromosomes inherited from the non-disjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the gene, a confidence interval for that distance, and methods for computing power and sample sizes. The methods are described in the context of a gene-dosage model for the etiology of the disorder, but can be extended to other models. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers, how to test candidate genes, and how to handle the effect of reduced recombination associated with maternal meiosis I non-disjunction.
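    The core linkage test, asking whether more trisomic individuals than expected show reduction to homozygosity at a marker, can be sketched as an exact binomial test. This is an illustration, not the authors' exact statistic; the expected proportion under no linkage and the counts are invented:

```python
# Exact binomial tail: probability of observing k or more reductions to
# homozygosity among n informative trisomic individuals when the marker
# is unlinked and each reduction occurs with probability p.
from math import comb

def binom_sf(k, n, p):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical data: 60 informative individuals, expected proportion 0.5
# under no linkage, 42 observed reductions.
p_value = binom_sf(42, 60, 0.5)
print(f"p = {p_value:.4f}")
```

    A small p-value indicates an excess of homozygosity reduction consistent with linkage between the marker and the trait gene.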

  7. Landscape analysis: Theoretical considerations and practical needs

    USGS Publications Warehouse

    Godfrey, A.E.; Cleaves, E.T.

    1991-01-01

    Numerous systems of land classification have been proposed. Most have led directly to, or have been driven by, an author's philosophy of earth-forming processes. However, the practical need of classifying land for planning and management purposes requires that a system lead to predictions of the results of management activities. We propose a landscape classification system composed of 11 units, from realm (a continental mass) to feature (a splash impression). The classification concerns physical aspects rather than economic or social factors, and aims to merge land inventory with dynamic processes. Landscape units are organized using a hierarchical system so that information may be assembled and communicated at different levels of scale and abstraction. Our classification uses a geomorphic systems approach that emphasizes the geologic-geomorphic attributes of the units. Realm, major division, province, and section are formulated by subdividing large units into smaller ones. For the larger units we have followed Fenneman's delineations, which are well established in the North American literature. Areas and districts are aggregated into regions and regions into sections. Units smaller than areas have, in practice, been subdivided into zones and smaller units if required. We developed the theoretical framework embodied in this classification from practical applications aimed at land use planning and land management in Maryland (eastern Piedmont Province near Baltimore) and Utah (eastern Uinta Mountains). © 1991 Springer-Verlag New York Inc.

  8. A Practical Introduction to Analysis and Synthesis

    ERIC Educational Resources Information Center

    Williams, R. D.; Cosart, W. P.

    1976-01-01

    Discusses an introductory chemical engineering course in which mathematical models are used to analyze experimental data. Concepts illustrated include dimensional analysis, scaleup, heat transfer, and energy conservation. (MLH)

  9. Contingency Analysis: Toward a Unified Theory for Social Work Practice.

    ERIC Educational Resources Information Center

    Thyer, Bruce A.

    1987-01-01

    Draws from the empirical foundations of operant psychology to propose a unified theory, known as contingency analysis, for social work practice. Discusses the four propositions on which the theory is based that account for human behavior at all levels of social work practice. Shows that this approach has great utility for the profession.…

  10. Situational Analysis: A Framework for Evidence-Based Practice

    ERIC Educational Resources Information Center

    Annan, Jean

    2005-01-01

    Situational analysis is a framework for professional practice and research in educational psychology. The process is guided by a set of practice principles requiring that psychologists' work is evidence-based, ecological, collaborative and constructive. The framework is designed to provide direction for psychologists who wish to tailor their…

  11. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, A.C.

    1993-06-08

    An apparatus and method is described for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced apart evenly from each other and the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  12. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, Amy C.

    1993-01-01

    An apparatus and method for electrochemical analysis of elements in solution. An auxiliary electrode 14, a reference electrode 18, and five working electrodes 20, 22, 26, 28, and 30 are positioned in a container 12 containing a sample solution 34. The working electrodes are spaced apart evenly from each other and auxiliary electrode 14 to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between auxiliary electrode 14 and each of the working electrodes 20, 22, 26, 28, and 30. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in sample solution 34 and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  13. Correlation method of electrocardiogram analysis

    NASA Astrophysics Data System (ADS)

    Strinadko, Marina M.; Timochko, Katerina B.

    2002-02-01

    The electrocardiograph method is an informational source for characterizing the functional state of the heart. The electrocardiogram parameters form an integrated map of the many component characteristics of the heart system and depend on the perturbations of each of its components. In this work, we attempt to build a skeleton diagram of perturbations of the heart system by describing its basic components and the connections between them through transition functions, written as differential equations of the first and second order, with the aim of constructing and analyzing the electrocardiogram. Noting the vector character of the perturbation and the varying position of the heart in each organism, we propose a coordinate system attached to the heart. The comparative analysis of electrocardiograms was conducted using the correlation method.
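    One simple realization of a correlation comparison between two sampled traces is the Pearson coefficient (an assumption for illustration; the abstract does not specify its exact correlation measure, and the toy traces below are invented):

```python
# Pearson correlation between two sampled waveforms, one simple way to
# compare ECG traces numerically (toy data, illustrative only).
from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

trace_a = [0.0, 0.1, 0.9, 0.2, 0.0, -0.1]       # toy beat
trace_b = [0.0, 0.12, 0.85, 0.25, 0.02, -0.08]  # similar beat
print(round(pearson(trace_a, trace_b), 3))
```

    Values near 1 indicate closely matching waveform shapes regardless of amplitude scaling.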

  14. Comparison of four teaching methods on Evidence-based Practice skills of postgraduate nursing students.

    PubMed

    Fernandez, Ritin S; Tran, Duong Thuy; Ramjan, Lucie; Ho, Carey; Gill, Betty

    2014-01-01

    The aim of this study was to compare four teaching methods on the evidence-based practice knowledge and skills of postgraduate nursing students. Students enrolled in the Evidence-based Nursing (EBN) unit in Australia and Hong Kong in 2010 and 2011 received education via either the standard distance teaching method, computer laboratory teaching method, Evidence-based Practice-Digital Video Disc (EBP-DVD) teaching method or the didactic classroom teaching method. Evidence-based Practice (EBP) knowledge and skills were evaluated using student assignments that comprised validated instruments. One-way analysis of covariance was implemented to assess group differences on outcomes after controlling for the effects of age and grade point average (GPA). Data were obtained from 187 students. The crude mean score among students receiving the standard+DVD method of instruction was higher for developing a precise clinical question (8.1±0.8) and identifying the level of evidence (4.6±0.7) compared to those receiving other teaching methods. These differences were statistically significant after controlling for age and grade point average. Significant improvement in cognitive and technical EBP skills can be achieved for postgraduate nursing students by integrating a DVD as part of the EBP teaching resources. The EBP-DVD is an easy teaching method to improve student learning outcomes and ensure that external students receive equivalent and quality learning experiences.
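    The covariate-adjustment step of such a one-way ANCOVA can be sketched as follows: outcomes are adjusted using a pooled within-group slope on the covariate, then group means are compared at the grand covariate mean. The groups, GPA values and scores below are invented, not the study's data:

```python
# Covariate-adjusted group means: the adjustment step of a one-way ANCOVA.
# Outcomes are corrected with a pooled within-group slope on the covariate
# and compared at the grand covariate mean (illustrative data).
def adjusted_means(groups):
    """groups: {name: [(covariate, outcome), ...]}"""
    all_pts = [p for pts in groups.values() for p in pts]
    grand_x = sum(x for x, _ in all_pts) / len(all_pts)
    num = den = 0.0                      # pooled within-group slope
    for pts in groups.values():
        mx = sum(x for x, _ in pts) / len(pts)
        my = sum(y for _, y in pts) / len(pts)
        num += sum((x - mx) * (y - my) for x, y in pts)
        den += sum((x - mx) ** 2 for x, _ in pts)
    b = num / den
    out = {}
    for name, pts in groups.items():
        mx = sum(x for x, _ in pts) / len(pts)
        my = sum(y for _, y in pts) / len(pts)
        out[name] = my - b * (mx - grand_x)
    return out

# Hypothetical: GPA as covariate, EBP skill score as outcome, two methods.
data = {"standard+DVD": [(3.0, 8.0), (3.4, 8.4), (3.8, 8.6)],
        "didactic":     [(3.2, 7.2), (3.6, 7.8), (4.0, 8.0)]}
am = adjusted_means(data)
print(am)
```

    A full ANCOVA would additionally test whether the adjusted means differ significantly; this sketch shows only the adjustment itself.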


  16. Sequential multiple methods as a contemporary method in learning disability nursing practice research.

    PubMed

    Mafuba, Kay; Gates, Bob

    2012-12-01

    This paper explores and advocates the use of sequential multiple methods as a contemporary strategy for undertaking research. Sequential multiple methods involve the use of results obtained through one data collection method to determine the direction and implementation of subsequent stages of a research project (Morse, 1991; Morgan, 1998). The paper also explores how triangulating research at the epistemological, theoretical and methodological levels could enhance it. Finally, the paper evaluates the significance of sequential multiple methods in learning disability nursing practice research.

  17. SAR/QSAR methods in public health practice

    SciTech Connect

    Demchuk, Eugene; Ruiz, Patricia; Chou, Selene; Fowler, Bruce A.

    2011-07-15

    Methods of (Quantitative) Structure-Activity Relationship ((Q)SAR) modeling play an important and active role in ATSDR programs in support of the Agency mission to protect human populations from exposure to environmental contaminants. They are used for cross-chemical extrapolation to complement the traditional toxicological approach when chemical-specific information is unavailable. SAR and QSAR methods are used to investigate adverse health effects and exposure levels, bioavailability, and pharmacokinetic properties of hazardous chemical compounds. They are applied as a part of an integrated systematic approach in the development of Health Guidance Values (HGVs), such as ATSDR Minimal Risk Levels, which are used to protect populations exposed to toxic chemicals at hazardous waste sites. (Q)SAR analyses are incorporated into ATSDR documents (such as the toxicological profiles and chemical-specific health consultations) to support environmental health assessments, prioritization of environmental chemical hazards, and to improve study design, when filling the priority data needs (PDNs) as mandated by Congress, in instances when experimental information is insufficient. These cases are illustrated by several examples, which explain how ATSDR applies (Q)SAR methods in public health practice.

  18. A Practice-Based Analysis of an Online Strategy Game

    NASA Astrophysics Data System (ADS)

    Milolidakis, Giannis; Kimble, Chris; Akoumianakis, Demosthenes

    In this paper, we will analyze a massively multiplayer online game in an attempt to identify the elements of practice that enable social interaction and cooperation within the game’s virtual world. Communities of Practice and Activity Theory offer the theoretical lens for identifying and understanding what constitutes practice within the community and how such practice is manifest and transmitted during game play. Our analysis suggests that in contrast to prevalent perceptions of practice as being textually mediated, in virtual settings it is framed as much in social interactions as in processes, artifacts and the tools constituting the ‘linguistic’ domain of the game or the practice the gaming community is about.

  19. Comparative Lifecycle Energy Analysis: Theory and Practice.

    ERIC Educational Resources Information Center

    Morris, Jeffrey; Canzoneri, Diana

    1992-01-01

    Explores the position that more energy is conserved through recycling secondary materials than is generated from municipal solid waste incineration. Discusses one component of a lifecycle analysis--a comparison of energy requirements for manufacturing competing products. Includes methodological issues, energy cost estimates, and difficulties…

  20. A practical approach for linearity assessment of calibration curves under the International Union of Pure and Applied Chemistry (IUPAC) guidelines for an in-house validation of method of analysis.

    PubMed

    Sanagi, M Marsin; Nasir, Zalilah; Ling, Susie Lu; Hermawan, Dadan; Ibrahim, Wan Aini Wan; Naim, Ahmedy Abu

    2010-01-01

    Linearity assessment as required in method validation has always been subject to different interpretations and definitions by various guidelines and protocols. However, there are very limited applicable implementation procedures that can be followed by a laboratory chemist in assessing linearity. Thus, this work proposes a simple method for linearity assessment in method validation by a regression analysis that covers experimental design, estimation of the parameters, outlier treatment, and evaluation of the assumptions according to the International Union of Pure and Applied Chemistry guidelines. The suitability of this procedure was demonstrated by its application to an in-house validation for the determination of plasticizers in plastic food packaging by GC. PMID:20922968
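    The regression step of such a linearity assessment can be sketched as an ordinary least-squares fit with the coefficient of determination as one linearity indicator (a simplified illustration of the procedure, not the paper's full protocol; the calibration data are invented):

```python
# Fit y = a + b*x by ordinary least squares and report R^2,
# a common first check in calibration-curve linearity assessment.
def ols_fit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                       # slope
    a = my - b * mx                     # intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1 - ss_res / ss_tot

# Hypothetical plasticizer calibration: concentration vs detector response.
conc = [0.5, 1.0, 2.0, 4.0, 8.0]
resp = [10.2, 19.8, 40.5, 79.9, 160.3]
a, b, r2 = ols_fit(conc, resp)
print(round(b, 2), round(r2, 4))
```

    A full assessment along IUPAC lines would also examine residual patterns, outliers and the regression assumptions, not R^2 alone.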

  1. Tetrad Analysis: A Practical Demonstration Using Simple Models.

    ERIC Educational Resources Information Center

    Gow, Mary M.; Nicholl, Desmond S. T.

    1988-01-01

    Uses simple models to illustrate the principles of this genetic method of mapping gene loci. Stresses that this system enables a practical approach to be used with students who experience difficulty in understanding the concepts involved. (CW)

  2. Progress testing: critical analysis and suggested practices.

    PubMed

    Albanese, Mark; Case, Susan M

    2016-03-01

    Educators have long lamented the tendency of students to engage in rote memorization in preparation for tests rather than engaging in deep learning where they attempt to gain meaning from their studies. Rote memorization driven by objective exams has been termed a steering effect. Progress testing (PT), in which a comprehensive examination sampling all of medicine is administered repeatedly throughout the entire curriculum, was developed with the stated aim of breaking the steering effect of examinations and of promoting deep learning. PT is an approach historically linked to problem-based learning (PBL) although there is a growing recognition of its applicability more broadly. The purpose of this article is to summarize the salient features of PT drawn from the literature, provide a critical review of these features based upon the same literature and psychometric considerations drawn from the Standards for Educational and Psychological Testing and provide considerations of what should be part of best practices in applying PT from an evidence-based and a psychometric perspective. PMID:25662873

  3. Practical evaluation of Mung bean seed pasteurization method in Japan.

    PubMed

    Bari, M L; Enomoto, K; Nei, D; Kawamoto, S

    2010-04-01

    The majority of the seed sprout-related outbreaks have been associated with Escherichia coli O157:H7 and Salmonella. Therefore, an effective method for inactivating these organisms on the seeds before sprouting is needed. The current pasteurization method for mung beans in Japan (hot water treatment at 85 degrees C for 10 s) was more effective for disinfecting inoculated E. coli O157:H7, Salmonella, and nonpathogenic E. coli on mung bean seeds than was the calcium hypochlorite treatment (20,000 ppm for 20 min) recommended by the U.S. Food and Drug Administration. Hot water treatment at 85 degrees C for 40 s followed by dipping in cold water for 30 s and soaking in chlorine water (2,000 ppm) for 2 h reduced the pathogens to undetectable levels, and no viable pathogens were found in a 25-g enrichment culture and during the sprouting process. Practical tests using a working pasteurization machine with nonpathogenic E. coli as a surrogate produced similar results. The harvest yield of the treated seed was within the acceptable range. These treatments could be a viable alternative to the presently recommended 20,000-ppm chlorine treatment for mung bean seeds.

  4. Flow analysis system and method

    NASA Technical Reports Server (NTRS)

    Hill, Wayne S. (Inventor); Barck, Bruce N. (Inventor)

    1998-01-01

    A non-invasive flow analysis system and method wherein a sensor, such as an acoustic sensor, is coupled to a conduit for transmitting a signal which varies depending on the characteristics of the flow in the conduit. The signal is amplified and there is a filter, responsive to the sensor signal, and tuned to pass a narrow band of frequencies proximate the resonant frequency of the sensor. A demodulator generates an amplitude envelope of the filtered signal and a number of flow indicator quantities are calculated based on variations in amplitude of the amplitude envelope. A neural network, or its equivalent, is then used to determine the flow rate of the flow in the conduit based on the flow indicator quantities.
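    The envelope stage of such a pipeline can be sketched by rectifying the filtered sensor signal and smoothing it with a moving average (a stand-in for the demodulator described above, not the patented implementation; the toy signal and window length are invented):

```python
# Sketch of the envelope stage: rectify a band-limited sensor signal and
# smooth it with a trailing moving average to approximate its amplitude
# envelope (illustrative only).
from math import sin, pi

def envelope(signal, window):
    rect = [abs(s) for s in signal]
    out = []
    for i in range(len(rect)):
        lo = max(0, i - window + 1)
        out.append(sum(rect[lo:i + 1]) / (i + 1 - lo))
    return out

# Toy "sensor" signal: a carrier whose amplitude grows with flow.
sig = [(0.5 + 0.005 * t) * sin(2 * pi * t / 8) for t in range(200)]
env = envelope(sig, 16)
# The smoothed envelope should rise as the modulating amplitude rises.
print(round(env[50], 3), round(env[190], 3))
```

    Flow-indicator quantities would then be computed from variations of `env`, and a trained model would map them to a flow rate.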

  5. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  6. Diagnostic Methods for Bile Acid Malabsorption in Clinical Practice

    PubMed Central

    Vijayvargiya, Priya; Camilleri, Michael; Shin, Andrea; Saenger, Amy

    2013-01-01

    Altered bile acid (BA) concentrations in the colon may cause diarrhea or constipation. BA malabsorption (BAM) accounts for >25% of patients with irritable bowel syndrome (IBS) with diarrhea and chronic diarrhea in Western countries. As BAM is increasingly recognized, proper diagnostic methods are desired in clinical practice to help direct the most effective treatment course for the chronic bowel dysfunction. This review appraises the methodology, advantages and disadvantages of 4 tools that directly measure BAM: 14C-glycocholate breath and stool test, 75Selenium HomotauroCholic Acid Test (SeHCAT), 7 α-hydroxy-4-cholesten-3-one (C4) and fecal BAs. 14C-glycocholate is a laborious test no longer widely utilized. 75SeHCAT is validated, but not available in the United States. Serum C4 is a simple, accurate method that is applicable to a majority of patients, but requires further clinical validation. Fecal measurements to quantify total and individual fecal BAs are technically cumbersome and not widely available. Regrettably, none of these tests are routinely available in the U.S., and a therapeutic trial with a BA binder is used as a surrogate for diagnosis of BAM. Recent data suggest there is an advantage to studying fecal excretion of the individual BAs and their role in BAM; this may constitute a significant advantage of the fecal BA method over the other tests. Fecal BA test could become a routine addition to fecal fat measurement in patients with unexplained diarrhea. In summary, availability determines the choice of test among C4, SeHCAT and fecal BA; more widespread availability of such tests would enhance clinical management of these patients. PMID:23644387

  7. Practical methods for meeting remediation goals at hazardous waste sites.

    PubMed

    Schulz, T W; Griffin, S

    2001-02-01

    Risk-based cleanup goals or preliminary remediation goals (PRGs) are established at hazardous waste sites when contaminant concentrations in air, soil, surface water, or groundwater exceed specified acceptable risk levels. When derived in accordance with the Environmental Protection Agency's risk assessment guidance, the PRG is intended to represent the average contaminant concentration within an exposure unit area that is left on the site following remediation. The PRG, however, frequently has been used inconsistently at Superfund sites with a number of remediation decisions using the PRG as a not-to-exceed concentration (NTEC). Such misapplications could result in overly conservative and unnecessarily costly remedial actions. The PRG should be applied in remedial actions in the same manner in which it was generated. Statistical methods, such as Bower's Confidence Response Goal, and mathematical methods such as "iterative removal of hot spots," are available to assist in the development of NTECs that ensure the average postremediation contaminant concentration is at or below the PRG. These NTECs can provide the risk manager with a more practical cleanup goal. In addition, an acute PRG can be developed to ensure that contaminant concentrations left on-site following remediation are not so high as to pose an acute or short-term health risk if excessive exposure to small areas of the site should occur. A case study demonstrates cost savings of five to ten times associated with the more scientifically sound use of the PRG as a postremediation site average, and development of a separate NTEC and acute PRG based on the methods referenced in this article.
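    The "iterative removal of hot spots" idea can be sketched as follows: repeatedly remediate the highest-concentration cell until the exposure-unit average falls to the PRG (a minimal illustration; the grid concentrations, PRG and background value are invented):

```python
# Iteratively "remediate" the highest-concentration cell (set it to a
# post-cleanup background value) until the exposure-unit average <= PRG.
def hot_spot_removal(cells, prg, background=0.0):
    cells = list(cells)
    removed = 0
    while sum(cells) / len(cells) > prg:
        i = cells.index(max(cells))
        cells[i] = background
        removed += 1
    return cells, removed

# Hypothetical soil concentrations (mg/kg) over 8 grid cells, PRG = 10.
conc = [4, 6, 55, 9, 12, 7, 3, 40]
after, n = hot_spot_removal(conc, prg=10.0)
print(n, sum(after) / len(after))  # 2 hot spots removed; average now 5.125
```

    The highest concentration left untouched by this process is one candidate for a not-to-exceed concentration (NTEC) consistent with the PRG as a post-remediation average.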

  8. Analysis Methods of Magnesium Chips

    NASA Astrophysics Data System (ADS)

    Ohmann, Sven; Ditze, André; Scharf, Christiane

    2015-11-01

    The quality of recycled magnesium from chips depends strongly on their exposure to inorganic and organic impurities added during the production processes. Different kinds of magnesium chips from these processes were analyzed by several methods, and the accuracy and effectiveness of the methods are discussed. The results show that the chips belong either to the AZ91, AZ31, AM50/60, or AJ62 alloy, although some kinds of chips deviate from the specifications of these alloys. The impurities result mainly from transition metals and lime. The water and oil content does not exceed 25%, and the chip size is not more than 4 mm in diameter. The sieve analysis shows good results for oily and wet chips. For the determination of oil and water, Soxhlet extraction gives better results than the addition of lime and vacuum distillation. The most accurate values for water and oil are obtained by drying at 110°C (for water) and washing with acetone (for oil) by hand.
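    The gravimetric determination of water and oil content reduces to a simple mass balance: water is the mass lost on drying, oil the further mass lost on solvent washing (a sketch with invented sample masses, not the paper's data):

```python
# Gravimetric water/oil determination by sequential mass loss (illustrative):
# water = mass lost on drying at 110 degrees C, oil = further mass lost on
# washing the dried chips with solvent; both relative to the wet sample.
def contents(m_wet, m_dried, m_washed):
    water = (m_wet - m_dried) / m_wet
    oil = (m_dried - m_washed) / m_wet
    return water, oil

# Hypothetical 100 g chip sample: 91.5 g after drying, 84.0 g after washing.
w, o = contents(m_wet=100.0, m_dried=91.5, m_washed=84.0)
print(f"water {w:.1%}, oil {o:.1%}")  # water 8.5%, oil 7.5%
```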

  9. Extended morphological processing: a practical method for automatic spot detection of biological markers from microscopic images

    PubMed Central

    2010-01-01

    Background A reliable extraction technique for resolving multiple spots in light or electron microscopic images is essential in investigations of the spatial distribution and dynamics of specific proteins inside cells and tissues. Currently, automatic spot extraction and characterization in complex microscopic images poses many challenges to conventional image processing methods. Results A new method to extract closely located, small target spots from biological images is proposed. This method starts with a simple but practical operation based on the extended morphological top-hat transformation to subtract an uneven background. The core of our novel approach is the following: first, the original image is rotated in an arbitrary direction and each rotated image is opened with a single straight line-segment structuring element. Second, the opened images are unified and then subtracted from the original image. To evaluate these procedures, model images of simulated spots with closely located targets were created and the efficacy of our method was compared to that of conventional morphological filtering methods. The results showed the better performance of our method. Spots in real microscope images can also be quantified, confirming that the method is applicable in practice. Conclusions Our method achieved effective spot extraction under various image conditions, including aggregated target spots, poor signal-to-noise ratio, and large variations in the background intensity. Furthermore, it has no restrictions with respect to the shape of the extracted spots. The features of our method allow its broad application in biological and biomedical image information analysis. PMID:20615231
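    The core of the line-opening procedure can be sketched without image rotation by opening with line structuring elements at several fixed orientations, which the rotation scheme approximates (a minimal grayscale illustration on an invented toy image, not the authors' implementation):

```python
# Sketch of the line-opening core: open a grayscale image with straight
# line-segment structuring elements at several orientations, take the
# pixelwise maximum of the openings, and subtract it from the original.
# Bright spots narrower than the line length in every direction survive.

def erode(img, offsets):
    h, w = len(img), len(img[0])
    # offsets falling outside the image are ignored (border clipping)
    return [[min(img[y + dy][x + dx] for dy, dx in offsets
                 if 0 <= y + dy < h and 0 <= x + dx < w)
             for x in range(w)] for y in range(h)]

def dilate(img, offsets):
    h, w = len(img), len(img[0])
    return [[max(img[y + dy][x + dx] for dy, dx in offsets
                 if 0 <= y + dy < h and 0 <= x + dx < w)
             for x in range(w)] for y in range(h)]

def line_top_hat(img, length=5):
    half = length // 2
    lines = [[(0, d) for d in range(-half, half + 1)],   # horizontal
             [(d, 0) for d in range(-half, half + 1)],   # vertical
             [(d, d) for d in range(-half, half + 1)],   # diagonal
             [(d, -d) for d in range(-half, half + 1)]]  # anti-diagonal
    opened = [dilate(erode(img, se), se) for se in lines]
    h, w = len(img), len(img[0])
    union = [[max(o[y][x] for o in opened) for x in range(w)] for y in range(h)]
    return [[img[y][x] - union[y][x] for x in range(w)] for y in range(h)]

# Toy image: one isolated bright spot on a dark background survives.
img = [[0] * 9 for _ in range(9)]
img[4][4] = 10
spots = line_top_hat(img)
print(spots[4][4])  # 10: the spot is kept
```

    Elongated structures at least as long as the structuring element are reproduced by at least one opening and therefore cancel out in the subtraction.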

  10. Practical Framework: Implementing OEE Method in Manufacturing Process Environment

    NASA Astrophysics Data System (ADS)

    Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.

    2016-02-01

    Manufacturing process environments require reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to operate and produce a quality product at its maximum designed capability. In practice, however, machines are often unable to achieve the desired performance. Since this performance affects the productivity of the system, a measurement technique should be applied. Overall Equipment Effectiveness (OEE) is a good method to measure machine performance, and the reliable results it produces can then be used to propose suitable corrective action. Many published papers discuss the purpose and benefits of OEE, covering the "what" and "why" factors; the "how" factor, however, has received little attention, especially the implementation of OEE in a manufacturing process environment. This paper therefore presents a practical framework for implementing OEE, and a case study is discussed to explain each proposed step in detail. The proposed framework is beneficial to engineers, especially beginners, who want to start measuring machine performance and later improve it.
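    OEE itself is the product of three standard ratios: availability, performance and quality (a textbook definition, not something specific to this paper). A sketch with an invented shift:

```python
# OEE = availability x performance x quality (standard definition).
def oee(planned_min, downtime_min, ideal_rate, total_count, good_count):
    run_time = planned_min - downtime_min
    availability = run_time / planned_min          # uptime share
    performance = total_count / (run_time * ideal_rate)  # speed share
    quality = good_count / total_count             # good-part share
    return availability * performance * quality

# Hypothetical shift: 480 planned minutes, 60 min down, ideal 1 part/min,
# 378 parts produced of which 360 were good.
score = oee(planned_min=480, downtime_min=60, ideal_rate=1.0,
            total_count=378, good_count=360)
print(f"{score:.1%}")  # 75.0%
```

    Tracking the three factors separately shows whether losses come from downtime, reduced speed, or defects, which is what guides the corrective action.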

  11. A practical method of chronic ethanol administration in mice.

    PubMed

    Coleman, Ruth A; Young, Betty M; Turner, Lucas E; Cook, Robert T

    2008-01-01

    Mice provide a useful model for the study of immune deficiency caused by chronic alcohol abuse. Their suitability is related to several factors, including in particular the extensive knowledge base in the immunology of mice already existing in the literature. Specific modeling of the immunodeficiency of the chronic human alcoholic requires that ethanol be administered to the model for a significant portion of its life span. In mice, it has proven necessary to administer ethanol daily for up to 32 wk or longer to observe all the immune abnormalities that occur in middle-aged alcoholic humans. Such time spans are problematic with many of the common protocols for ethanol administration. It has been shown by others and confirmed by our group that the most practical way of accomplishing such long protocols is by administering ethanol in the drinking water as the animals' only fluid source. Details of managing the chronic ethanol mouse colony that are necessary for the success of such studies are described here, including methods for initiating ethanol administration, maintenance of barrier protection, monitoring weight gain, strain differences and fetal alcohol exposure.

  12. Capital investment analysis: three methods.

    PubMed

    Gapenski, L C

    1993-08-01

    Three cash flow/discount rate methods can be used when conducting capital budgeting financial analyses: the net operating cash flow method, the net cash flow to investors method, and the net cash flow to equity holders method. The three methods differ in how the financing mix and the benefits of debt financing are incorporated. This article explains the three methods, demonstrates that they are essentially equivalent, and recommends which method to use under specific circumstances.
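
    All three methods reduce to discounting a particular cash-flow stream at a matching rate. A minimal sketch of the first of them, the net operating cash flow method, in which operating (free) cash flows are discounted at the weighted average cost of capital so that the benefit of debt financing enters through the rate rather than the flows (all figures below are hypothetical):

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] occurs at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def wacc(equity_value, debt_value, cost_equity, cost_debt, tax_rate):
    """Weighted average cost of capital; the debt tax shield is
    captured in the discount rate, not in the cash flows."""
    total = equity_value + debt_value
    return (equity_value / total) * cost_equity \
         + (debt_value / total) * cost_debt * (1 - tax_rate)
```

    With 600 of equity at 12%, 400 of debt at 8%, and a 30% tax rate, the WACC is 0.0944; discounting operating flows of -100, 60, 60 at 10% gives an NPV of about 4.13. The other two methods would instead discount investor or equity cash flows at correspondingly adjusted rates.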

  13. Practical limitations of the slope assisted BOTDA method in dynamic strain sensing

    NASA Astrophysics Data System (ADS)

    Minardo, A.; Catalano, E.; Zeni, L.

    2016-05-01

    By analyzing the operation of the slope assisted Brillouin Optical Time-Domain Analysis (BOTDA) method, we find that the acquisition rate is practically limited by two fundamental factors: the polarization scrambling frequency and the phase noise of the laser. As regards polarization scrambling, we show experimentally that the scrambling frequency poses a limit on the maximum acquisition rate for a given averaging factor. As regards phase noise, we show numerically and experimentally that the slope assisted method is particularly sensitive to laser phase noise, due to the specific positioning of the pump-probe frequency shift on the Brillouin Gain Spectrum (BGS).
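
    In the slope-assisted scheme, the probe is parked on the slope of the Brillouin gain spectrum, so a gain change maps approximately linearly to a Brillouin frequency shift. A minimal numerical sketch (Lorentzian gain shape assumed; linewidth and frequency values are hypothetical):

```python
def lorentzian_gain(nu, bfs, linewidth):
    """Normalized Lorentzian Brillouin gain spectrum (peak = 1 at nu = bfs)."""
    x = 2.0 * (nu - bfs) / linewidth
    return 1.0 / (1.0 + x * x)

def slope_assisted_shift(measured_gain, linewidth):
    """Convert a gain reading at the upper half-maximum point into a BFS shift.
    There the normalized Lorentzian changes by 1/linewidth per unit of BFS
    increase, so the inversion is a simple linear rescaling."""
    return (measured_gain - 0.5) * linewidth
```

    With a 30 MHz linewidth and the probe fixed at the half-maximum point, a 1 MHz shift of the BFS changes the gain from 0.500 to about 0.534, and the linear inversion recovers the shift to within a few percent; larger shifts leave the linear region and the error grows.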

  14. Trends in sensitivity analysis practice in the last decade.

    PubMed

    Ferretti, Federico; Saltelli, Andrea; Tarantola, Stefano

    2016-10-15

    The majority of published sensitivity analyses (SAs) are either local or one factor-at-a-time (OAT) analyses, relying on unjustified assumptions of model linearity and additivity. Global approaches to sensitivity analysis (GSA), which would obviate these shortcomings, are applied by a minority of researchers. By reviewing the academic literature on SA, we here present a bibliometric analysis of the trends of different SA practices in the last decade. The review has been conducted both on some top-ranking journals (Nature and Science) and through an extended analysis in Elsevier's Scopus database of scientific publications. After correcting for the global growth in publications, the number of papers performing a generic SA has notably increased over the last decade. Even though OAT is still the most widely used technique in SA, there is a clear increase in the use of GSA, with a preference for regression- and variance-based techniques. Even after adjusting for the growth of publications in the sole modelling field, to which SA and GSA normally apply, the trend is confirmed. Data about regions of origin and discipline are also briefly discussed. The results above are confirmed when zooming in on the sole articles published in chemical modelling, a field historically proficient in the use of SA methods. PMID:26934843
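
    The difference between OAT and variance-based GSA can be made concrete with the first-order Sobol index, S_i = Var(E[Y|X_i]) / Var(Y). The sketch below estimates it by binning a Monte Carlo sample on X_i and averaging Y within each bin; the test function, sample size, and bin count are illustrative assumptions (production GSA would typically use Saltelli-type sampling instead):

```python
import numpy as np

def first_order_index(xs, y, bins=50):
    """Estimate the first-order Sobol index S_i = Var(E[Y|X_i]) / Var(Y)
    by binning X_i into equal-count bins and averaging Y per bin."""
    edges = np.quantile(xs, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, xs, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()
```

    For the linear model y = x1 + 2*x2 with independent uniform inputs (x3 inert), the analytic indices are S1 = 0.2, S2 = 0.8, S3 = 0; an OAT sweep would detect that x3 is inert but, unlike GSA, cannot apportion output variance or expose interactions.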

  16. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on some selected automatic quantitative methods for analysing thermal images. It shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods for the area of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.) PMID:26556680

  18. Novel methods for spectral analysis

    NASA Astrophysics Data System (ADS)

    Roy, R.; Sumpter, B. G.; Pfeffer, G. A.; Gray, S. K.; Noid, D. W.

    1991-06-01

    In this review article, various techniques for obtaining estimates of parameters related to the spectrum of an underlying process are discussed. These techniques include the conventional nonparametric FFT approach and more recently developed parametric techniques such as maximum entropy, MUSIC, and ESPRIT, the latter two being classified as signal-subspace or eigenvector techniques. These estimators span the spectrum of possible estimators in that extremes of a priori knowledge are assumed (nonparametric versus parametric) and extremes in the underlying model of the observed process (deterministic versus stochastic) are involved. The advantage of parametric techniques is their ability to provide very accurate estimates using data from extremely short time intervals. Several applications of these novel methods for frequency analysis of very short time data are presented. These include calculation of dispersion curves, and the density of vibrational states g(ω) for many-body systems, semiclassical transition frequencies, overtone linewidths, and resonance energies of the time-dependent Schrödinger equation for few-body problems.

  19. A practical method to obtain reproducible binocular electroretinograms in dogs.

    PubMed

    Rosolen, Serge Georges; Rigaudiere, Florence; Lachapelle, Pierre

    2002-09-01

    We present a simple method to record highly reproducible binocular electroretinograms in dogs. Rod and cone electroretinograms were elicited in 60 Beagle dogs, with the use of two adjustable photostimulators, one directed at each eye, and maintained in position with the use of a special device. Data analysis revealed no significant differences in amplitudes between the right and the left eye for each animal and each recording session, thus attesting to the high reliability of our approach. In a few instances, however, small interocular timing differences were noted. The proposed approach could therefore be used not only in a clinical setup, where the functional status of the two eyes is often needed to reach a diagnosis, but also in research projects, such as toxicological assays, where the experimentation is performed on one eye while the other is kept as control. PMID:12462439

  20. Key steps in the strategic analysis of a dental practice.

    PubMed

    Armstrong, J L; Boardman, A E; Vining, A R

    1999-01-01

    As dentistry is becoming increasingly competitive, dentists must focus more on strategic analysis. This paper lays out seven initial steps that are the foundation of strategic analysis. It introduces and describes the use of service-customer matrices and location-proximity maps as tools in competitive positioning. The paper also contains a brief overview of the role of differentiation and cost-control in determining key success factors for dental practices.

  1. Procedural Fidelity: An Analysis of Measurement and Reporting Practices

    ERIC Educational Resources Information Center

    Ledford, Jennifer R.; Wolery, Mark

    2013-01-01

    A systematic analysis was conducted of measurement and reporting practices related to procedural fidelity in single-case research for the past 30 years. Previous reviews of fidelity primarily reported whether fidelity data were collected by authors; these reviews reported that collection was variable, but low across journals and over time. Results…

  2. [Practice analysis: culture shock and adaptation at work].

    PubMed

    Philippe, Séverine; Didry, Pascale

    2015-12-01

    Constructed as a practice analysis, this personal account presents the reflection undertaken by a student on placement in Ireland thanks to the Erasmus programme. She describes in detail the stages of her adaptation in a hospital setting which is considerably different to her usual environment. PMID:26654501

  3. A Model of Practice in Special Education: Dynamic Ecological Analysis

    ERIC Educational Resources Information Center

    Hannant, Barbara; Lim, Eng Leong; McAllum, Ruth

    2010-01-01

    Dynamic Ecological Analysis (DEA) is a model of practice which increases a team's efficacy by enabling the development of more effective interventions through collaboration and collective reflection. This process has proved to be useful in: a) clarifying thinking and problem-solving, b) transferring knowledge and thinking to significant parties,…

  4. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    ERIC Educational Resources Information Center

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we…

  5. Practicing oncology in provincial Mexico: a narrative analysis.

    PubMed

    Hunt, L M

    1994-03-01

    This paper examines the discourse of oncologists treating cancer in a provincial capital of southern Mexico. Based on an analysis of both formal interviews and observations of everyday clinical practice, it examines a set of narrative themes they used to maintain a sense of professionalism and possibility as they endeavored to apply a highly technologically dependent biomedical model in a resource-poor context. They moved between coexisting narrative frameworks as they addressed their formidable problems of translating between theory and practice. In a biomedical narrative frame, they drew on biomedical theory to produce a model of cellular dysfunction and of clinical intervention. However, the limited availability of diagnostic and treatment techniques, and patients' inability or unwillingness to comply, presented serious constraints to the application of this model. They used a practical narrative frame to discuss the socio-economic issues they understood to be underlying these limitations to their clinical practice. They did not experience the incongruity between theory and practice as a continual challenge to their biomedical model, nor to their professional competency. Instead, through a reconciling narrative frame, they mediated this conflict. In this frame, they drew on culturally specific concepts of moral rightness and order to produce accounts that minimized the problem, exculpated themselves and cast blame for failed diagnosis and treatment. By invoking these multiple, coexisting narrative themes, the oncologists sustained an open vision of their work in which deficiencies and impotency were vindicated, and did not stand in the way of clinical practice. PMID:8184335

  6. Coal Field Fire Fighting - Practiced methods, strategies and tactics

    NASA Astrophysics Data System (ADS)

    Wündrich, T.; Korten, A. A.; Barth, U. H.

    2009-04-01

    achieved. For effective and efficient fire fighting, optimal tactics are required, and these can be divided into four fundamental tactics to control fire hazards: - Defense (digging away the coal, so that the coal cannot begin to burn; or forming a barrier, so that the fire cannot reach the unburnt coal), - Rescue the coal (coal mining of a seam not yet burning), - Attack (active and direct cooling of a burning seam), - Retreat (only monitoring until self-extinction of a burning seam). The last one is used when a fire exceeds the organizational and/or technical scope of a mission. In other words, "to control a coal fire" does not automatically and in all situations mean "to extinguish a coal fire". Best-practice tactics or a combination of them can be selected for control of a particular coal fire. For the extinguishing works, different extinguishing agents are available. They can be applied by different application techniques and with varying operating expenses. One application method may be the drilling of boreholes from the surface, or covering the surface with low-permeability soils. The most commonly used extinguishing agents for coal field fires are as follows: water (with or without additives), slurry, foaming mud/slurry, inert gases, dry chemicals and materials, and cryogenic agents. Because of its tremendous dimensions and its complexity, the worldwide challenge of coal fires is absolutely unique - it can only be solved with functional application methods, best-fitting strategies and tactics, organisation and research, as well as the dedication of the involved fire fighters, who work under extreme individual risks on the burning coal fields.

  7. Efficient methods and practical guidelines for simulating isotope effects

    NASA Astrophysics Data System (ADS)

    Ceriotti, Michele; Markland, Thomas E.

    2013-01-01

    The shift in chemical equilibria due to isotope substitution is frequently exploited to obtain insight into a wide variety of chemical and physical processes. It is a purely quantum mechanical effect, which can be computed exactly using simulations based on the path integral formalism. Here we discuss how these techniques can be made dramatically more efficient, and how they ultimately outperform quasi-harmonic approximations to treat quantum liquids not only in terms of accuracy, but also in terms of computational cost. To achieve this goal we introduce path integral quantum mechanics estimators based on free energy perturbation, which enable the evaluation of isotope effects using only a single path integral molecular dynamics trajectory of the naturally abundant isotope. We use as an example the calculation of the free energy change associated with H/D and 16O/18O substitutions in liquid water, and of the fractionation of those isotopes between the liquid and the vapor phase. In doing so, we demonstrate and discuss quantitatively the relative benefits of each approach, thereby providing a set of guidelines that should facilitate the choice of the most appropriate method in different, commonly encountered scenarios. The efficiency of the estimators we introduce and the analysis that we perform should in particular facilitate accurate ab initio calculation of isotope effects in condensed phase systems.
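
    The free energy perturbation idea underlying these estimators can be illustrated on a classical toy problem. The sketch below applies the Zwanzig identity, dF = -kT ln<exp(-dU/kT)>_0, to Boltzmann samples of a 1D harmonic reference potential (the stiffness values are hypothetical, and this classical sketch stands in for, rather than reproduces, the paper's path-integral estimators); the exact answer here is (kT/2) ln(k1/k0):

```python
import numpy as np

def fep_delta_f(delta_u, kT=1.0):
    """Zwanzig free energy perturbation estimator:
    dF = -kT * ln < exp(-dU/kT) >_0, averaged over reference-state samples."""
    return -kT * np.log(np.mean(np.exp(-delta_u / kT)))

# Reference state U0 = 0.5*k0*x^2, target state U1 = 0.5*k1*x^2 (classical).
rng = np.random.default_rng(1)
k0, k1, kT = 1.0, 1.2, 1.0
x = rng.normal(0.0, np.sqrt(kT / k0), size=200_000)   # Boltzmann samples of U0
delta_u = 0.5 * (k1 - k0) * x**2                      # U1(x) - U0(x)
estimate = fep_delta_f(delta_u, kT)
exact = 0.5 * kT * np.log(k1 / k0)    # from the Gaussian partition functions
```

    A single set of reference-state samples suffices because only the energy difference dU is re-evaluated, which mirrors the paper's strategy of computing isotope effects from one trajectory of the naturally abundant isotope.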

  8. Efficient methods and practical guidelines for simulating isotope effects.

    PubMed

    Ceriotti, Michele; Markland, Thomas E

    2013-01-01

    The shift in chemical equilibria due to isotope substitution is frequently exploited to obtain insight into a wide variety of chemical and physical processes. It is a purely quantum mechanical effect, which can be computed exactly using simulations based on the path integral formalism. Here we discuss how these techniques can be made dramatically more efficient, and how they ultimately outperform quasi-harmonic approximations to treat quantum liquids not only in terms of accuracy, but also in terms of computational cost. To achieve this goal we introduce path integral quantum mechanics estimators based on free energy perturbation, which enable the evaluation of isotope effects using only a single path integral molecular dynamics trajectory of the naturally abundant isotope. We use as an example the calculation of the free energy change associated with H/D and (16)O/(18)O substitutions in liquid water, and of the fractionation of those isotopes between the liquid and the vapor phase. In doing so, we demonstrate and discuss quantitatively the relative benefits of each approach, thereby providing a set of guidelines that should facilitate the choice of the most appropriate method in different, commonly encountered scenarios. The efficiency of the estimators we introduce and the analysis that we perform should in particular facilitate accurate ab initio calculation of isotope effects in condensed phase systems. PMID:23298033

  9. Prognostic Analysis System and Methods of Operation

    NASA Technical Reports Server (NTRS)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical system health applicable to mechanical, electrical, chemical and optical systems and methods of operating the system are described herein.

  10. A Task-Based Needs Analysis: Putting Principles into Practice

    ERIC Educational Resources Information Center

    Lambert, Craig

    2010-01-01

    This study triangulates multiple data sources and methods to build a consensus on the English-language tasks faced by graduates in their lives and careers as a practical basis for L2 program development. It addresses a problem similar to what West (1994) refers to as TENOR (Teaching English for No Obvious Reason). TENOR is problematic in that it…

  11. Practical Aspects of the Equation-Error Method for Aircraft Parameter Estimation

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2006-01-01

    Various practical aspects of the equation-error approach to aircraft parameter estimation were examined. The analysis was based on simulated flight data from an F-16 nonlinear simulation, with realistic noise sequences added to the computed aircraft responses. This approach exposes issues related to the parameter estimation techniques and results, because the true parameter values are known for simulation data. The issues studied include differentiating noisy time series, maximum likelihood parameter estimation, biases in equation-error parameter estimates, accurate computation of estimated parameter error bounds, comparisons of equation-error parameter estimates with output-error parameter estimates, analyzing data from multiple maneuvers, data collinearity, and frequency-domain methods.
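
    Equation-error estimation amounts to a linear least-squares fit of the force or moment equations to measured data. The sketch below (simulated data; the pitching-moment model and coefficient values are hypothetical, and the regressors are generated noise-free, so the bias issue studied in the paper does not appear here) recovers the parameters together with their standard-error bounds:

```python
import numpy as np

def equation_error_fit(X, z):
    """Ordinary least-squares equation-error estimate with parameter
    standard errors derived from the residual variance."""
    theta, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ theta
    dof = X.shape[0] - X.shape[1]
    sigma2 = resid @ resid / dof
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return theta, np.sqrt(np.diag(cov))

# Hypothetical pitching-moment model: Cm = Cm0 + Cm_alpha*alpha + Cm_q*qhat
rng = np.random.default_rng(2)
n = 500
alpha = rng.uniform(-0.1, 0.1, n)          # angle of attack, rad
qhat = rng.uniform(-0.05, 0.05, n)         # nondimensional pitch rate
true = np.array([0.02, -0.8, -12.0])
X = np.column_stack([np.ones(n), alpha, qhat])
z = X @ true + rng.normal(0.0, 0.001, n)   # measured Cm with output noise
theta, se = equation_error_fit(X, z)
```

    When the regressors themselves are noisy (the realistic flight-data case examined in the paper), this plain least-squares solution becomes biased, which motivates the smoothing and bias-correction aspects discussed above.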

  12. Teaching the Best Practice Way: Methods That Matter, K-12

    ERIC Educational Resources Information Center

    Daniels, Harvey; Bizar, Marilyn

    2004-01-01

    Everyone talks about "best practice" teaching--what does it actually look like in the classroom? How do working teachers translate complex curriculum standards into simple, workable classroom structures that embody exemplary instruction--and still let kids find joy in learning? In this book, the authors present seven basic teaching structures that…

  13. Parallel Processable Cryptographic Methods with Unbounded Practical Security.

    ERIC Educational Resources Information Center

    Rothstein, Jerome

    Addressing the problem of protecting confidential information and data stored in computer databases from access by unauthorized parties, this paper details coding schemes which present such astronomical work factors to potential code breakers that security breaches are hopeless in any practical sense. Two procedures which can be used to encode for…

  14. Learning by the Case Method: Practical Approaches for Community Leaders.

    ERIC Educational Resources Information Center

    Stenzel, Anne K.; Feeney, Helen M.

    This supplement to Volunteer Training and Development: A Manual for Community Groups, provides practical guidance in the selection, writing, and adaptation of effective case materials for specific educational objectives, and develops suitable cases for use by analyzing concrete situations and by offering illustrations of various types. An…

  15. Convergence analysis of combinations of different methods

    SciTech Connect

    Kang, Y.

    1994-12-31

    This paper provides a convergence analysis for combinations of different numerical methods for solving systems of differential equations. The author proves that combinations of two convergent linear multistep methods or Runge-Kutta methods produce a new convergent method of which the order is equal to the smaller order of the two original methods.

  16. Trial Sequential Methods for Meta-Analysis

    ERIC Educational Resources Information Center

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…

  17. Convex geometry analysis method of hyperspectral data

    NASA Astrophysics Data System (ADS)

    Gong, Yanjun; Wang, XiChang; Qi, Hongxing; Yu, BingXi

    2003-06-01

    We present a matrix expression of the convex geometry analysis method for hyperspectral data based on a linear mixing model, and establish a mathematical model of endmembers. A 30-band remote sensing image is used to test the model. The results of the analysis reveal that the method can address mixed-pixel problems. Targets smaller than a single ground pixel can be identified by applying the method.
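
    Under the linear mixing model, each pixel spectrum is a convex combination of endmember spectra, so abundance estimation can be sketched as a constrained least-squares fit. A minimal Python illustration (the endmember matrix below is hypothetical, and a simple clip-and-renormalize stands in for a true simplex-constrained solver):

```python
import numpy as np

def unmix(pixel, endmembers):
    """Estimate abundance fractions for one pixel under the linear
    mixing model pixel ~ endmembers @ a, with a >= 0 and sum(a) = 1.
    Solves unconstrained least squares, then clips to nonnegative
    values and renormalizes onto the abundance simplex."""
    a, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
    a = np.clip(a, 0.0, None)
    return a / a.sum()
```

    This is how sub-pixel targets become detectable: a target occupying only part of a ground pixel still contributes a nonzero abundance fraction to the mixture.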

  18. A Meta-Analysis of Published School Social Work Practice Studies: 1980-2007

    ERIC Educational Resources Information Center

    Franklin, Cynthia; Kim, Johnny S.; Tripodi, Stephen J.

    2009-01-01

    Objective: This systematic review examined the effectiveness of school social work practices using meta-analytic techniques. Method: Hierarchical linear modeling software was used to calculate overall effect size estimates as well as test for between-study variability. Results: A total of 21 studies were included in the final analysis.…

  19. A Practical Method for UHF RFID Interrogation Area Measurement Using Battery Assisted Passive Tag

    NASA Astrophysics Data System (ADS)

    Mitsugi, Jin; Tokumasu, Osamu

    For the success of a large deployment of UHF RFID, easy-to-use and low-cost engineering tools to facilitate performance evaluation are demanded, particularly for installations and troubleshooting. The measurement of the interrogation area is one of the most typical industrial demands for establishing the stable readability of UHF RFID. Exhaustively repositioning the tag between read operations, and using expensive measurement equipment or special interrogators, are common practices for measuring the interrogation area. In this paper, a practical method to measure the interrogation area of a UHF RFID system by using a battery assisted passive tag (BAP) is presented. After introducing the fundamental design and performance of the BAP that we have developed, we introduce the measurement method. In the method, the target tag in the target installation is continuously traversed, either manually or automatically, while it is subjected to repetitive reads by a commercial interrogator. During the target tag traversal, the interrogator's commands are continuously monitored by a BAP. With an extensive analysis of interrogator commands, the BAP can differentiate between its own read timings and those of the target tag. The read timings of the target tag collected by the BAP are recorded synchronously with the target tag position, yielding a map of the interrogation area. The present method does not entail a measurement burden. It is also independent of the choice of interrogator and tag. The method is demonstrated in a practical UHF RFID installation to show that it can measure the interrogation area at 40 mm resolution just by traversing the target tag at a slow walking speed, 300 mm/s.

  20. Root Cause Analysis: Methods and Mindsets.

    ERIC Educational Resources Information Center

    Kluch, Jacob H.

    This instructional unit is intended for use in training operations personnel and others involved in scram analysis at nuclear power plants in the techniques of root cause analysis. Four lessons are included. The first lesson provides an overview of the goals and benefits of the root cause analysis method. Root cause analysis techniques are covered…

  1. Cost Analysis: Methods and Realities.

    ERIC Educational Resources Information Center

    Cummings, Martin M.

    1989-01-01

    Argues that librarians need to be concerned with cost analysis of library functions and services because, in the allocation of resources, decision makers will favor library managers who demonstrate understanding of the relationships between costs and productive outputs. Factors that should be included in a reliable scheme for cost accounting are…

  2. Invisible nursing research: thoughts about mixed methods research and nursing practice.

    PubMed

    Fawcett, Jacqueline

    2015-04-01

    In this essay, the author addresses the close connection between mixed methods research and nursing practice. If the assertion that research and practice are parallel processes is accepted, then nursing practice may be considered "invisible mixed methods research," in that almost every encounter between a nurse and a patient involves collection and integration of qualitative (word) and quantitative (number) information that actually is single-case mixed methods research.

  3. Governance of professional nursing practice in a hospital setting: a mixed methods study

    PubMed Central

    dos Santos, José Luís Guedes; Erdmann, Alacoque Lorenzini

    2015-01-01

    Objective: to elaborate an interpretative model for the governance of professional nursing practice in a hospital setting. Method: a mixed methods study with concurrent triangulation strategy, using data from a cross-sectional study with 106 nurses and a Grounded Theory study with 63 participants. The quantitative data were collected through the Brazilian Nursing Work Index - Revised and underwent descriptive statistical analysis. Qualitative data were obtained from interviews and analyzed through initial, selective and focused coding. Results: based on the results obtained with the Brazilian Nursing Work Index - Revised, it is possible to state that nurses perceived that they had autonomy, control over the environment, good relationships with physicians and organizational support for nursing governance. The governance of the professional nursing practice is based on the management of nursing care and services carried out by the nurses. To perform these tasks, nurses aim to get around the constraints of the organizational support and develop management knowledge and skills. Conclusion: it is important to reorganize the structures and processes of nursing governance, especially the support provided by the organization for the management practices of nurses. PMID:26625992

  4. Hybrid methods for cybersecurity analysis :

    SciTech Connect

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  5. Methods of imparting information to patients in dental practice.

    PubMed

    Hein, W

    1984-03-01

    Changes are taking place in dentistry now that we know much more about the causes of caries, periodontal disease and most jaw malformations and that, through education and motivation, our patients can substantially protect themselves from these disorders. In our specialty in particular, it is true that 'prevention is better than cure'! This approach will be successful only if our patients are monitored on a regular recall basis. Attention is drawn to the existence of specific target groups. Successful results can only be achieved if individual efforts are backed by group and population prophylactic measures on the part of national or local authorities using the expertise of the dental profession. A preventively orientated practice must offer much more patient education than one which concentrates on the provision of curative services. Details of the approach to be followed should be thoroughly planned and the members of the dental team to be responsible for the tasks concerned should be identified. Appropriate equipment and working facilities are essential for the effective conduct of preventive measures. It must be emphasized that these measures in dental practice involve high costs. In attending to our patients, we should not forget that not only practical intervention but also the provision of advice is a vitally important health service.

  6. Microparticle analysis system and method

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    2007-01-01

    A device for analyzing microparticles is provided which includes a chamber with an inlet and an outlet for respectively introducing and dispensing a flowing fluid comprising microparticles, a light source for providing light through the chamber and a photometer for measuring the intensity of light transmitted through individual microparticles. The device further includes an imaging system for acquiring images of the fluid. In some cases, the device may be configured to identify and determine a quantity of the microparticles within the fluid. Consequently, a method for identifying and tracking microparticles in motion is contemplated herein. The method involves flowing a fluid comprising microparticles in laminar motion through a chamber, transmitting light through the fluid, measuring the intensities of the light transmitted through the microparticles, imaging the fluid a plurality of times and comparing at least some of the intensities of light between different images of the fluid.
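    As a toy illustration of the comparison step described above (matching measured light intensities between successive images of the flowing fluid), here is a minimal sketch; the matching rule, tolerance, and intensity values are invented for illustration and are not from the patent.

```python
# Hypothetical sketch: pair particles between two frames by nearest
# transmitted-light intensity, assuming a particle's intensity is
# roughly conserved between images in laminar flow.

def match_particles(frame_a, frame_b, tol=0.05):
    """Pair each intensity in frame_a with the closest unmatched
    intensity in frame_b; return a list of (i, j) index pairs."""
    pairs = []
    used = set()
    for i, ia in enumerate(frame_a):
        best_j, best_d = None, tol
        for j, ib in enumerate(frame_b):
            if j in used:
                continue
            d = abs(ia - ib)
            if d <= best_d:
                best_j, best_d = j, d
        if best_j is not None:
            pairs.append((i, best_j))
            used.add(best_j)
    return pairs

print(match_particles([0.80, 0.55, 0.91], [0.92, 0.54, 0.79]))
```

    Tracking a particle across many frames then reduces to chaining these per-frame pairings.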

  7. A mixed-methods approach to investigating the adoption of evidence-based pain practices in nursing homes.

    PubMed

    Ersek, Mary; Jablonski, Anita

    2014-07-01

    This mixed methods study examined perceived facilitators and obstacles to adopting evidence-based pain management protocols vis-a-vis documented practice changes that were measured using a chart audit tool. This analysis used data from a subgroup of four nursing homes that participated in a clinical trial. Focus group interviews with staff yielded qualitative data about perceived factors that affected their willingness and ability to use the protocols. Chart audits determined whether pain assessment and management practices changed over time in light of these reported facilitators and barriers. Reported facilitators included administrative support, staff consistency, and policy and procedure changes. Barriers were staff attitudes, regulatory issues, and provider mistrust of nurses' judgment. Overall, staff reported improvements in pain practices. These reports were corroborated by modest but significant increases in adherence to recommended practices. Change in clinical practice is complex and requires attention to both structural and process aspects of care. PMID:24640959

  9. Current status of methods for shielding analysis

    SciTech Connect

    Engle, W.W.

    1980-01-01

    Current methods used in shielding analysis and recent improvements in those methods are discussed. The status of methods development is discussed based on needs cited at the 1977 International Conference on Reactor Shielding. Additional areas where methods development is needed are discussed.

  10. Comparison of Manual Versus Automated Data Collection Method for an Evidence-Based Nursing Practice Study

    PubMed Central

    Byrne, M.D.; Jordan, T.R.; Welle, T.

    2013-01-01

    Objective The objective of this study was to investigate and improve the use of automated data collection procedures for nursing research and quality assurance. Methods A descriptive, correlational study analyzed 44 orthopedic surgical patients who were part of an evidence-based practice (EBP) project examining post-operative oxygen therapy at a Midwestern hospital. The automation work attempted to replicate a manually-collected data set from the EBP project. Results Automation was successful in replicating data collection for study data elements that were available in the clinical data repository. The automation procedures identified 32 “false negative” patients who met the inclusion criteria described in the EBP project but were not selected during the manual data collection. Automating data collection for certain data elements, such as oxygen saturation, proved challenging because of workflow and practice variations and the reliance on disparate sources for data abstraction. Automation also revealed instances of human error including computational and transcription errors as well as incomplete selection of eligible patients. Conclusion Automated data collection for analysis of nursing-specific phenomena is potentially superior to manual data collection methods. Creation of automated reports and analysis may require initial up-front investment with collaboration between clinicians, researchers and information technology specialists who can manage the ambiguities and challenges of research and quality assurance work in healthcare. PMID:23650488
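    The "false negative" finding above amounts to a set comparison between the manually abstracted cohort and the automated query results. A minimal sketch, with invented patient IDs and no real inclusion criteria:

```python
# Minimal sketch: compare manually collected patient IDs against an
# automated query of the clinical data repository. IDs are invented.

manual_ids = {"P01", "P02", "P03"}
automated_ids = {"P01", "P02", "P03", "P04", "P05"}

# Eligible patients the manual abstraction missed ("false negatives")
missed = automated_ids - manual_ids

# Patients selected manually but absent from the query results
# (possible transcription errors or criteria drift)
extra = manual_ids - automated_ids

print(sorted(missed), sorted(extra))
```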

  11. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2012-05-29

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  12. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  13. Practical thoughts on cost-benefit analysis and health services.

    PubMed

    Burchell, A; Weeden, R

    1982-08-01

    Cost-benefit analysis is fast becoming--if it is not already--an essential tool in decision making. It is, however, a complex subject, and one in which few doctors have been trained. This paper offers practical thoughts on the art of cost-benefit analysis, and is written for clinicians and other medical specialists who, though inexpert in the techniques of accountancy, nevertheless wish to carry out their own simple analyses in a manner that will enable them, and others, to take effective decisions.

  14. Schwarz Preconditioners for Krylov Methods: Theory and Practice

    SciTech Connect

    Szyld, Daniel B.

    2013-05-10

    Several numerical methods were produced and analyzed. The main thrust of the work relates to inexact Krylov subspace methods for the solution of linear systems of equations arising from the discretization of partial differential equations. These are iterative methods, i.e., an approximation is obtained and improved at each step. Usually, a matrix-vector product is needed at each iteration. In the inexact methods, this product (or the application of a preconditioner) can be done inexactly. Schwarz methods, based on domain decompositions, are excellent preconditioners for these systems. We contributed towards their understanding from an algebraic point of view, developed new ones, and studied their performance in the inexact setting. We also worked on combinatorial problems to help define the algebraic partition of the domains, with the needed overlap, as well as PDE-constrained optimization using the above-mentioned inexact Krylov subspace methods.
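    The two ingredients named in the abstract, a Schwarz preconditioner and a Krylov method, can be sketched on a small model problem. The following is a one-level additive Schwarz preconditioner (overlapping subdomain solves) inside preconditioned conjugate gradients on a 1-D Laplacian; the subdomain layout and sizes are illustrative choices, not the paper's setup, and the products here are exact rather than inexact.

```python
import numpy as np

# Model problem: 1-D Laplacian (tridiagonal, symmetric positive definite)
n = 20
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)

# Two overlapping subdomains (overlap of 2 unknowns)
domains = [np.arange(0, 12), np.arange(10, 20)]

def additive_schwarz(r):
    """Apply M^{-1} r: solve A restricted to each subdomain, sum corrections."""
    z = np.zeros_like(r)
    for idx in domains:
        z[idx] += np.linalg.solve(A[np.ix_(idx, idx)], r[idx])
    return z

def pcg(A, b, precond, tol=1e-10, maxit=100):
    """Preconditioned conjugate gradient iteration."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = precond(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = precond(r)
        rz, rz_old = r @ z, rz
        p = z + (rz / rz_old) * p
    return x

x = pcg(A, b, additive_schwarz)
print(np.linalg.norm(b - A @ x))  # residual is essentially zero
```

    The additive Schwarz operator is symmetric positive definite here because the two subdomains together cover every unknown, which is what makes it admissible as a CG preconditioner.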

  15. A Practical Guide to Interpretation of Large Collections of Incident Narratives Using the QUORUM Method

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W.

    1997-01-01

    Analysis of incident reports plays an important role in aviation safety. Typically, a narrative description, written by a participant, is a central part of an incident report. Because there are so many reports, and the narratives contain so much detail, it can be difficult to efficiently and effectively recognize patterns among them. Recognizing and addressing recurring problems, however, is vital to continuing safety in commercial aviation operations. A practical way to interpret large collections of incident narratives is to apply the QUORUM method of text analysis, modeling, and relevance ranking. In this paper, QUORUM text analysis and modeling are surveyed, and QUORUM relevance ranking is described in detail with many examples. The examples are based on several large collections of reports from the Aviation Safety Reporting System (ASRS) database, and a collection of news stories describing the disaster of TWA Flight 800, the Boeing 747 which exploded in mid-air and crashed near Long Island, New York, on July 17, 1996. Reader familiarity with this disaster should make the relevance-ranking examples more understandable. The ASRS examples illustrate the practical application of QUORUM relevance ranking.
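    QUORUM's actual model ranks narratives by contextual word-pair overlap; as a greatly simplified stand-in, the sketch below ranks reports against a query by shared vocabulary (Jaccard similarity). The report texts are invented, not ASRS records.

```python
# NOT the actual QUORUM algorithm: a toy relevance ranking that
# scores each narrative by vocabulary overlap with a query text.

def score(query, narrative):
    q = set(query.lower().split())
    n = set(narrative.lower().split())
    return len(q & n) / len(q | n)  # Jaccard similarity

reports = [
    "engine fire during climb forced an air turn back",
    "altitude deviation after autopilot disconnect",
    "fire warning light during engine start",
]
query = "engine fire warning"
ranked = sorted(reports, key=lambda r: score(query, r), reverse=True)
print(ranked[0])
```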

  16. Propel: Tools and Methods for Practical Source Code Model Checking

    NASA Technical Reports Server (NTRS)

    Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem

    2003-01-01

    The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset is called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side-effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.

  17. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures.

    PubMed

    Rock, Adam J; Coventry, William L; Morgan, Methuen I; Loi, Natasha M

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology. PMID:27014147

  20. Grounded Theory in Practice: Is It Inherently a Mixed Method?

    ERIC Educational Resources Information Center

    Johnson, R. B.; McGowan, M. W.; Turner, L. A.

    2010-01-01

    We address 2 key points of contention in this article. First, we engage the debate concerning whether particular methods are necessarily linked to particular research paradigms. Second, we briefly describe a mixed methods version of grounded theory (MM-GT). Grounded theory can be tailored to work well in any of the 3 major forms of mixed methods…

  1. Vibration analysis methods for piping

    NASA Astrophysics Data System (ADS)

    Gibert, R. J.

    1981-09-01

    Attention is given to vibrations induced in piping systems by flow singularities. The types of pressure fluctuations induced by flow singularities are examined, including the intense wideband fluctuations immediately downstream of the singularity and the acoustic fluctuations encountered in the remainder of the circuit, and a theory of noise generation by unsteady internal flow is developed. The response of the piping systems to the pressure fluctuations thus generated is considered, and the calculation of the modal characteristics of piping containing a dense fluid in order to obtain the system transfer function is discussed. The TEDEL program, which calculates the vibratory response of a structure composed of straight and curved pipes with variable mechanical characteristics forming a three-dimensional network by a finite element method, is then presented, and calculations of fluid-structure coupling in tubular networks are illustrated.

  2. Estimating free-living human energy expenditure: Practical aspects of the doubly labeled water method and its applications

    PubMed Central

    Kazuko, Ishikawa-Takata; Kim, Eunkyung; Kim, Jeonghyun; Yoon, Jinsook

    2014-01-01

    The accuracy and noninvasive nature of the doubly labeled water (DLW) method makes it ideal for the study of human energy metabolism in free-living conditions. However, the DLW method is not always practical in many developing and Asian countries because of the high costs of isotopes and equipment for isotope analysis as well as the expertise required for analysis. This review provides information about the theoretical background and practical aspects of the DLW method, including optimal dose, basic protocols of two- and multiple-point approaches, experimental procedures, and isotopic analysis. We also introduce applications of DLW data, such as determining the equations of estimated energy requirement and validation studies of energy intake. PMID:24944767
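    The two-point approach mentioned above reduces to estimating each isotope's elimination constant from the log ratio of two enrichment measurements. A minimal sketch with invented enrichment values follows; the dilution-space coefficients needed to convert the rate difference into CO2 production (e.g. Schoeller's equations) are deliberately omitted.

```python
import math

# Two-point isotope elimination-rate sketch for the DLW method.
# The elimination constant k is the slope of log enrichment between
# the first and last samples. Enrichment values are invented.

def elimination_rate(e1, e2, t1, t2):
    """k (per day) from enrichments e1 at day t1 and e2 at day t2."""
    return math.log(e1 / e2) / (t2 - t1)

k_deuterium = elimination_rate(150.0, 60.0, 1, 14)  # body-water turnover
k_oxygen18  = elimination_rate(140.0, 40.0, 1, 14)  # water plus CO2

# CO2 production (and hence energy expenditure) is derived from the
# difference of the two rates, scaled by the isotope dilution spaces.
print(round(k_oxygen18 - k_deuterium, 4))
```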

  3. Methods for the practical determination of the mechanical strength of tablets--from empiricism to science.

    PubMed

    Podczeck, Fridrun

    2012-10-15

    This review aims to awaken an interest in the determination of the tensile strength of tablets of various shapes using a variety of direct and indirect test methods. The United States Pharmacopoeia monograph 1217 (USP35/NF30, 2011) has provided a very good approach to the experimental determination of and standards for the mechanical strength of tablets. Building on this monograph, it is hoped that the detailed account of the various methods provided in this review will encourage industrial and academic scientists involved in the development and manufacture of tablet formulations to take a step forward and determine the tensile strength of tablets, even if these are not simply flat disc-shaped or rectangular. To date there are a considerable number of valid test configurations and stress equations available, catering for many of the various shapes of tablets on the market. The determination of the tensile strength of tablets should hence replace the sole determination of a breaking force, because tensile strength values are more comparable and suggestions for minimum and/or maximum values are available. The review also identifies the gaps that require urgent filling. There is also a need for further analysis using, for example, the Finite Element Method, to provide correct stress solutions for tablets of differing shapes, but this also requires practical experiments to find the best loading conditions, and theoretical stress solutions should be verified with practical experiments.
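    For the simplest case such a review covers, a flat disc-shaped tablet broken in diametral compression, the tensile strength follows from the standard Fell-Newton relation, sigma = 2F / (pi * D * t). The numbers below are illustrative, not from the review.

```python
import math

# Tensile strength of a flat, cylindrical tablet from a diametral
# compression (breaking force) test: sigma = 2F / (pi * D * t).

def tensile_strength(force_n, diameter_m, thickness_m):
    """Tensile strength in Pa from breaking force and tablet geometry."""
    return 2 * force_n / (math.pi * diameter_m * thickness_m)

# 100 N breaking force, 10 mm diameter, 4 mm thickness
sigma = tensile_strength(100.0, 0.010, 0.004)
print(round(sigma / 1e6, 3), "MPa")
```

    This is exactly the conversion the review advocates: reporting a geometry-normalized tensile strength rather than the raw breaking force, so that tablets of different sizes become comparable.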

  4. Probabilistic structural analysis by extremum methods

    NASA Technical Reports Server (NTRS)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.
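    The kinematic approach mentioned above can be illustrated with the standard textbook portal-frame example (not the paper's multiobjective model): each candidate collapse mechanism yields an upper-bound load factor, internal plastic work divided by external work, and the minimum over mechanisms gives the collapse load. Loads and dimensions below are invented.

```python
# Kinematic (upper-bound) sketch for a rigid plastic portal frame:
# plastic moment Mp, horizontal load H at beam level, vertical load V
# at midspan, column height h, beam span L. Work terms per unit hinge
# rotation follow the standard beam / sway / combined mechanisms.

Mp, H, V, h, L = 100.0, 20.0, 40.0, 4.0, 6.0

mechanisms = {
    # name: (internal plastic work, external work) per unit rotation
    "beam":     (4 * Mp, V * L / 2),
    "sway":     (4 * Mp, H * h),
    "combined": (6 * Mp, V * L / 2 + H * h),
}

factors = {name: wi / we for name, (wi, we) in mechanisms.items()}
collapse = min(factors.values())
print({k: round(v, 3) for k, v in factors.items()}, round(collapse, 3))
```

    Each mechanism is an extreme point of the kinematic (primal) linear program; the static approach bounds the same collapse factor from below, which is the primal-dual pairing the abstract refers to.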

  5. Learning Practice-Based Research Methods: Capturing the Experiences of MSW Students

    ERIC Educational Resources Information Center

    Natland, Sidsel; Weissinger, Erika; Graaf, Genevieve; Carnochan, Sarah

    2016-01-01

    The literature on teaching research methods to social work students identifies many challenges, such as dealing with the tensions related to producing research relevant to practice, access to data to teach practice-based research, and limited student interest in learning research methods. This is an exploratory study of the learning experiences of…

  6. Investigating the Efficacy of Practical Skill Teaching: A Pilot-Study Comparing Three Educational Methods

    ERIC Educational Resources Information Center

    Maloney, Stephen; Storr, Michael; Paynter, Sophie; Morgan, Prue; Ilic, Dragan

    2013-01-01

    Effective education of practical skills can alter clinician behaviour, positively influence patient outcomes, and reduce the risk of patient harm. This study compares the efficacy of two innovative practical skill teaching methods, against a traditional teaching method. Year three pre-clinical physiotherapy students consented to participate in a…

  7. Assessing Student Perception of Practice Evaluation Knowledge in Introductory Research Methods

    ERIC Educational Resources Information Center

    Baker, Lisa R.; Pollio, David E.; Hudson, Ashley

    2011-01-01

    The authors explored the use of the Practice Evaluation Knowledge Scale (PEKS) to assess student perception of acquisition and retention of practice evaluation knowledge from an undergraduate research methods class. The authors sampled 2 semesters of undergraduate social work students enrolled in an introductory research methods course.…

  8. Canonical Correlation Analysis: An Explanation with Comments on Correct Practice.

    ERIC Educational Resources Information Center

    Thompson, Bruce

    This paper briefly explains the logic underlying the basic calculations employed in canonical correlation analysis. A small hypothetical data set is employed to illustrate that canonical correlation analysis subsumes both univariate and multivariate parametric methods. Several real data sets are employed to illustrate other themes. Three common…

  9. Text analysis methods, text analysis apparatuses, and articles of manufacture

    DOEpatents

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.
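    A toy version of the "new topic" identification step: flag a document whose vocabulary overlaps poorly with every known topic. The topics, threshold, and test documents are invented; the patent does not specify this particular rule.

```python
# Hypothetical sketch: a document is flagged as a possible new topic
# when its best vocabulary overlap with any known topic falls below a
# threshold. All vocabularies are invented.

known_topics = {
    "energy": {"reactor", "fuel", "power", "grid"},
    "biology": {"cell", "protein", "gene", "enzyme"},
}

def is_new_topic(doc_words, topics, threshold=0.25):
    words = set(doc_words)
    best = max(len(words & vocab) / len(words) for vocab in topics.values())
    return best < threshold

print(is_new_topic(["quantum", "qubit", "entanglement", "gate"], known_topics))
```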

  10. Theory, Method and Practice of Neuroscientific Findings in Science Education

    ERIC Educational Resources Information Center

    Liu, Chia-Ju; Chiang, Wen-Wei

    2014-01-01

    This report provides an overview of neuroscience research that is applicable for science educators. It first offers a brief analysis of empirical studies in educational neuroscience literature, followed by six science concept learning constructs based on the whole brain theory: gaining an understanding of brain function; pattern recognition and…

  11. Honesty in critically reflective essays: an analysis of student practice.

    PubMed

    Maloney, Stephen; Tai, Joanna Hong-Meng; Lo, Kristin; Molloy, Elizabeth; Ilic, Dragan

    2013-10-01

    In health professional education, reflective practice is seen as a potential means for self-improvement from everyday clinical encounters. This study aims to examine the level of student honesty in critical reflection, and barriers and facilitators for students engaging in honest reflection. Third year physiotherapy students, completing summative reflective essays on clinical encounters using the modified Gibbs cycle, were invited to participate in an anonymous online survey. Student knowledge and beliefs about reflective practice, and disclosure of the truthfulness of their reflections, were assessed using a mixed method approach. A total of 34 students, from a maximum possible of 48 (71 %), participated in the study activities. A total of 68 % stated that they were at least 80 % truthful about their experiences. There was general student consensus that reflective practice was important for their growth as a clinician. Students questioned the belief that the reflection needed to be based on a factual experience. Reflective practice can be a valuable addition to the clinical education of health care professionals, although this value can be diminished through dishonest reflections if it is not carefully implemented. Student influences on honest reflection include; (1) the design of any assessment criteria, and (2) student knowledge and competency in applying critical reflection.

  12. Matrix methods for bare resonator eigenvalue analysis.

    PubMed

    Latham, W P; Dente, G C

    1980-05-15

    Bare resonator eigenvalues have traditionally been calculated using Fox and Li iterative techniques or the Prony method presented by Siegman and Miller. A theoretical framework for bare resonator eigenvalue analysis is presented. Several new methods are given and compared with the Prony method.
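    The Fox and Li iterative technique is, numerically, a power iteration: repeatedly applying the resonator round-trip operator to a trial field converges to the dominant transverse mode, whose eigenvalue encodes the per-pass loss and phase shift. The matrix below is contrived to have a known dominant eigenvalue of 2.0; it is a stand-in, not a real diffraction kernel.

```python
import numpy as np

# Power iteration on a stand-in round-trip operator K, built so that
# its dominant eigenvalue is exactly 2.0 for checking.

rng = np.random.default_rng(0)
n = 6
D = np.diag([2.0, 0.5, 0.4, 0.3, 0.2, 0.1])
P = rng.standard_normal((n, n))
K = P @ D @ np.linalg.inv(P)

u = np.ones(n)
for _ in range(100):
    v = K @ u
    u = v / np.linalg.norm(v)  # renormalize the trial field each pass

# Rayleigh quotient estimates the dominant eigenvalue
gamma = (u @ K @ u) / (u @ u)
print(round(gamma, 6))  # converges to the dominant eigenvalue, 2.0
```

    The Prony method the abstract mentions goes further, extracting several eigenvalues at once from the iteration history rather than only the dominant one.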

  13. [Pedagogical practices in nursing teaching: a study from the perspective of institutional analysis].

    PubMed

    Pereira, Wilza Rocha; Tavares, Cláudia Mara Melo

    2010-12-01

    The general objective of this study was to learn about the pedagogical practices that are already in use in nursing teaching in order to identify and analyze those that have brought changes and innovation. This field study used a qualitative and comparative approach, and the subjects were nursing professors and students. The data was collected through individual interviews and focal groups. Data analysis was based on the Institutional Analysis method. Several pedagogical practices were recognized, from the most traditional to those considered innovative, and it was noticed that changes are already present and are part of a set of elements caused by the obsolescence of values that are now considered to be insufficient or inappropriate by professors themselves. The study revealed that the activity of teaching and the qualification of the pedagogical practices are always desired by professors. PMID:21337793

  14. Choosing a framework for ethical analysis in advanced practice settings: the case for casuistry.

    PubMed

    Artnak, K E; Dimmitt, J H

    1996-02-01

    The need for advanced practice nurses to incorporate ethical analysis into case management is becoming more apparent -- particularly for the increasingly independent practice settings of psychiatric and mental health nursing. The nursing literature contains many articles dealing with the more abstract treatment of clinical ethics, but for the practitioner there is unfortunately little information available that uses ethical principles in a practical framework, which addresses the concrete reality of daily, difficult clinical decision making. This article applies a model of reasoned analysis to an actual case study using the concepts of casuistry or case-based reasoning. This method offers an alternative to the more popular paradigm of principlism. Complicated by the existence of violence and abuse, the case examines several ethical issues including patient privacy and legitimate breaches of patient confidentiality.

  15. Methods in Educational Research: From Theory to Practice

    ERIC Educational Resources Information Center

    Lodico, Marguerite G.; Spaulding Dean T.; Voegtle, Katherine H.

    2006-01-01

    Written for students, educators, and researchers, "Methods in Educational Research" offers a refreshing introduction to the principles of educational research. Designed for the real world of educational research, the book's approach focuses on the types of problems likely to be encountered in professional experiences. Reflecting the importance of…

  16. Practice and Progression in Second Language Research Methods

    ERIC Educational Resources Information Center

    Mackey, Alison

    2014-01-01

    Since its inception, the field of second language research has utilized methods from a number of areas, including general linguistics, psychology, education, sociology, anthropology and, recently, neuroscience and corpus linguistics. As the questions and objectives expand, researchers are increasingly pushing methodological boundaries to gain a…

  17. Focus Group Method And Methodology: Current Practice And Recent Debate

    ERIC Educational Resources Information Center

    Parker, Andrew; Tritter, Jonathan

    2006-01-01

    This paper considers the contemporary use of focus groups as a method of data collection within qualitative research settings. The authors draw upon their own experiences of using focus groups in educational and "community" user-group environments in order to provide an overview of recent issues and debates surrounding the deployment of focus…

  18. Educational Delivery Methods to Encourage Adoption of Sustainable Agricultural Practices.

    ERIC Educational Resources Information Center

    Gamon, Julia; And Others

    1994-01-01

    Comparison of 143 farmers who attended sustainable agriculture conferences (76% response) with 143 controls (57% response) found no significant differences between the 2 groups, suggesting a need to change delivery methods for extension programming. Chemical dealers were the top source of information for both groups. (SK)

  19. Practical method of diffusion-welding steel plate in air

    NASA Technical Reports Server (NTRS)

    Holko, K. H.; Moore, T. J.

    1971-01-01

    Method is ideal for critical service requirements where parent metal properties are equaled in notch toughness, stress rupture and other characteristics. Welding technique variations may be used on a variety of materials, such as carbon steels, alloy steels, stainless steels, ceramics, and reactive and refractory materials.

  20. The Ab Initio MIA Method: Theoretical Development and Practical Applications

    NASA Astrophysics Data System (ADS)

    Peeters, Anik

    The bottleneck in conventional ab initio Hartree-Fock calculations is the storage of the electron repulsion integrals, because their number increases with the fourth power of the number of basis functions. This problem can be solved by a combination of the multiplicative integral approximation (MIA) and the direct SCF method. The MIA approach was successfully applied in the geometry optimisation of some biologically interesting compounds like the neuroleptic haloperidol and two TIBO derivatives, inactivators of HIV-1. In this thesis the power of the MIA method is shown by its application to the calculation of the forces on the nuclei. In addition, the MIA method enabled the development of a new model for performing crystal field studies: the supermolecule model. The results for this model are in better agreement with experimental data than the results for the point charge model. This is illustrated by the study of some small molecules in the solid state: 2,3-diketopiperazine, formamide oxime and two polymorphic forms of glycine, alpha-glycine and beta-glycine.

  1. Theories, methods, and practice on the National Atlases of China

    NASA Astrophysics Data System (ADS)

    Qi, Qingwen

    2007-06-01

    The history of editing national atlases worldwide is summarized first, followed by China's achievements in editing the 1st and 2nd editions of The National Atlases of China (NAC), which reflected, on multiple levels, China's development in science and technology, society and economy, resources and environment, etc., from the 1950s to the 1980s. From the previous editions of the NAC, systematic theories and methods are summarized, including comprehensive and statistical mapping theory, design principles for electronic atlases, and the new methods and technologies involved in the NAC. The New Century Edition of the NAC is then designed, including its orientation, technological system, volume arrangement, and the key scientific and technological problems to be resolved.

  2. Parenting Practices and Child Misbehavior: A Mixed Method Study of Italian Mothers and Children

    PubMed Central

    Bombi, Anna Silvia; Di Norcia, Anna; Di Giunta, Laura; Pastorelli, Concetta; Lansford, Jennifer E.

    2015-01-01

    Objective: The present study uses a mixed qualitative and quantitative method to examine three main research questions: What are the practices that mothers report they use when trying to correct their children's misbehaviors? Are there common patterns of these practices? Are the patterns that emerge related to children's well-being? Design: Italian mother-child dyads (N=103) participated in the study (when children were 8 years of age). At Time 1 (T1), mothers answered open-ended questions about discipline; in addition, measures of maternal physical discipline and rejection and child aggression were assessed in mothers and children at T1, one year later (T2), and two years later (T3). Results: Mothers' answers to open-ended questions about what they would do in three disciplinary situations were classified in six categories: physical or psychological punishment, control, mix of force and reasoning, reasoning, listening, and permissiveness. Cluster analysis yielded 3 clusters: Group 1, Induction (predominant use of reasoning and listening; 74%); Group 2, Punishment (punitive practices and no reasoning; 16%); Group 3, Mixed practices (combination of reasoning and punishment, as well as high control and no listening; 10%). Multiple-group latent growth curves of maternal physical discipline, maternal rejection, and child aggression were implemented to evaluate possible differences in developmental trends from T1 to T3 as a function of cluster. Conclusions: Qualitative data deepen understanding of parenting because they shed light on what parents think about themselves; their self-descriptions, in turn, help to identify ways of parenting that may have long-lasting consequences for children's adjustment. PMID:26877716

  3. Accident involvement among learner drivers--an analysis of the consequences of supervised practice.

    PubMed

    Gregersen, Nils Petter; Nyberg, Anders; Berg, Hans-Yngve

    2003-09-01

    It is a well-known fact that experience is important for safe driving. Previously, this presented a problem, since experience was mostly gained during the most dangerous period of driving: the first years with a licence. In many countries, this "experience paradox" has been addressed by providing increased opportunities to gain experience through supervised practice. One question, however, which still needs to be answered is what has been lost and what has been gained through supervised practice. Does this method lead to fewer accidents after licensing, and/or has the number of accidents during driving practice increased? There were three aims in the study. The first was to calculate the size of the accident problem in terms of the number of accidents, health risk and accident risk during practising. The second aim was to evaluate the solution to the "experience paradox" that supervised practice offers by calculating the costs in terms of accidents during driving practice and the benefits in terms of reduced accident involvement after obtaining a licence. The third aim was to analyse the conflict types that occur during driving practice. National register data on licence holders, police-reported injury accidents and self-reported exposure were used. The results show that during the period 1994-2000, 444 driving-practice injury accidents were registered, compared to 13,657 accidents during the first 2 years with a licence. The health risk during the period after licensing was 33 times higher, and the accident risk 10 times higher, than the corresponding risk during practice. The cost-benefit analysis showed that the benefits in terms of accident reduction after licensing were 30 times higher than the costs in terms of driving-practice accidents. It is recommended that measures to reduce such accidents should focus on better education of the lay instructor, but not on introducing measures to reduce the amount of lay-instructed practice. PMID:12850073
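    The risk comparisons in the abstract are simple exposure-normalized rates. A minimal sketch of that computation follows; the accident counts are taken from the abstract, but the kilometre exposure totals here are invented for illustration and are not the study's figures:

```python
def risk_per_million_km(accidents, exposure_million_km):
    """Accident risk normalized by driving exposure."""
    return accidents / exposure_million_km

# Accident counts from the abstract; exposure totals are hypothetical
practice_risk = risk_per_million_km(444, 2000.0)     # supervised practice
licensed_risk = risk_per_million_km(13657, 6000.0)   # first 2 years licensed

# With these assumed exposures, the licensed-driver risk comes out
# roughly an order of magnitude higher, as the study reports
print(round(licensed_risk / practice_risk, 1))
```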

  4. On Practical Results of the Differential Power Analysis

    NASA Astrophysics Data System (ADS)

    Breier, Jakub; Kleja, Marcel

    2012-03-01

    This paper describes practical differential power analysis attacks. Successful and unsuccessful attack attempts are presented, along with a description of the attack methodology. It provides relevant information about oscilloscope settings, optimization possibilities and fundamental attack principles, which are important when realizing this type of attack. The attack was conducted on the PIC18F2420 microcontroller running the AES cryptographic algorithm in ECB mode with a 128-bit key. We used two implementations of the algorithm: one in the C programming language and one in assembly.
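    The core of a single-bit DPA attack of the kind described is a difference-of-means test over partitioned power traces. Below is a minimal sketch on simulated traces; the stand-in S-box, the leakage model, and all numbers are illustrative assumptions, not the paper's actual setup (a real attack targets an AES S-box output bit on measured traces):

```python
import random

SBOX = list(range(256))           # stand-in nonlinear S-box (a permutation)
random.Random(0).shuffle(SBOX)    # real DPA would use the AES S-box

def dpa_difference_of_means(traces, plaintexts, key_guess):
    """Partition traces by a predicted intermediate bit and return
    the absolute difference of the two mean traces at each sample."""
    set0, set1 = [], []
    for trace, pt in zip(traces, plaintexts):
        bit = SBOX[pt ^ key_guess] & 1          # predicted leakage bit
        (set1 if bit else set0).append(trace)
    mean = lambda group, i: sum(t[i] for t in group) / len(group)
    n = len(traces[0])
    return [abs(mean(set1, i) - mean(set0, i)) for i in range(n)]

# Simulate 2000 noisy 8-sample traces leaking the target bit at sample 3
random.seed(1)
true_key = 0x3C
plaintexts = [random.randrange(256) for _ in range(2000)]
traces = []
for pt in plaintexts:
    t = [random.gauss(0.0, 1.0) for _ in range(8)]
    t[3] += 4.0 * (SBOX[pt ^ true_key] & 1)     # data-dependent leakage
    traces.append(t)

peak_right = max(dpa_difference_of_means(traces, plaintexts, true_key))
peak_wrong = max(dpa_difference_of_means(traces, plaintexts, 0x00))
print(peak_right > peak_wrong)   # the correct guess shows a clear DPA peak
```

    In practice the attacker sweeps all 256 key-byte guesses and keeps the guess with the largest peak.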

  5. Analysis of flight equipment purchasing practices of representative air carriers

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The process through which representative air carriers decide whether or not to purchase flight equipment was investigated, as well as their practices and policies in retiring surplus aircraft. An analysis of the flight equipment investment decision process in ten airlines shows that for the airline industry as a whole, the flight equipment investment decision is in a state of transition, from a wholly informal process in its earliest years to a much more organized and structured process in the future. Individual air carriers are in different stages with respect to the formality and sophistication associated with the flight equipment investment decision.

  6. Aural Image in Practice: A Multicase Analysis of Instrumental Practice in Middle School Learners

    ERIC Educational Resources Information Center

    Oare, Steve

    2016-01-01

    This multiple case study examined six adolescent band students engaged in self-directed practice. The students' practice sessions were videotaped. Students provided verbal reports during their practice and again retrospectively while reviewing their video immediately after practice. Students were asked to discuss their choice of practice…

  7. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  8. Triangle area water supply monitoring project, October 1988 through September 2001, North Carolina -- description of the water-quality network, sampling and analysis methods, and quality-assurance practices

    USGS Publications Warehouse

    Oblinger, Carolyn J.

    2004-01-01

    The Triangle Area Water Supply Monitoring Project was initiated in October 1988 to provide long-term water-quality data for six area water-supply reservoirs and their tributaries. In addition, the project provides data that can be used to determine the effectiveness of large-scale changes in water-resource management practices, document differences in water quality among water-supply types (large multiuse reservoir, small reservoir, run-of-river), and provide tributary-loading and in-lake data for water-quality modeling of Falls and Jordan Lakes. By September 2001, the project had progressed in four phases and included as many as 34 sites (in 1991). Most sites were sampled and analyzed by the U.S. Geological Survey. Some sites were already a part of the North Carolina Division of Water Quality statewide ambient water-quality monitoring network and were sampled by the Division of Water Quality. The network has provided data on streamflow, physical properties, and concentrations of nutrients, major ions, metals, trace elements, chlorophyll, total organic carbon, suspended sediment, and selected synthetic organic compounds. Project quality-assurance activities include written procedures for sample collection, record management and archive, collection of field quality-control samples (blank samples and replicate samples), and monitoring the quality of field supplies. In addition to project quality-assurance activities, the quality of laboratory analyses was assessed through laboratory quality-assurance practices and an independent laboratory quality-control assessment provided by the U.S. Geological Survey Branch of Quality Systems through the Blind Inorganic Sample Project and the Organic Blind Sample Project.

  9. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    DOEpatents

    Sanfilippo, Antonio P.; Cowell, Andrew J.; Gregory, Michelle L.; Baddeley, Robert L.; Paulson, Patrick R.; Tratz, Stephen C.; Hohimer, Ryan E.

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
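    The steps named in the abstract (provide a hypothesis, provide supporting/refuting indicators, associate and weight evidence, report accuracy) can be sketched loosely as follows. The class layout, field names, and the scoring rule are hypothetical illustrations, not the patent's specification:

```python
from dataclasses import dataclass, field

@dataclass
class Evidence:
    description: str
    supports: bool      # True = supports the hypothesis, False = refutes
    weight: float       # strength of the association, 0..1

@dataclass
class Hypothesis:
    statement: str
    evidence: list = field(default_factory=list)

    def associate(self, description, supports, weight):
        self.evidence.append(Evidence(description, supports, weight))

    def accuracy_estimate(self):
        """Weighted support minus weighted refutation, rescaled to 0..1."""
        if not self.evidence:
            return 0.5  # no evidence: maximally uncertain
        total = sum(e.weight for e in self.evidence)
        signed = sum(e.weight if e.supports else -e.weight
                     for e in self.evidence)
        return 0.5 + 0.5 * signed / total

h = Hypothesis("The outage was caused by the config change")
h.associate("deploy log precedes outage", supports=True, weight=0.8)
h.associate("rollback did not fix it", supports=False, weight=0.6)
print(round(h.accuracy_estimate(), 3))
```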

  10. Practical Methods for Locating Abandoned Wells in Populated Areas

    SciTech Connect

    Veloski, G.A.; Hammack, R.W.; Lynn, R.J.

    2007-09-01

    An estimated 12 million wells have been drilled during the 150 years of oil and gas production in the United States. Many old oil and gas fields are now populated areas where the presence of improperly plugged wells may constitute a hazard to residents. Natural gas emissions from wells have forced people from their houses and businesses and have caused explosions that injured or killed people and destroyed property. To mitigate this hazard, wells must be located and properly plugged, a task made more difficult by the presence of houses, businesses, and associated utilities. This paper describes well finding methods conducted by the National Energy Technology Laboratory (NETL) that were effective at two small towns in Wyoming and in a suburb of Pittsburgh, Pennsylvania.

  11. Practical optimization of Steiner trees via the cavity method

    NASA Astrophysics Data System (ADS)

    Braunstein, Alfredo; Muntoni, Anna

    2016-07-01

    The optimization version of the cavity method for single instances, called Max-Sum, has been applied in the past to the minimum Steiner tree problem on graphs and variants. Max-Sum has been shown experimentally to give asymptotically optimal results on certain types of weighted random graphs, and to give good solutions in short computation times for some types of real networks. However, the hypotheses behind the formulation and the cavity method itself limit substantially the class of instances on which the approach gives good results (or even converges). Moreover, in the standard model formulation, the diameter of the tree solution is limited by a predefined bound, that affects both computation time and convergence properties. In this work we describe two main enhancements to the Max-Sum equations to be able to cope with optimization of real-world instances. First, we develop an alternative ‘flat’ model formulation that allows the relevant configuration space to be reduced substantially, making the approach feasible on instances with large solution diameter, in particular when the number of terminal nodes is small. Second, we propose an integration between Max-Sum and three greedy heuristics. This integration allows Max-Sum to be transformed into a highly competitive self-contained algorithm, in which a feasible solution is given at each step of the iterative procedure. Part of this development participated in the 2014 DIMACS Challenge on Steiner problems, and we report the results here. The performance on the challenge of the proposed approach was highly satisfactory: it maintained a small gap to the best bound in most cases, and obtained the best results on several instances in two different categories. We also present several improvements with respect to the version of the algorithm that participated in the competition, including new best solutions for some of the instances of the challenge.
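    One of the simplest greedy heuristics for the minimum Steiner tree problem (not necessarily one of the three used in the paper) is the Takahashi-Matsuyama shortest-path heuristic: grow the tree from one terminal, repeatedly attaching the closest remaining terminal along a shortest path. A self-contained sketch:

```python
import heapq

def dijkstra(adj, sources):
    """Shortest distances and predecessors from a set of source nodes."""
    dist = {v: float("inf") for v in adj}
    prev = {}
    pq = []
    for s in sources:
        dist[s] = 0.0
        heapq.heappush(pq, (0.0, s))
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v], prev[v] = d + w, u
                heapq.heappush(pq, (d + w, v))
    return dist, prev

def steiner_tree_greedy(adj, terminals):
    """Takahashi-Matsuyama heuristic: repeatedly attach the closest
    remaining terminal to the growing tree via a shortest path."""
    tree, edges = {terminals[0]}, set()
    remaining = set(terminals[1:])
    while remaining:
        dist, prev = dijkstra(adj, tree)
        t = min(remaining, key=lambda v: dist[v])
        remaining.discard(t)
        while t not in tree:          # walk the path back into the tree
            tree.add(t)
            edges.add(frozenset((t, prev[t])))
            t = prev[t]
    return edges

# Star graph: terminals a, d, e; b is the natural Steiner point
adj = {
    "a": [("b", 1.0)],
    "b": [("a", 1.0), ("c", 1.0), ("d", 1.0), ("e", 1.0)],
    "c": [("b", 1.0)],
    "d": [("b", 1.0)],
    "e": [("b", 1.0)],
}
tree_edges = steiner_tree_greedy(adj, ["a", "d", "e"])
print(sorted(sorted(e) for e in tree_edges))
```

    Unlike Max-Sum, this heuristic carries no optimality guarantee on general graphs, but it always returns a feasible tree, which is why the paper integrates greedy steps into its iterative procedure.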

  12. Methods for diagnosis of bile acid malabsorption in clinical practice.

    PubMed

    Vijayvargiya, Priya; Camilleri, Michael; Shin, Andrea; Saenger, Amy

    2013-10-01

    Altered concentrations of bile acid (BA) in the colon can cause diarrhea or constipation. More than 25% of patients with irritable bowel syndrome with diarrhea or chronic diarrhea in Western countries have BA malabsorption (BAM). As BAM is increasingly recognized, proper diagnostic methods are needed to help direct the most effective course of treatment for the chronic bowel dysfunction. We review the methodologies, advantages, and disadvantages of tools that directly measure BAM: the (14)C-glycocholate breath and stool test, the (75)selenium homotaurocholic acid test (SeHCAT), and measurements of 7 α-hydroxy-4-cholesten-3-one (C4) and fecal BAs. The (14)C-glycocholate test is laborious and no longer widely used. The (75)SeHCAT has been validated but is not available in the United States. Measurement of serum C4 is a simple and accurate method that can be used for most patients but requires further clinical validation. Assays to quantify fecal BA (total and individual levels) are technically cumbersome and not widely available. Regrettably, none of these tests are routinely available in the United States; assessment of the therapeutic effects of a BA binder is used as a surrogate for diagnosis of BAM. Recent data indicate the advantages to studying fecal excretion of individual BAs and their role in BAM; these could support the use of the fecal BA assay, compared with other tests. Measurement of fecal BA levels could become a routine addition to the measurement of fecal fat in patients with unexplained diarrhea. Availability ultimately determines whether the C4, SeHCAT, or fecal BA test is used; more widespread availability of such tests would enhance clinical management of these patients.

  13. 13C-based metabolic flux analysis: fundamentals and practice.

    PubMed

    Yang, Tae Hoon

    2013-01-01

    Isotope-based metabolic flux analysis is one of the emerging technologies applied to system-level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, (13)C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement (13)C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is important, along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing (13)C-labeling experiments and properly acquiring the key data sets essential for in vivo flux analysis. The modeling fundamentals of (13)C-labeling systems and analytical data processing are therefore the main topics of this chapter. In addition, the relevant numerical optimization techniques are addressed to help implement the entire computational procedure for (13)C-based metabolic flux analysis in vivo.
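    A routine step in the analytical data processing this kind of work involves is converting a measured mass isotopomer distribution (MID) into a fractional (13)C enrichment. A minimal sketch; the fragment and the numbers below are made up for illustration:

```python
def fractional_enrichment(mid, n_carbons):
    """Mean fractional 13C enrichment of an n-carbon fragment from its
    mass isotopomer distribution mid = [M+0, M+1, ..., M+n]."""
    total = sum(mid)
    if abs(total - 1.0) > 1e-6:
        mid = [m / total for m in mid]   # normalize if needed
    # Each M+i species carries i labeled carbons out of n
    return sum(i * m for i, m in enumerate(mid)) / n_carbons

# Hypothetical 3-carbon fragment with MID [M+0, M+1, M+2, M+3]
mid = [0.5, 0.3, 0.15, 0.05]
print(round(fractional_enrichment(mid, 3), 4))
```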

  14. Laboratory theory and methods for sediment analysis

    USGS Publications Warehouse

    Guy, Harold P.

    1969-01-01

    The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for the analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube, depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help ensure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.
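    A known trade-off between the two concentration methods is that evaporation dries the whole sample, so dissolved solids remain in the residue and must be subtracted, whereas filtration avoids that correction. A minimal sketch of the evaporation-method computation; the function name and example figures are illustrative, not from the manual:

```python
def suspended_sediment_concentration_mg_per_l(
        gross_dry_mass_g, tare_mass_g, sample_volume_ml,
        dissolved_solids_mg_per_l=0.0):
    """Suspended-sediment concentration by the evaporation method.
    The dried residue includes dissolved solids, so their estimated
    mass is subtracted before dividing by the sample volume."""
    residue_mg = (gross_dry_mass_g - tare_mass_g) * 1000.0
    dissolved_mg = dissolved_solids_mg_per_l * sample_volume_ml / 1000.0
    return (residue_mg - dissolved_mg) / (sample_volume_ml / 1000.0)

# 350 mL sample, 0.105 g residue above tare, 200 mg/L dissolved solids
c = suspended_sediment_concentration_mg_per_l(25.105, 25.000, 350.0, 200.0)
print(round(c))
```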

  15. A report on the CCNA 2007 professional practice analysis.

    PubMed

    Muckle, Timothy J; Apatov, Nathaniel M; Plaus, Karen

    2009-06-01

    The purpose of this column is to present the results of the 2007 Professional Practice Analysis (PPA) of the field of nurse anesthesia, conducted by the Council on Certification of Nurse Anesthetists. The PPA used survey and rating scale methodologies to collect data regarding the relative emphasis of various aspects of the nurse anesthesia knowledge domain and competencies. A total of 3,805 survey responses were analyzed using the Rasch rating scale model, which aggregates and transforms ordinal (rating scale) responses into linear measures of relative importance and frequency. Summaries of respondent demographics and educational and professional background are provided, as well as descriptions of how the survey results are used to develop test specifications. The results of this analysis provide evidence for the content outline and test specifications (content percentages) and thus serve as a basis of content validation for the National Certification Examination.

  16. Documenting Elementary Teachers' Sustainability of Instructional Practices: A Mixed Method Case Study

    NASA Astrophysics Data System (ADS)

    Cotner, Bridget A.

    School reform programs focus on making educational changes; however, research on interventions past the funded implementation phase to determine what was sustained is rarely done (Beery, Senter, Cheadle, Greenwald, Pearson, et al., 2005). This study adds to the research on sustainability by determining what instructional practices, if any, of the Teaching SMART™ professional development program that was implemented from 2005--2008 in elementary schools with teachers in grades three through eight were continued, discontinued, or adapted five years post-implementation (in 2013). Specifically, this study sought to answer the following questions: What do teachers who participated in Teaching SMART™ and district administrators share about the sustainability of Teaching SMART™ practices in 2013? What teaching strategies do teachers who participated in the program (2005--2008) use in their science classrooms five years post-implementation (2013)? What perceptions about the roles of females in science, technology, engineering, and mathematics (STEM) do teachers who participated in the program (2005--2008) have five years later (2013)? And, what classroom management techniques do the teachers who participated in the program (2005--2008) use five years post-implementation (2013)? A mixed method approach was used to answer these questions. Quantitative teacher survey data from 23 teachers who participated in 2008 and 2013 were analyzed in SAS v. 9.3. Descriptive statistics were reported, and paired t-tests were conducted to determine mean differences by survey factors identified from an exploratory factor analysis, principal axis factoring, and parallel analysis conducted with teacher survey baseline data (2005). Individual teacher change scores (2008 and 2013) for identified factors were computed using the Reliable Change Index statistic. Qualitative data consisted of interviews with two district administrators and three teachers who responded to the survey in both

  17. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    PubMed Central

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or “one-shot,” in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

  18. Low hardness organisms: Culture methods, sensitivities, and practical applications

    SciTech Connect

    DaCruz, A.; DaCruz, N.; Bird, M.

    1995-12-31

    EPA regulations require biomonitoring of permitted effluent and stormwater runoff. Several permit locations were studied in Virginia that have supply water and/or stormwater runoff ranging in hardness from 5--30 mg/L. Ceriodaphnia dubia (dubia) and Pimephales promelas (fathead minnow) were tested in reconstituted water with hardnesses from 5--30 mg/L. Results indicated osmotic stresses present in the acute tests with the fathead minnow as well as in the chronic tests for the dubia and the fathead minnow. Culture methods were developed for both organism types in soft (30 mg/L) reconstituted freshwater. Reproduction and development for each organism type meets or exceeds EPA testing requirements for moderately hard organisms. Sensitivities were measured over an 18-month interval using cadmium chloride as a reference toxicant. Additionally, sensitivities were charted in contrast with those of organisms cultured in moderately hard water. The comparison showed that the sensitivities of both the dubia and the fathead minnow cultured in 30 mg/L water increased, but were within two standard deviations of the sensitivities of organisms cultured in moderately hard water. Latitude for use of organisms cultured in 30 mg/L water was documented for waters ranging in hardness from 10--100 mg/L with no acclimation period required. The stability of the organism sensitivity was also validated. The application was most helpful in stormwater runoff and in effluents where the hardness was 30 mg/L or less.

  19. Cross-Continental Reflections on Evaluation Practice: Methods, Use, and Valuing

    ERIC Educational Resources Information Center

    Kallemeyn, Leanne M.; Hall, Jori; Friche, Nanna; McReynolds, Clifton

    2015-01-01

    The evaluation theory tree typology reflects the following three components of evaluation practice: (a) methods, (b) use, and (c) valuing. The purpose of this study was to explore how evaluation practice is conceived as reflected in articles published in the "American Journal of Evaluation" ("AJE") and "Evaluation," a…

  20. A Method of Designing Practical Examinations to Match What Is Taught in Laboratory Activities.

    ERIC Educational Resources Information Center

    Stensvold, Mark S.; Wilson, John T.

    1993-01-01

    Proposes methods by which laboratory practical exams may be structured to assess outcomes from laboratory instruction. Presents eight general considerations for writing and using practical exams. Describes four example laboratory exams involving a box camera, circuit boxes, floating objects, and light. (MDH)

  1. Perceived Barriers and Facilitators to School Social Work Practice: A Mixed-Methods Study

    ERIC Educational Resources Information Center

    Teasley, Martell; Canifield, James P.; Archuleta, Adrian J.; Crutchfield, Jandel; Chavis, Annie McCullough

    2012-01-01

    Understanding barriers to practice is a growing area within school social work research. Using a convenience sample of 284 school social workers, this study replicates the efforts of a mixed-method investigation designed to identify barriers and facilitators to school social work practice within different geographic locations. Time constraints and…

  2. Autoethnography as a Method for Reflexive Research and Practice in Vocational Psychology

    ERIC Educational Resources Information Center

    McIlveen, Peter

    2008-01-01

    This paper overviews the qualitative research method of autoethnography and its relevance to research in vocational psychology and practice in career development. Autoethnography is a reflexive means by which the researcher-practitioner consciously embeds himself or herself in theory and practice, and by way of intimate autobiographic account,…

  3. Method of analysis and quality-assurance practices by the U.S. Geological Survey Organic Geochemistry Research Group; determination of geosmin and methylisoborneol in water using solid-phase microextraction and gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Zimmerman, L.R.; Ziegler, A.C.; Thurman, E.M.

    2002-01-01

    A method for the determination of two common odor-causing compounds in water, geosmin and 2-methylisoborneol, was modified and verified by the U.S. Geological Survey's Organic Geochemistry Research Group in Lawrence, Kansas. The optimized method involves the extraction of odor-causing compounds from filtered water samples using a divinylbenzene-carboxen-polydimethylsiloxane cross-link coated solid-phase microextraction (SPME) fiber. Detection of the compounds is accomplished using capillary-column gas chromatography/mass spectrometry (GC/MS). Precision and accuracy were demonstrated using reagent-water, surface-water, and ground-water samples. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 35 nanograms per liter ranged from 60 to 123 percent for geosmin and from 90 to 96 percent for 2-methylisoborneol. Method detection limits were 1.9 nanograms per liter for geosmin and 2.0 nanograms per liter for 2-methylisoborneol in 45-milliliter samples. Typically, concentrations of 30 and 10 nanograms per liter of geosmin and 2-methylisoborneol, respectively, can be detected by the general public. The calibration range for the method is equivalent to concentrations from 5 to 100 nanograms per liter without dilution. The method is valuable for acquiring information about the production and fate of these odor-causing compounds in water.
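    The accuracy figures in this kind of method validation are percent recoveries of spiked samples, and results are reported only at or above the method detection limit (MDL). A trivial sketch of both computations; the 10 ng/L spike level and the 1.9 ng/L geosmin MDL are from the abstract, while the measured value is invented:

```python
def percent_recovery(measured_ng_per_l, spiked_ng_per_l):
    """Accuracy expressed as a percentage of the true (spiked) value."""
    return 100.0 * measured_ng_per_l / spiked_ng_per_l

def detectable(concentration_ng_per_l, mdl_ng_per_l):
    """A result is reportable only at or above the detection limit."""
    return concentration_ng_per_l >= mdl_ng_per_l

# Hypothetical: a 10 ng/L geosmin spike measured at 9.2 ng/L
print(round(percent_recovery(9.2, 10.0)))   # recovery in percent
print(detectable(9.2, 1.9))                 # above the geosmin MDL
```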

  4. Causal Moderation Analysis Using Propensity Score Methods

    ERIC Educational Resources Information Center

    Dong, Nianbo

    2012-01-01

    This paper is based on previous studies in applying propensity score methods to study multiple treatment variables to examine the causal moderator effect. The propensity score methods will be demonstrated in a case study to examine the causal moderator effect, where the moderators are categorical and continuous variables. Moderation analysis is an…

  5. Measuring solar reflectance Part II: Review of practical methods

    SciTech Connect

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul

    2010-05-14

A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R_g,0 can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R_g,0 to within 0.01 for surface slopes up to 5:12 [23°], and to within 0.02 for surface slopes up to 12:12 [45°]. An alternative pyranometer method minimizes shadow errors and can be used to measure R_g,0 of a surface as small as 1 m in diameter. The accuracy with which it can measure R_g,0 is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R*_g,0, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R*_g,0 matches R_g,0 to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R*_g,0 by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R*_g,0 to within about 0.01.
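The spectrophotometer route computes R*_g,0 by weighting spectral reflectance with AM1GH spectral irradiance. A minimal sketch of such a weighted average on a uniform wavelength grid (both spectra below are synthetic stand-ins, not real measurement data):

```python
import numpy as np

# Synthetic spectral data (wavelength in nm); a real application would use
# measured spectral reflectance and a tabulated AM1GH solar irradiance.
wavelength = np.linspace(300, 2500, 221)                # nm, uniform grid
reflectance = 0.3 + 0.4 * (wavelength > 700)            # step-like cool-roof profile
irradiance = np.exp(-((wavelength - 800) / 600) ** 2)   # crude bell-shaped spectrum

# Solar reflectance = irradiance-weighted mean of spectral reflectance
r_solar = float(np.sum(reflectance * irradiance) / np.sum(irradiance))
print(round(r_solar, 3))
```

Because the weighting spectrum peaks in the near-infrared here, the result falls between the two reflectance plateaus, closer to the infrared value.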

  6. Measuring solar reflectance - Part II: Review of practical methods

    SciTech Connect

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul

    2010-09-15

A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R_g,0 can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R_g,0 to within 0.01 for surface slopes up to 5:12 [23°], and to within 0.02 for surface slopes up to 12:12 [45°]. An alternative pyranometer method minimizes shadow errors and can be used to measure R_g,0 of a surface as small as 1 m in diameter. The accuracy with which it can measure R_g,0 is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R*_g,0, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R*_g,0 matches R_g,0 to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R*_g,0 by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R*_g,0 to within about 0.01.

  7. Articulating current service development practices: a qualitative analysis of eleven mental health projects

    PubMed Central

    2014-01-01

Background The utilisation of good design practices in the development of complex health services is essential to improving quality. Healthcare organisations, however, are often seriously out of step with modern design thinking and practice. As a starting point to encourage the uptake of good design practices, it is important to understand the context of their intended use. This study aims to do that by articulating current health service development practices. Methods Eleven service development projects carried out in a large mental health service were investigated through in-depth interviews with six operation managers. The critical decision method in conjunction with diagrammatic elicitation was used to capture descriptions of these projects. Stage-gate design models were then formed to visually articulate, classify and characterise different service development practices. Results Projects were grouped into three categories according to design process patterns: new service introduction and service integration; service improvement; service closure. Three common design stages (problem exploration, idea generation and solution evaluation) were then compared across the design process patterns. Consistent across projects were a top-down, policy-driven approach to exploration, underexploited idea generation and implementation-based evaluation. Conclusions This study provides insight into where and how good design practices can contribute to the improvement of current service development practices. Specifically, the following suggestions for future service development practices are made: genuine user needs analysis for exploration; divergent thinking and innovative culture for idea generation; and fail-safe evaluation prior to implementation. Better training for managers through partnership working with design experts and researchers could be beneficial. PMID:24438471

  8. A practical method of estimating standard error of age in the fission track dating method

    USGS Publications Warehouse

    Johnson, N.M.; McGee, V.E.; Naeser, C.W.

    1979-01-01

A first-order approximation formula for the propagation of error in the fission-track age equation is given by PA = C[Ps² + Pi² + Pφ² − 2rPsPi]^(1/2), where PA, Ps, Pi and Pφ are the percentage errors of age, of spontaneous track density, of induced track density, and of neutron dose, respectively, and C is a constant. The correlation, r, between spontaneous and induced track densities is a crucial element in the error analysis, acting generally to improve the standard error of age. In addition, the correlation parameter r is instrumental in specifying the level of neutron dose, a controlled variable, which will minimize the standard error of age. The results from the approximation equation agree closely with the results from an independent statistical model for the propagation of errors in the fission-track dating method.
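The approximation formula above can be evaluated directly; the worked values below use hypothetical percentage errors, with C taken as 1 for illustration:

```python
import math

def percent_error_age(p_s, p_i, p_phi, r, c=1.0):
    """First-order percentage error of a fission-track age:
    PA = C * sqrt(Ps^2 + Pi^2 + Pphi^2 - 2*r*Ps*Pi)."""
    return c * math.sqrt(p_s**2 + p_i**2 + p_phi**2 - 2 * r * p_s * p_i)

# As the abstract notes, positive correlation between spontaneous and
# induced track densities acts to improve (reduce) the error of age.
uncorrelated = percent_error_age(5.0, 5.0, 2.0, r=0.0)
correlated = percent_error_age(5.0, 5.0, 2.0, r=0.8)
print(round(uncorrelated, 2), round(correlated, 2))
```

With these sample inputs the correlated case shrinks the age error by roughly half, showing why r matters when choosing the neutron dose.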

  9. Methods for Analysis of Outdoor Performance Data (Presentation)

    SciTech Connect

    Jordan, D.

    2011-02-01

The ability to accurately predict power delivery over the course of time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted to power and how this relationship develops over time. The accurate knowledge of power decline over time, also known as degradation rates, is essential and important to all stakeholders--utility companies, integrators, investors, and scientists alike. Different methods to determine degradation rates and discrete versus continuous data are presented, and some general best practice methods are outlined. In addition, historical degradation rates and some preliminary analysis with respect to climate are given.

  10. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2002-01-01

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
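The hybrid idea described in this record can be sketched in a few lines: a classical least squares (CLS) fit against known pure-component spectra is augmented with the spectral shape of a source of variation that was absent from calibration. The illustration below uses synthetic Gaussian "spectra" and hypothetical concentrations; it is a sketch of the concept, not the patented procedure:

```python
import numpy as np

n_channels = 50
x = np.arange(n_channels)

# Hypothetical pure-component spectra for two calibrated analytes (CLS step)
k1 = np.exp(-((x - 15) / 4.0) ** 2)
k2 = np.exp(-((x - 35) / 4.0) ** 2)
K = np.column_stack([k1, k2])

# Measured sample: the calibrated analytes plus an uncalibrated interferent
interferent = np.exp(-((x - 25) / 6.0) ** 2)
spectrum = 2.0 * k1 + 0.5 * k2 + 1.5 * interferent

# Plain CLS estimate is biased by the un-modeled interferent
c_plain, *_ = np.linalg.lstsq(K, spectrum, rcond=None)

# Hybrid step: augment the model with the interferent's spectral shape
K_aug = np.column_stack([K, interferent])
c_hybrid, *_ = np.linalg.lstsq(K_aug, spectrum, rcond=None)
print(c_plain[:2], c_hybrid[:2])
```

In this noiseless example the augmented fit recovers the true concentrations (about 2.0 and 0.5), while the plain fit absorbs part of the interferent into the analyte estimates.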

  11. A mixed methods exploration of the team and organizational factors that may predict new graduate nurse engagement in collaborative practice.

    PubMed

    Pfaff, Kathryn A; Baxter, Pamela E; Ploeg, Jenny; Jack, Susan M

    2014-03-01

    Although engagement in collaborative practice is reported to support the role transition and retention of new graduate (NG) nurses, it is not known how to promote collaborative practice among these nurses. This mixed methods study explored the team and organizational factors that may predict NG nurse engagement in collaborative practice. A total of 514 NG nurses from Ontario, Canada completed the Collaborative Practice Assessment Tool. Sixteen NG nurses participated in follow-up interviews. The team and organizational predictors of NG engagement in collaborative practice were as follows: satisfaction with the team (β = 0.278; p = 0.000), number of team strategies (β = 0.338; p = 0.000), participation in a mentorship or preceptorship experience (β = 0.137; p = 0.000), accessibility of manager (β = 0.123; p = 0.001), and accessibility and proximity of educator or professional practice leader (β = 0.126; p = 0.001 and β = 0.121; p = 0.002, respectively). Qualitative analysis revealed the team facilitators to be respect, team support and face-to-face interprofessional interactions. Organizational facilitators included supportive leadership, participation in a preceptorship or mentorship experience and time. Interventions designed to facilitate NG engagement in collaborative practice should consider these factors. PMID:24195680

  12. Imaging Laser Analysis of Building Materials—Practical Examples

    NASA Astrophysics Data System (ADS)

    Wilsch, G.; Schaurich, D.; Wiggenhauser, H.

    2011-06-01

Laser-induced Breakdown Spectroscopy (LIBS) is a supplement to and extension of standard chemical methods and SEM or micro-RFA applications for the evaluation of building materials. As a laboratory method, LIBS is used to produce color-coded images representing the composition and distribution of characteristic ions and/or the ingress of damaging substances. To create a depth profile of element concentration, a core has to be taken and split along the core axis. LIBS has been proven able to detect all important elements in concrete, e.g. chlorine, sodium or sulfur, which are responsible for certain degradation mechanisms, and also light elements like lithium or hydrogen. Practical examples are given and a mobile system for on-site measurements is presented.

  13. Imaging laser analysis of building materials - practical examples

    SciTech Connect

    Wilsch, G.; Schaurich, D.; Wiggenhauser, H.

    2011-06-23

Laser-induced Breakdown Spectroscopy (LIBS) is a supplement to and extension of standard chemical methods and SEM or micro-RFA applications for the evaluation of building materials. As a laboratory method, LIBS is used to produce color-coded images representing the composition and distribution of characteristic ions and/or the ingress of damaging substances. To create a depth profile of element concentration, a core has to be taken and split along the core axis. LIBS has been proven able to detect all important elements in concrete, e.g. chlorine, sodium or sulfur, which are responsible for certain degradation mechanisms, and also light elements like lithium or hydrogen. Practical examples are given and a mobile system for on-site measurements is presented.

  14. Analysis of reticulocyte counts using various methods.

    PubMed

    McKenzie, S B; Gauger, C A

    1991-01-01

The precision and accuracy of manual reticulocyte counts using a Miller disc reticle, another ruled reticle, and no reticle are compared with the reticulocyte results from the automated Hematrak 590 instrument. Two slides of each of 50 patient blood specimens were sent to the hematology laboratories of each of six participating hospitals. In addition to between-method comparison (precision), the manual method results using the three different counting techniques were each compared with the Hematrak results to determine if there were significant differences in reported results (accuracy). Statistical analysis revealed that the Miller disc method was the most precise and accurate manual method as compared with the Hematrak. Methods without a Miller disc reported significantly higher reticulocyte counts. Imprecision was also higher among non-Miller manual methods. By using the Miller disc, the accuracy and precision of manual methods may be increased to that of the automated Hematrak method. PMID:10149411
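The Miller disc count rests on a simple area ratio: the small square covers one-ninth the area of the large square, so RBCs counted in the small square are scaled by 9. A minimal sketch of that calculation (the counts below are hypothetical):

```python
def miller_disc_retic_percent(retics_large_square, rbc_small_square):
    """Manual reticulocyte percentage with a Miller disc.
    The small square is 1/9 the area of the large square, so the RBC
    count in the small square is multiplied by 9 to estimate the RBC
    count over the same area in which reticulocytes were tallied."""
    return retics_large_square / (rbc_small_square * 9) * 100

# e.g. 30 reticulocytes seen in the large squares, 112 RBCs in the small squares
print(round(miller_disc_retic_percent(30, 112), 1))
```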

  15. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  16. Professional Suitability for Social Work Practice: A Factor Analysis

    ERIC Educational Resources Information Center

    Tam, Dora M. Y.; Coleman, Heather; Boey, Kam-Wing

    2012-01-01

    Objective: The purpose of this study was to identify the underlying dimensions of professional suitability. Method: Data were collected from a province-wide mail-out questionnaire surveying 341 participants from a random sample of registered social workers. Results: The use of an exploratory factor analysis identified a 5-factor solution on…

  17. Dynamic mechanical analysis: A practical introduction to techniques and applications

    SciTech Connect

    Menard, K.

    1999-01-01

    This book introduces DMA, its history, and its current position as part of thermal analysis on polymers. It discusses major types of instrumentation, including oscillatory rotational, oscillatory axial, and torsional pendulum. It also describes analytical techniques in terms of utility, quality of data, methods of calibration, and suitability for different types of materials and assesses applications for thermoplastics, thermosetting systems, and thermosets.

  18. Lipidomic data analysis: tutorial, practical guidelines and applications.

    PubMed

    Checa, Antonio; Bedia, Carmen; Jaumot, Joaquim

    2015-07-23

Lipids are a broad group of biomolecules involved in diverse critical biological roles such as cellular membrane structure, energy storage or cell signaling and homeostasis. Lipidomics is the -omics science that pursues the comprehensive characterization of lipids present in a biological sample. Different analytical strategies such as nuclear magnetic resonance or mass spectrometry with or without previous chromatographic separation are currently used to analyze the lipid composition of a sample. However, current analytical techniques provide a vast amount of data which complicates the interpretation of results without the use of advanced data analysis tools. The choice of the appropriate chemometric method is essential to extract valuable information from the crude data as well as to interpret the lipidomic results in the biological context studied. The present work summarizes the diverse methods of analysis that can be used to study lipidomic data, from statistical inference tests to more sophisticated multivariate analysis methods. In addition to the theoretical description of the methods, application of various methods to a particular lipidomic data set as well as literature examples are presented.

  19. Lipidomic data analysis: tutorial, practical guidelines and applications.

    PubMed

    Checa, Antonio; Bedia, Carmen; Jaumot, Joaquim

    2015-07-23

Lipids are a broad group of biomolecules involved in diverse critical biological roles such as cellular membrane structure, energy storage or cell signaling and homeostasis. Lipidomics is the -omics science that pursues the comprehensive characterization of lipids present in a biological sample. Different analytical strategies such as nuclear magnetic resonance or mass spectrometry with or without previous chromatographic separation are currently used to analyze the lipid composition of a sample. However, current analytical techniques provide a vast amount of data which complicates the interpretation of results without the use of advanced data analysis tools. The choice of the appropriate chemometric method is essential to extract valuable information from the crude data as well as to interpret the lipidomic results in the biological context studied. The present work summarizes the diverse methods of analysis that can be used to study lipidomic data, from statistical inference tests to more sophisticated multivariate analysis methods. In addition to the theoretical description of the methods, application of various methods to a particular lipidomic data set as well as literature examples are presented. PMID:26231889

  20. Bioanalytical methods for food contaminant analysis.

    PubMed

    Van Emon, Jeanette M

    2010-01-01

    Foods are complex mixtures of lipids, carbohydrates, proteins, vitamins, organic compounds, and other naturally occurring substances. Sometimes added to this mixture are residues of pesticides, veterinary and human drugs, microbial toxins, preservatives, contaminants from food processing and packaging, and other residues. This milieu of compounds can pose difficulties in the analysis of food contaminants. There is an expanding need for rapid and cost-effective residue methods for difficult food matrixes to safeguard our food supply. Bioanalytical methods are established for many food contaminants such as mycotoxins and are the method of choice for many food allergens. Bioanalytical methods are often more cost-effective and sensitive than instrumental procedures. Recent developments in bioanalytical methods may provide more applications for their use in food analysis.

  1. Method and apparatus for ceramic analysis

    DOEpatents

    Jankowiak, Ryszard J.; Schilling, Chris; Small, Gerald J.; Tomasik, Piotr

    2003-04-01

The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method provides for analyzing the density of a ceramic by exciting a component on a surface/subsurface of the ceramic through exposure to excitation energy. The method may further include the step of obtaining a measurement of an emitted energy from the component. The method may additionally include comparing the measurement of the emitted energy from the component with a predetermined reference measurement so as to obtain a density for said ceramic.

  2. Updating a Meta-Analysis of Intervention Research with Challenging Behaviour: Treatment Validity and Standards of Practice

    ERIC Educational Resources Information Center

    Harvey, Shane T.; Boer, Diana; Meyer, Luanna H.; Evans, Ian M.

    2009-01-01

    Background: This meta-analysis of interventions with challenging behaviour in children with disabilities updates a comprehensive meta-analysis that previously addressed reported standards of practice and effectiveness of different strategies. Method: Four effect-size algorithms were calculated for published intervention cases, and results analysed…

  3. Methods for impact analysis of shipping containers

    SciTech Connect

    Nelson, T.A.; Chun, R.C.

    1987-11-01

This report reviews methods for performing impact stress analyses of shipping containers used to transport spent fuel. The three methods discussed are quasi-static, dynamic lumped parameter, and dynamic finite element. These methods are used by industry for performing impact analyses for Safety Analysis Reports. The approach for each method is described, including assumptions, limitations, and modeling considerations. The effects of uncertainties in the modeling and analysis of casks are identified. Each of the methods uses linear elastic structural analysis principles. Methods for interfacing impact stresses with the design and load combination criteria specified in Regulatory Guides 7.6 and 7.8 are outlined. The quasi-static method is based on D'Alembert's principle to substitute equivalent static forces for the inertial forces created by the impact. The lumped parameter method is based on using a discrete number of stiffness elements and masses to represent the cask during impact. The dynamic finite element method uses finite element techniques combined with time integration to analyze the cask impact. Each of these methods can provide an acceptable means, within certain limitations, for analyzing cask impact on unyielding surfaces. 25 refs., 23 figs.
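The quasi-static approach described above replaces the inertial force of impact with an equivalent static force via D'Alembert's principle. A minimal sketch (mass, drop height and crush distance are hypothetical; real cask analyses use detailed stiffness models and the load combination criteria of the cited regulatory guides):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def equivalent_static_force(mass_kg, drop_height_m, stop_distance_m):
    """Quasi-static impact load via D'Alembert's principle: the inertial
    force m*a is applied as an equivalent static force. Deceleration is
    estimated from the impact velocity and an assumed uniform stopping
    distance (the crush of the impact limiter)."""
    v_squared = 2 * G * drop_height_m            # impact velocity squared
    a = v_squared / (2 * stop_distance_m)        # uniform deceleration
    return mass_kg * a                           # equivalent static force, N

# Hypothetical 50 t cask dropped from 9 m, stopped over 0.05 m of crush
f = equivalent_static_force(50_000, 9.0, 0.05)
print(f"{f / 1e6:.1f} MN")
```

Note how the result scales inversely with the stopping distance: a stiffer impact limiter (smaller crush) yields a proportionally larger equivalent static load.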

  4. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2004-03-23

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.

  5. Advanced analysis methods in particle physics

    SciTech Connect

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  6. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2010-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  7. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2009-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  8. Recruitment ad analysis offers new opportunities to attract GPs to short-staffed practices.

    PubMed

    Hemphill, Elizabeth; Kulik, Carol T

    2013-01-01

    As baby-boomer practitioners exit the workforce, physician shortages present new recruitment challenges for practices seeking GPs. This article reports findings from two studies examining GP recruitment practice. GP recruitment ad content analysis (Study 1) demonstrated that both Internet and print ads emphasize job attributes but rarely present family or practice attributes. Contacts at these medical practices reported that their practices offer distinctive family and practice attributes that could be exploited in recruitment advertising (Study 2). Understaffed medical practices seeking to attract GPs may differentiate their job offerings in a crowded market by incorporating family and/or practice attributes into their ads. PMID:23697854

  9. Chromatographic methods for analysis of triazine herbicides.

    PubMed

    Abbas, Hana Hassan; Elbashir, Abdalla A; Aboul-Enein, Hassan Y

    2015-01-01

Gas chromatography (GC) and high-performance liquid chromatography (HPLC) coupled to different detectors, and in combination with different sample extraction methods, are most widely used for analysis of triazine herbicides in different environmental samples. Nowadays, many variations and modifications of extraction and sample preparation methods such as solid-phase microextraction (SPME), hollow fiber-liquid phase microextraction (HF-LPME), stir bar sorptive extraction (SBSE), headspace solid-phase microextraction (HS-SPME), dispersive liquid-liquid microextraction (DLLME), dispersive liquid-liquid microextraction based on solidification of floating organic droplet (DLLME-SFO), ultrasound-assisted emulsification microextraction (USAEME), and others have been introduced and developed to obtain sensitive and accurate methods for the analysis of these hazardous compounds. In this review, several analytical properties such as linearity, sensitivity, repeatability, and accuracy for each developed method are discussed, and excellent results were obtained for most of the developed methods combined with GC and HPLC techniques for the analysis of triazine herbicides. This review gives an overview of recent publications on the application of GC and HPLC for analysis of triazine herbicide residues in various samples.

  10. Design analysis, robust methods, and stress classification

    SciTech Connect

    Bees, W.J.

    1993-01-01

This special edition publication volume comprises papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25--29, 1993 in Denver, Colorado. The papers were prepared for presentation in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

  11. ATMOS data processing and science analysis methods.

    PubMed

    Norton, R H; Rinsland, C P

    1991-02-01

The ATMOS (atmospheric trace molecule spectroscopy) instrument, a high speed Fourier transform spectrometer operating in the middle IR (2.2-16 μm), recorded more than 1500 solar spectra at approximately 0.0105 cm⁻¹ resolution during its first mission onboard the shuttle Challenger in the spring of 1985. These spectra were acquired during high sun conditions for studies of the solar atmosphere and during low sun conditions for studies of the earth's upper atmosphere. This paper describes the steps by which the telemetry data were converted into spectra suitable for analysis, the analysis software and methods developed for the atmospheric and solar studies, and the ATMOS data analysis facility.

  12. Numerical analysis of the orthogonal descent method

    SciTech Connect

    Shokov, V.A.; Shchepakin, M.B.

    1994-11-01

    The author of the orthogonal descent method has been testing it since 1977. The results of these tests have only strengthened the need for further analysis and development of orthogonal descent algorithms for various classes of convex programming problems. Systematic testing of orthogonal descent algorithms and comparison of test results with other nondifferentiable optimization methods was conducted at TsEMI RAN in 1991-1992 using the results.

  13. [Analysis of an intercultural clinical practice in a judicial setting].

    PubMed

    Govindama, Yolande

    2007-01-01

    This article analyses an intercultural clinical practice in a judicial setting from an anthropological and psychoanalytical perspective, demonstrating the necessary reorganizations inherent to the framework. Since the culture of the new country and its founding myth are implicit in the judicial framework, the intervening professional introduces psychoanalytical references, particularly totemic principles and the symbolic father, by making genealogy, a universal object of transmission, the guarantee of humanity's fundamental taboos. The metacultural perspective of this approach integrates the ethnopsychoanalytical principles and method put forth by Devereux, although the latter has been adapted to the framework. This approach allows Devereux's ethnopsychoanalytical principles to be re-questioned, opening the debate on a perspective that is as much psychoanalytical as psychiatric.

  14. [Disposal of drugs: an analysis of the practices in the family health program].

    PubMed

    Alencar, Tatiane de Oliveira Silva; Machado, Carla Silva Rocha; Costa, Sônia Carine Cova; Alencar, Bruno Rodrigues

    2014-07-01

    The scope of this article is to discuss the perception of health workers in relation to the disposal of drugs and analyze how this practice occurs in family health units in a city in the state of Bahia. It involved a qualitative and exploratory study together with nurses, nursing assistants, community health workers and pharmacists of Pharmaceutical Care and Health Surveillance. Semi-structured interviews were conducted with systematic observation and use of previously-drafted scripts and the content analysis method was used for data analysis. The results showed poor understanding regarding proper disposal among the workers, dissonant practices in the implementation of the regulations and a lack of communication between health surveillance and other health services. The creation of effective strategies must involve the whole process from management to the prescription and use of drugs and requires further political, economic and social participation.

  15. A practical guide to value of information analysis.

    PubMed

    Wilson, Edward C F

    2015-02-01

    Value of information analysis is a quantitative method to estimate the return on investment in proposed research projects. It can be used in a number of ways. Funders of research may find it useful to rank projects in terms of the expected return on investment from a variety of competing projects. Alternatively, trialists can use the principles to identify the efficient sample size of a proposed study as an alternative to traditional power calculations, and finally, a value of information analysis can be conducted alongside an economic evaluation as a quantitative adjunct to the 'future research' or 'next steps' section of a study write up. The purpose of this paper is to present a brief introduction to the methods, a step-by-step guide to calculation and a discussion of issues that arise in their application to healthcare decision making. Worked examples are provided in the accompanying online appendices as Microsoft Excel spreadsheets.
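The per-patient EVPI computation described above reduces to comparing the expected payoff of deciding with perfect information against the payoff of the best decision under current information. A minimal sketch, using simulated net-benefit draws for two hypothetical treatments (all numbers are illustrative, not taken from the paper's examples):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated net monetary benefit for two hypothetical treatments,
# one row per draw from the uncertain model parameters.
nb = np.column_stack([
    rng.normal(10_000, 2_000, 10_000),  # treatment A
    rng.normal(10_500, 3_000, 10_000),  # treatment B
])

# With current information we pick the option with the best average
# net benefit; with perfect information we pick the best option per draw.
evpi = nb.max(axis=1).mean() - nb.mean(axis=0).max()
print(f"per-patient EVPI: {evpi:.0f}")
```

Population EVPI would scale this per-patient figure by the discounted number of patients affected by the decision.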

  16. Protein-protein interactions: methods for detection and analysis.

    PubMed Central

    Phizicky, E M; Fields, S

    1995-01-01

    The function and activity of a protein are often modulated by other proteins with which it interacts. This review is intended as a practical guide to the analysis of such protein-protein interactions. We discuss biochemical methods such as protein affinity chromatography, affinity blotting, coimmunoprecipitation, and cross-linking; molecular biological methods such as protein probing, the two-hybrid system, and phage display; and genetic methods such as the isolation of extragenic suppressors, synthetic mutants, and unlinked noncomplementing mutants. We next describe how binding affinities can be evaluated by techniques including protein affinity chromatography, sedimentation, gel filtration, fluorescence methods, solid-phase sampling of equilibrium solutions, and surface plasmon resonance. Finally, three examples of well-characterized domains involved in multiple protein-protein interactions are examined. The emphasis of the discussion is on variations in the approaches, concerns in evaluating the results, and advantages and disadvantages of the techniques. PMID:7708014

  17. Displacement Monitoring and Sensitivity Analysis in the Observational Method

    NASA Astrophysics Data System (ADS)

    Górska, Karolina; Muszyński, Zbigniew; Rybak, Jarosław

    2013-09-01

    This work discusses the fundamentals of designing deep excavation support by means of the observational method. Inclinometric and geodetic monitoring are effective tools for optimum design with the observational method, providing data for systematically updated calibration of the numerical computational model. The analysis included methods for selecting data for the design (by choosing the basic random variables), as well as methods for ongoing verification of the results of numerical calculations (e.g., FEM) by measuring structure displacement with geodetic and inclinometric techniques. The presented example shows the sensitivity analysis of the calculation model for a cantilever wall in non-cohesive soil; that analysis makes it possible to select the data to be later subject to calibration. The paper presents the results of measurements of sheet pile wall displacement, carried out by the inclinometric method and, simultaneously, by two geodetic methods, as the excavation was successively deepened. This work also includes critical comments regarding the usefulness of the obtained data, as well as practical aspects of taking measurements during ongoing construction works.

  18. A mixed methods study of food safety knowledge, practices and beliefs in Hispanic families with young children.

    PubMed

    Stenger, Kristen M; Ritter-Gooder, Paula K; Perry, Christina; Albrecht, Julie A

    2014-12-01

    Children are at a higher risk for foodborne illness. The objective of this study was to explore food safety knowledge, beliefs and practices among Hispanic families with young children (≤10 years of age) living within a Midwestern state. A convergent mixed methods design collected qualitative and quantitative data in parallel. Food safety knowledge surveys were administered (n = 90) prior to exploration of beliefs and practices among six focus groups (n = 52) conducted by bilingual interpreters in community sites in five cities/towns. Descriptive statistics determined knowledge scores, and thematic coding unveiled beliefs and practices. Data sets were merged to assess concordance. Participants were female (96%), 35.7 (±7.6) years of age, from Mexico (69%), with the majority having a low education level. Food safety knowledge was low (56% ± 11). Focus group themes were: Ethnic dishes popular, Relating food to illness, Fresh food in home country, Food safety practices, and Face-to-face learning. Mixed methods analysis revealed high self-confidence in preparing food safely despite low safe food handling knowledge, along with the presence of some cultural beliefs. On-site Spanish classes and materials were the preferred venues for food safety education. Bilingual food safety messaging targeting common ethnic foods and cultural beliefs and practices is indicated to lower the risk of foodborne illness in Hispanic families with young children.

  19. Beginning secondary science teachers' classroom roles and instructional methods: An exploratory study of conflicts within practical theories

    NASA Astrophysics Data System (ADS)

    Rearden, Kristin Theresa

    There are a myriad of factors which influence a teacher's classroom behaviors. Taken together, these factors are referred to as a teacher's practical theory. Some of the elements of practical theories are perceptions regarding classroom role, impressions of student abilities, reflection on experiences, and content knowledge. First-year teachers, or beginning teachers, are faced with many new challenges as they embark on their endeavor to facilitate the learning of their students. The congruence of the elements within their practical theories of teaching can provide the foundation for consistency within their classroom practices. The researcher investigated two aspects of the practical theories of beginning secondary science teachers. The first aspect was teachers' perceptions of their roles in the classroom. The second aspect was teachers' intended instructional methods. Interview data from 27 beginning secondary science teachers who earned their teacher certification from one of three institutions were used for the study. The interviews were analyzed for information regarding the aforementioned aspects. An interview theme analysis (Hewson, Kerby, & Cook, 1995) was completed for each teacher. The characterization of each teacher's role was based on three categories outlined by Fenstermacher and Soltis (1986): Executive, Therapist, and Liberationist. In describing their classroom role, most of the teachers alluded to an Executive-type approach to teaching, in which their concerns regarding conveyance of content, processes or skills were paramount. In many cases, they mentioned the use of more than one instructional method; topics and variability in student learning styles accounted for the implementation of multiple methods. Methods usually included activities or hands-on experiences. Some teachers mentioned a certain "feel" of the classroom that was necessary for student learning.
More than two-thirds of the teachers either expressed conflicts in their interview or

  20. A practical guide to environmental association analysis in landscape genomics.

    PubMed

    Rellstab, Christian; Gugerli, Felix; Eckert, Andrew J; Hancock, Angela M; Holderegger, Rolf

    2015-09-01

    Landscape genomics is an emerging research field that aims to identify the environmental factors that shape adaptive genetic variation and the gene variants that drive local adaptation. Its development has been facilitated by next-generation sequencing, which allows for screening thousands to millions of single nucleotide polymorphisms in many individuals and populations at reasonable costs. In parallel, data sets describing environmental factors have greatly improved and increasingly become publicly accessible. Accordingly, numerous analytical methods for environmental association studies have been developed. Environmental association analysis identifies genetic variants associated with particular environmental factors and has the potential to uncover adaptive patterns that are not discovered by traditional tests for the detection of outlier loci based on population genetic differentiation. We review methods for conducting environmental association analysis including categorical tests, logistic regressions, matrix correlations, general linear models and mixed effects models. We discuss the advantages and disadvantages of different approaches, provide a list of dedicated software packages and their specific properties, and stress the importance of incorporating neutral genetic structure in the analysis. We also touch on additional important aspects such as sampling design, environmental data preparation, pooled and reduced-representation sequencing, candidate-gene approaches, linearity of allele-environment associations and the combination of environmental association analyses with traditional outlier detection tests. We conclude by summarizing expected future directions in the field, such as the extension of statistical approaches, environmental association analysis for ecological gene annotation, and the need for replication and post hoc validation studies.
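Of the association methods the review lists, logistic regression is the simplest to illustrate: regress allele presence at a locus on an environmental factor. A hedged sketch on simulated data (effect size and sample size are arbitrary choices; a real analysis would also correct for neutral genetic structure, as the review stresses):

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated data: allele presence tracks a standardized environmental factor.
env = rng.normal(size=400)
true_logit = -0.5 + 1.5 * env
allele = rng.random(400) < 1 / (1 + np.exp(-true_logit))

# Logistic regression fitted by Newton-Raphson (iteratively reweighted
# least squares); beta[1] estimates the allele-environment association.
X = np.column_stack([np.ones(400), env])
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (allele - p)
    hess = X.T @ (X * (p * (1 - p))[:, None])
    beta += np.linalg.solve(hess, grad)

print("estimated slope:", beta[1])  # true value used in simulation: 1.5
```

In practice this single-locus test would be repeated genome-wide with multiple-testing correction, which is where the dedicated software packages the review surveys come in.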

  1. Methods to enhance compost practices as an alternative to waste disposal

    SciTech Connect

    Stuckey, H.T.; Hudak, P.F.

    1998-12-31

    Creating practices that are ecologically friendly, economically profitable, and ethically sound is a concept that is slowly beginning to unfold in modern society. In developing such practices, the authors challenge long-lived human behavior patterns and environmental management practices. In this paper, they trace the history of human waste production, describe problems associated with such waste, and explore regional coping mechanisms. Composting projects in north central Texas demonstrate new methods for waste disposal. The authors studied projects conducted by municipalities, schools, agricultural organizations, and individual households. These efforts were examined within the context of regional and statewide solid waste plans. They conclude that: (1) regional composting in north central Texas will substantially reduce the waste stream entering landfills; (2) public education is paramount to establishing alternative waste disposal practices; and (3) new practices for compost will catalyze widespread and efficient production.

  2. Organizational climate and hospital nurses' caring practices: a mixed-methods study.

    PubMed

    Roch, Geneviève; Dubois, Carl-Ardy; Clarke, Sean P

    2014-06-01

    Organizational climate in healthcare settings influences patient outcomes, but its effect on nursing care delivery remains poorly understood. In this mixed-methods study, nurse surveys (N = 292) were combined with a qualitative case study of 15 direct-care registered nurses (RNs), nursing personnel, and managers. Organizational climate explained 11% of the variation in RNs' reported frequency of caring practices. Qualitative data suggested that caring practices were affected by the interplay of organizational climate dimensions with patient and nurse characteristics. Workload intensity and role ambiguity led RNs to leave many caring practices to practical nurses and assistive personnel. Systemic interventions are needed to improve organizational climate and to support RNs' involvement in a full range of caring practices.

  3. Preparing students to practice evidence-based dentistry: a mixed methods conceptual framework for curriculum enhancement.

    PubMed

    Palcanis, Kent G; Geiger, Brian F; O'Neal, Marcia R; Ivankova, Nataliya V; Evans, Retta R; Kennedy, Lasonja B; Carera, Karen W

    2012-12-01

    This article describes a mixed methods conceptual framework for evidence-based dentistry to enhance the curriculum at the University of Alabama at Birmingham School of Dentistry. A focus of recent curriculum reform has been to prepare students to integrate evidence-based dentistry into clinical practice. The authors developed a framework consisting of four conceptual phases to introduce curriculum innovation: 1) exploration of the phenomenon; 2) development of two new instruments; 3) data collection, analysis, outcomes, and evaluation; and 4) application to curricular reform. Eight sequential procedural steps (literature review; focus group discussions; development of themes; survey design; internal review; data collection, analysis, and evaluation; development of recommendations with external review; and implementation of recommendations for curricular enhancement) guided the curricular enhancement. Faculty members supported the concept of teaching evidence-based dentistry to facilitate major curriculum reform, and course directors incorporated evidence-based teaching to prepare scientist-practitioners who meet dental performance standards. The new curriculum implemented following completion of the study is in its third year. Much of its structure is based on evidence-based teaching methodologies, and approximately one-third of the content consists of small groups researching clinical problems with applied science and discussing the findings. The framework described in this article proved useful to guide revision of predoctoral clinical education at one dental school and may be useful in other settings.

  4. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

    Methods and apparatus are disclosed for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.

  5. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M.; Ng, Esmond G.

    1998-01-01

    Methods and apparatus automatically detect differences between similar but different states in a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.
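The middle steps of the patented pipeline (obtain nonlinear measures via chaotic time series analysis, then track their trends) can be illustrated with one classic measure, the correlation sum over a time-delay embedding. This is a generic sketch on synthetic signals, not the patent's specific measure:

```python
import numpy as np

def correlation_sum(x, dim=3, tau=1, r=0.2):
    """Fraction of embedded point pairs closer than radius r,
    computed on a time-delay embedding of the series."""
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    off_diag = ~np.eye(n, dtype=bool)
    return float((dist[off_diag] < r).mean())

rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 500)
baseline = np.sin(t)                                 # regular regime
changed = np.sin(t) + 0.5 * rng.normal(size=t.size)  # noisier regime

# A shift in the measure between data windows signals a change of state.
cs_baseline = correlation_sum(baseline)
cs_changed = correlation_sum(changed)
print(cs_baseline, cs_changed)
```

In a monitoring setting the measure would be computed over sliding windows and its time-serial trend compared against the baseline state, as the abstract describes.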

  6. Systems and methods for sample analysis

    SciTech Connect

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-10-20

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  7. Systems and methods for sample analysis

    DOEpatents

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-01-13

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  8. Methods for Chemical Analysis of Fresh Waters.

    ERIC Educational Resources Information Center

    Golterman, H. L.

    This manual, one of a series prepared for the guidance of research workers conducting studies as part of the International Biological Programme, contains recommended methods for the analysis of fresh water. The techniques are grouped in the following major sections: Sample Taking and Storage; Conductivity, pH, Oxidation-Reduction Potential,…

  9. Research Methods Textbooks: An Objective Analysis.

    ERIC Educational Resources Information Center

    Jackson, Sherri L.; Lugo, Susan M.; Griggs, Richard A.

    2001-01-01

    Presents an analysis of undergraduate methods course textbooks (n=26) published in the United States with copyright dates from 1995-1999. Examines aspects of the textbooks, such as demographic qualities, use of pedagogical aids and illustrative material, and topic coverage. Includes the results in detail. (CMK)

  10. Multiple predictor smoothing methods for sensitivity analysis.

    SciTech Connect

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
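The idea behind smoothing-based sensitivity analysis can be sketched with a crude locally weighted (LOESS-style) fit: smooth the model output against one input at a time and compare the variance explained. This hand-rolled version only gestures at the stepwise procedures the paper describes:

```python
import numpy as np

def loess_r2(x, y, frac=0.2):
    """Variance of y explained by a locally weighted linear fit on a
    single input x (tricube weights over the k nearest neighbours)."""
    n = len(x)
    k = max(2, int(frac * n))
    yhat = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]
        w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3   # tricube kernel
        coef = np.polyfit(x[idx], y[idx], 1, w=np.sqrt(w))
        yhat[i] = np.polyval(coef, x[i])
    return 1 - np.var(y - yhat) / np.var(y)

rng = np.random.default_rng(1)
x1, x2 = rng.uniform(-1, 1, (2, 300))
y = np.sin(3 * x1) + 0.1 * rng.normal(size=300)  # y depends on x1 only

r_x1, r_x2 = loess_r2(x1, y), loess_r2(x2, y)
print(r_x1, r_x2)  # the influential input scores much higher
```

A linear-regression-based sensitivity measure would miss the sin(3·x1) relationship here, which is the nonlinearity argument the abstract makes for smoothing procedures.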

  11. Contraceptive Method Initiation: Using the Centers for Disease Control and Prevention Selected Practice Guidelines.

    PubMed

    Wu, Wan-Ju; Edelman, Alison

    2015-12-01

    The US Selected Practice Recommendations is a companion document to the Medical Eligibility Criteria for Contraceptive Use that focuses on how providers can use contraceptive methods most effectively as well as problem-solve common issues that may arise. These guidelines serve to help clinicians provide contraception safely as well as to decrease barriers that prevent or delay a woman from obtaining a desired method. This article summarizes the Selected Practice Recommendations on timing of contraceptive initiation, examinations, and tests needed prior to starting a method and any necessary follow-up.

  12. "Movement Doesn't Lie": Teachers' Practice Choreutical Analysis

    ERIC Educational Resources Information Center

    Pastore, Serafina; Pentassuglia, Monica

    2015-01-01

    Identifying and describing teaching practice is not an easy task. Current educational research aims at explaining teachers' work focusing on the concept of practice. Teachers' practical knowledge is a sensitive and tacit knowledge, produced, and effused by the body. In this perspective, the teachers' work can be considered as an expressive…

  13. Practical Recommendations to Improve the Quality of Training and Methodical Support of Professional Teacher Education

    ERIC Educational Resources Information Center

    Grebennikov, Valery V.; Grudtsina, Ludmila Yu.; Marchuk, Nikolay N.; Sangadgiev, Badma V.; Kudyashev, Nail K.

    2016-01-01

    The research urgency is caused by the transition to the knowledge society and new demands for training and methodical provision of professional pedagogical education. The purpose of this paper is to develop practical recommendations to improve the quality of training and methodical support of professional pedagogical education. The leading…

  14. Best Practices in Teaching Statistics and Research Methods in the Behavioral Sciences [with CD-ROM

    ERIC Educational Resources Information Center

    Dunn, Dana S., Ed.; Smith, Randolph A., Ed.; Beins, Barney, Ed.

    2007-01-01

    This book provides a showcase for "best practices" in teaching statistics and research methods in two- and four-year colleges and universities. A helpful resource for teaching introductory, intermediate, and advanced statistics and/or methods, the book features coverage of: (1) ways to integrate these courses; (2) how to promote ethical conduct;…

  15. A Survey of Functional Behavior Assessment Methods Used by Behavior Analysts in Practice

    ERIC Educational Resources Information Center

    Oliver, Anthony C.; Pratt, Leigh A.; Normand, Matthew P.

    2015-01-01

    To gather information about the functional behavior assessment (FBA) methods behavior analysts use in practice, we sent a web-based survey to 12,431 behavior analysts certified by the Behavior Analyst Certification Board. Ultimately, 724 surveys were returned, with the results suggesting that most respondents regularly use FBA methods, especially…

  16. New Analysis Methods In Photon Correlation Spectroscopy

    NASA Astrophysics Data System (ADS)

    Nash, P. J.; King, T. A.

    1983-06-01

    This paper describes the analysis of photon correlation spectroscopy decay curves by a new method based on fitting sums of positive exponentials, the S-exponential sum fitting method. The method fits a positive exponential sum to a given data set, providing a best weighted least squares fit. No initial setting of any parameter is required, and the number of exponential coefficients does not have to be preset in the program but is determined by the number of components apparent above the noise level. Results are discussed for application to scattering systems, which may be single- or multiple-component. Systems generating single, double or multiple exponential decay functions, derived from computer simulation or photon correlation experiments, are considered, with fitting analysis at varying noise levels.
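The flavor of positive-exponential-sum fitting can be reproduced with a grid of candidate decay rates and a nonnegative least squares solve. Unlike the paper's method, the sketch below fixes the rate grid in advance and uses simple multiplicative updates to keep coefficients nonnegative:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic two-component decay with additive noise.
t = np.linspace(0, 5, 200)
data = 2.0 * np.exp(-1.0 * t) + 1.0 * np.exp(-4.0 * t)
data += 0.01 * rng.normal(size=t.size)

rates = np.logspace(-1, 1, 40)        # fixed grid of candidate decay rates
A = np.exp(-np.outer(t, rates))       # design matrix, all entries positive

# Multiplicative-update nonnegative least squares: coefficients stay >= 0,
# so the fitted curve is a sum of positive exponentials by construction.
b = np.clip(data, 1e-9, None)
x = np.ones(rates.size)
for _ in range(2000):
    x *= (A.T @ b) / (A.T @ (A @ x) + 1e-12)

fit = A @ x
rms = np.sqrt(np.mean((fit - data) ** 2))
print("residual RMS:", rms)
```

The number of grid coefficients left above the noise level plays the role of the component count the paper's method determines automatically.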

  17. Practical Performance Analysis for Multiple Information Fusion Based Scalable Localization System Using Wireless Sensor Networks.

    PubMed

    Zhao, Yubin; Li, Xiaofan; Zhang, Sha; Meng, Tianhui; Zhang, Yiwen

    2016-08-23

    In practical localization system design, researchers need to consider several aspects to make positioning efficient and effective, e.g., the available auxiliary information, sensing devices, equipment deployment and the environment. These practical concerns translate into technical problems, e.g., sequential position state propagation, the target-anchor geometry effect, non-line-of-sight (NLOS) identification and the related prior information. It is necessary to construct an efficient framework that can exploit multiple kinds of available information and guide the system design. In this paper, we propose a scalable method to analyze system performance based on the Cramér-Rao lower bound (CRLB), which can fuse all of the information adaptively. Firstly, we use an abstract function to represent the whole wireless localization system model. Then, the unknown vector of the CRLB consists of two parts: the first part is the estimated vector, and the second part is the auxiliary vector, which helps improve the estimation accuracy. Accordingly, the Fisher information matrix is divided into two parts: the state matrix and the auxiliary matrix. Unlike a purely theoretical analysis, our CRLB can serve as a practical fundamental limit for a system that fuses multiple kinds of information in a complicated environment, e.g., recursive Bayesian estimation based on the hidden Markov model, the map matching method and NLOS identification and mitigation methods. Thus, the theoretical results come closer to the real case. In addition, our method is more adaptable than other CRLBs when considering more unknown important factors. We use the proposed method to analyze a wireless sensor network-based indoor localization system. The influence of hybrid LOS/NLOS channels, building layout information and the relative height differences between the target and anchors are analyzed. It is demonstrated that our method exploits all of the available information for

  18. Practical Performance Analysis for Multiple Information Fusion Based Scalable Localization System Using Wireless Sensor Networks.

    PubMed

    Zhao, Yubin; Li, Xiaofan; Zhang, Sha; Meng, Tianhui; Zhang, Yiwen

    2016-01-01

    In practical localization system design, researchers need to consider several aspects to make positioning efficient and effective, e.g., the available auxiliary information, sensing devices, equipment deployment and the environment. These practical concerns translate into technical problems, e.g., sequential position state propagation, the target-anchor geometry effect, non-line-of-sight (NLOS) identification and the related prior information. It is necessary to construct an efficient framework that can exploit multiple kinds of available information and guide the system design. In this paper, we propose a scalable method to analyze system performance based on the Cramér-Rao lower bound (CRLB), which can fuse all of the information adaptively. Firstly, we use an abstract function to represent the whole wireless localization system model. Then, the unknown vector of the CRLB consists of two parts: the first part is the estimated vector, and the second part is the auxiliary vector, which helps improve the estimation accuracy. Accordingly, the Fisher information matrix is divided into two parts: the state matrix and the auxiliary matrix. Unlike a purely theoretical analysis, our CRLB can serve as a practical fundamental limit for a system that fuses multiple kinds of information in a complicated environment, e.g., recursive Bayesian estimation based on the hidden Markov model, the map matching method and NLOS identification and mitigation methods. Thus, the theoretical results come closer to the real case. In addition, our method is more adaptable than other CRLBs when considering more unknown important factors. We use the proposed method to analyze a wireless sensor network-based indoor localization system. The influence of hybrid LOS/NLOS channels, building layout information and the relative height differences between the target and anchors are analyzed. It is demonstrated that our method exploits all of the available information for

  19. Practical Performance Analysis for Multiple Information Fusion Based Scalable Localization System Using Wireless Sensor Networks

    PubMed Central

    Zhao, Yubin; Li, Xiaofan; Zhang, Sha; Meng, Tianhui; Zhang, Yiwen

    2016-01-01

    In practical localization system design, researchers need to consider several aspects to make positioning efficient and effective, e.g., the available auxiliary information, sensing devices, equipment deployment and the environment. These practical concerns translate into technical problems, e.g., sequential position state propagation, the target-anchor geometry effect, non-line-of-sight (NLOS) identification and the related prior information. It is necessary to construct an efficient framework that can exploit multiple kinds of available information and guide the system design. In this paper, we propose a scalable method to analyze system performance based on the Cramér–Rao lower bound (CRLB), which can fuse all of the information adaptively. Firstly, we use an abstract function to represent the whole wireless localization system model. Then, the unknown vector of the CRLB consists of two parts: the first part is the estimated vector, and the second part is the auxiliary vector, which helps improve the estimation accuracy. Accordingly, the Fisher information matrix is divided into two parts: the state matrix and the auxiliary matrix. Unlike a purely theoretical analysis, our CRLB can serve as a practical fundamental limit for a system that fuses multiple kinds of information in a complicated environment, e.g., recursive Bayesian estimation based on the hidden Markov model, the map matching method and NLOS identification and mitigation methods. Thus, the theoretical results come closer to the real case. In addition, our method is more adaptable than other CRLBs when considering more unknown important factors. We use the proposed method to analyze a wireless sensor network-based indoor localization system. The influence of hybrid LOS/NLOS channels, building layout information and the relative height differences between the target and anchors are analyzed. It is demonstrated that our method exploits all of the available information for
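The paper's partition of the unknown vector into an estimated part and an auxiliary part can be illustrated for range-based localization, taking 2-D position as the estimated vector and a common range bias as a stand-in auxiliary parameter (the bias, geometry and noise level below are our illustrative choices, not the paper's):

```python
import numpy as np

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
target = np.array([3.0, 4.0])
sigma = 0.5                     # assumed Gaussian range-noise std dev

# Jacobian of the range measurements w.r.t. [x, y, bias]: unit vectors
# toward the target for the position part, ones for the common bias.
diff = target - anchors
ranges = np.linalg.norm(diff, axis=1)
J = np.column_stack([diff / ranges[:, None], np.ones(len(anchors))])

fim = J.T @ J / sigma**2        # Fisher information for the stacked vector
crlb = np.linalg.inv(fim)       # CRLB; top-left 2x2 block covers position
pos_rmse_bound = np.sqrt(np.trace(crlb[:2, :2]))
print("position RMSE lower bound:", pos_rmse_bound)
```

The dependence of the bound on the target-anchor geometry mentioned in the abstract falls out directly: moving the target changes the unit vectors in J and hence the position block of the inverse FIM.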

  20. [Principles and methods of good practice for the translation process for instruments of nursing research and nursing practice].

    PubMed

    Martin, Jacqueline S; Vincenzi, Christine; Spirig, Rebecca

    2007-06-01

    Cross-cultural, valid and reliable instruments are increasingly used in nursing, yet their attainment is labour-intensive. The translation of valid instruments into another language and cultural context is the most common method used for generating cross-cultural instruments. This approach is challenging since the development of culturally equivalent translated instruments demands familiarity with basic requirements of linguistic adaptation, cultural concepts and psychometric changes inherent in the translation process. However, the quality of data derived from translated instruments relies on the accuracy of the translation process applied. The aim of this article is to illustrate the different methods for translation, as well as to present an example and principles of good practice regarding this subject.

  1. Improving educational environment in medical colleges through transactional analysis practice of teachers

    PubMed Central

    Rajan, Marina

    2012-01-01

    Context: A FAIMER (Foundation for Advancement in International Medical Education and Research) fellow organized a comprehensive faculty development program to improve faculty awareness, resulting in changed teaching practices and better teacher-student relationships, using Transactional Analysis (TA). Practicing TA tools helps development of ‘awareness’ about intrapersonal and interpersonal processes. Objectives: To improve self-awareness among medical educators. To bring about self-directed change in practices among medical educators. To assess usefulness of TA tools for the same. Methods: An experienced trainer conducted a basic course (12 hours) in TA for faculty members. The PAC model of personality structure, the functional fluency model of personal functioning, stroke theory on motivation, and the passivity and script theories of adult functional styles were taught experientially with examples from the medical education scenario. Self-reported improvement in awareness and changes in practices were assessed immediately after, at three months, and one year after training. Findings: The mean improvement in self-'awareness' is 13.3% (95% C.I. 9.3-17.2) among nineteen participants. This persists one year after training. Changes in practices within a year include collecting feedback, new teaching styles and better relationships with students. Discussion and Conclusions: These findings demonstrate sustainable and measurable improvement in self-awareness through practice of TA tools. Improvement in self-'awareness' of faculty resulted in self-directed changes in teaching practices. Medical faculty have judged the TA tools effective for improving self-awareness leading to self-directed changes. PMID:24358808

  2. Consumer food safety knowledge, practices, and demographic differences: findings from a meta-analysis.

    PubMed

    Patil, Sumeet R; Cates, Sheryl; Morales, Roberta

    2005-09-01

    Risk communication and consumer education to promote safer handling of food can be the best way of managing the risk of foodborne illness at the consumer end of the food chain. Thus, an understanding of the overall status of food handling knowledge and practices is needed. Although traditional qualitative reviews can be used for combining information from several studies on specific food handling behaviors, a structured approach of meta-analysis can be more advantageous in a holistic assessment. We combined findings from 20 studies using meta-analysis methods to estimate percentages of consumers engaging in risky behaviors, such as consumption of raw food, poor hygiene, and cross-contamination, separated by various demographic categories. We estimated standard errors to reflect sampling error and between-study random variation. Then we evaluated the statistical significance of differences in behaviors across demographic categories and across behavioral measures. There were considerable differences in behaviors across demographic categories, possibly because of socioeconomic and cultural differences. For example, compared with women, men reported greater consumption of raw or undercooked foods, poorer hygiene, poorer practices to prevent cross-contamination, and less safe defrosting practices. Mid-age adults consumed more raw food (except milk) than did young adults and seniors. High-income individuals reported greater consumption of raw foods, less knowledge of hygiene, and poorer cross-contamination practices. The highest raw ground beef and egg consumption and the poorest hygiene and cross-contamination practices were found in the U.S. Mountain region. Meta-analysis was useful for identifying important data gaps and demographic groups with risky behaviors, and this information can be used to prioritize further research.
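    The pooling step of such a meta-analysis can be sketched with the standard DerSimonian-Laird random-effects estimator. The study proportions and sample sizes below are invented for illustration and are not the paper's data:

    ```python
    import numpy as np

    # Invented example data: proportion of consumers reporting a risky
    # behavior in five studies, with each study's sample size.
    p = np.array([0.32, 0.41, 0.28, 0.37, 0.45])
    n = np.array([250, 400, 150, 600, 320])

    var = p * (1 - p) / n                 # within-study sampling variance
    w = 1.0 / var                         # fixed-effect (inverse-variance) weights
    p_fe = np.sum(w * p) / np.sum(w)

    # DerSimonian-Laird estimate of the between-study variance tau^2
    Q = np.sum(w * (p - p_fe) ** 2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(p) - 1)) / c)

    w_re = 1.0 / (var + tau2)             # random-effects weights
    p_re = np.sum(w_re * p) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    print(f"pooled proportion {p_re:.3f}, 95% CI +/- {1.96 * se_re:.3f}")
    ```

    The between-study variance `tau2` is what distinguishes this from a fixed-effect pool: it widens the standard error to reflect the real heterogeneity across studies that the abstract describes.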

  3. Power System Transient Stability Analysis through a Homotopy Analysis Method

    SciTech Connect

    Wang, Shaobu; Du, Pengwei; Zhou, Ning

    2014-04-01

    As an important function of energy management systems (EMSs), online contingency analysis plays an important role in providing power system security warnings of instability. At present, N-1 contingency analysis still relies on time-consuming numerical integration. To save computational cost, the paper proposes a quasi-analytical method to evaluate transient stability through the sensitivities of the frequencies of time-domain periodic solutions with respect to initial values. First, dynamic systems described in classical models are modified into damping-free systems whose solutions are either periodic or expanded (non-convergent). Second, because the sensitivities experience sharp changes when periodic solutions vanish and turn into expanded solutions, transient stability is assessed using the sensitivity. Third, homotopy analysis is introduced to extract frequency information and evaluate the sensitivities from initial values alone, so that time-consuming numerical integration is avoided. Finally, a simple case is presented to demonstrate application of the proposed method, and simulation results show that the proposed method is promising.

  4. Cask crush pad analysis using detailed and simplified analysis methods

    SciTech Connect

    Uldrich, E.D.; Hawkes, B.D.

    1997-12-31

    A crush pad has been designed and analyzed to absorb the kinetic energy of a hypothetically dropped spent nuclear fuel shipping cask into a 44-ft.-deep cask unloading pool at the Fluorinel and Storage Facility (FAST). This facility, located at the Idaho Chemical Processing Plant (ICPP) at the Idaho National Engineering and Environmental Laboratory (INEEL), is a US Department of Energy site. The basis for this study is an analysis by Uldrich and Hawkes. The purpose of this analysis was to evaluate various hypothetical cask drop orientations to ensure that the crush pad design was adequate and the cask deceleration at impact was less than 100 g. It is demonstrated herein that a large spent fuel shipping cask, when dropped onto a foam crush pad, can be analyzed by either hand methods or by sophisticated dynamic finite element analysis using computer codes such as ABAQUS. Results from the two methods are compared to evaluate the accuracy of the simplified hand analysis approach.

  5. Structural sensitivity analysis: Methods, applications and needs

    NASA Technical Reports Server (NTRS)

    Adelman, H. M.; Haftka, R. T.; Camarda, C. J.; Walsh, J. L.

    1984-01-01

    Innovative techniques applicable to sensitivity analysis of discretized structural systems are reviewed. The techniques include a finite difference step size selection algorithm, a method for derivatives of iterative solutions, a Green's function technique for derivatives of transient response, simultaneous calculation of temperatures and their derivatives, derivatives with respect to shape, and derivatives of optimum designs with respect to problem parameters. Computerized implementations of sensitivity analysis and applications of sensitivity derivatives are also discussed. Some of the critical needs in the structural sensitivity area are indicated along with plans for dealing with some of those needs.

  6. Structural sensitivity analysis: Methods, applications, and needs

    NASA Technical Reports Server (NTRS)

    Adelman, H. M.; Haftka, R. T.; Camarda, C. J.; Walsh, J. L.

    1984-01-01

    Some innovative techniques applicable to sensitivity analysis of discretized structural systems are reviewed. These techniques include a finite-difference step-size selection algorithm, a method for derivatives of iterative solutions, a Green's function technique for derivatives of transient response, a simultaneous calculation of temperatures and their derivatives, derivatives with respect to shape, and derivatives of optimum designs with respect to problem parameters. Computerized implementations of sensitivity analysis and applications of sensitivity derivatives are also discussed. Finally, some of the critical needs in the structural sensitivity area are indicated along with Langley plans for dealing with some of these needs.
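    A minimal illustration of the step-size issue that motivates the selection algorithm mentioned above. The "structural response" here is a stand-in analytic function (an assumption for demonstration, not a real structural model), chosen so the exact derivative is known:

    ```python
    import numpy as np

    # Central-difference sensitivity with a step-size sweep: too large a step
    # gives truncation error, too small gives floating-point round-off error.
    def response(x):
        # stand-in "structural response", e.g. a deflection vs. a design variable
        return np.exp(x) * np.sin(x)

    def d_response_exact(x):
        return np.exp(x) * (np.sin(x) + np.cos(x))

    x0 = 1.0
    steps = 10.0 ** np.arange(-1, -13, -1)
    errors = []
    for h in steps:
        d_fd = (response(x0 + h) - response(x0 - h)) / (2 * h)  # central difference
        errors.append(abs(d_fd - d_response_exact(x0)))

    best = steps[int(np.argmin(errors))]
    print(f"best step ~ {best:g}, error {min(errors):.2e}")
    ```

    The error curve is U-shaped in the step size, which is why an automatic step-size selection algorithm is worth having when the exact derivative is not available to check against.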

  7. Advanced Analysis Methods in High Energy Physics

    SciTech Connect

    Pushpalatha C. Bhat

    2001-10-03

    During the coming decade, high energy physics experiments at the Fermilab Tevatron and around the globe will use very sophisticated equipment to record unprecedented amounts of data in the hope of making major discoveries that may unravel some of Nature's deepest mysteries. The discovery of the Higgs boson and signals of new physics may be around the corner. The use of advanced analysis techniques will be crucial in achieving these goals. The author discusses some of the novel methods of analysis that could prove to be particularly valuable for finding evidence of any new physics, for improving precision measurements and for exploring parameter spaces of theoretical models.

  8. New Regularization Method for EXAFS Analysis

    NASA Astrophysics Data System (ADS)

    Reich, Tatiana Ye.; Korshunov, Maxim E.; Antonova, Tatiana V.; Ageev, Alexander L.; Moll, Henry; Reich, Tobias

    2007-02-01

    As an alternative to the analysis of EXAFS spectra by conventional shell fitting, the Tikhonov regularization method has been proposed. An improved algorithm that utilizes a priori information about the sample has been developed and applied to the analysis of U L3-edge spectra of soddyite, (UO2)2SiO4·2H2O, and of U(VI) sorbed onto kaolinite. The partial radial distribution functions g1(U-U), g2(U-Si), and g3(U-O) of soddyite agree with crystallographic values and previous EXAFS results.

  9. New Regularization Method for EXAFS Analysis

    SciTech Connect

    Reich, Tatiana Ye.; Reich, Tobias; Korshunov, Maxim E.; Antonova, Tatiana V.; Ageev, Alexander L.; Moll, Henry

    2007-02-02

    As an alternative to the analysis of EXAFS spectra by conventional shell fitting, the Tikhonov regularization method has been proposed. An improved algorithm that utilizes a priori information about the sample has been developed and applied to the analysis of U L3-edge spectra of soddyite, (UO2)2SiO4·2H2O, and of U(VI) sorbed onto kaolinite. The partial radial distribution functions g1(U-U), g2(U-Si), and g3(U-O) of soddyite agree with crystallographic values and previous EXAFS results.
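    Generic Tikhonov regularization of an ill-posed linear inversion can be sketched as follows. This is the textbook form of the method, not the authors' improved algorithm; the Gaussian blur operator, noise level, and regularization weight are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 40
    x = np.linspace(0.0, 1.0, n)
    # Gaussian blur operator: a classic ill-posed smoothing problem
    A = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 0.1**2))
    A /= A.sum(axis=1, keepdims=True)
    g_true = np.sin(np.pi * x)                          # "radial distribution" to recover
    chi = A @ g_true + 1e-3 * rng.standard_normal(n)    # noisy measured "spectrum"

    def tikhonov(A, b, alpha):
        # minimize ||A g - b||^2 + alpha ||g||^2 via the augmented least squares
        m = A.shape[1]
        A_aug = np.vstack([A, np.sqrt(alpha) * np.eye(m)])
        b_aug = np.concatenate([b, np.zeros(m)])
        return np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]

    g_naive = np.linalg.lstsq(A, chi, rcond=None)[0]    # unregularized inversion
    g_reg = tikhonov(A, chi, alpha=1e-4)
    err_naive = np.linalg.norm(g_naive - g_true)
    err_reg = np.linalg.norm(g_reg - g_true)
    print(f"naive error {err_naive:.3g}, regularized error {err_reg:.3g}")
    ```

    The penalty term damps the noise-amplifying small singular values of the operator; the a priori information mentioned in the abstract would enter as additional constraints on `g`.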

  10. Probabilistic Computational Methods in Structural Failure Analysis

    NASA Astrophysics Data System (ADS)

    Krejsa, Martin; Kralik, Juraj

    2015-12-01

    Probabilistic methods are used in engineering where a computational model contains random variables. Each random variable in the probabilistic calculations carries uncertainty. Typical sources of uncertainty are the properties of the material and production and/or assembly inaccuracies in the geometry or the environment where the structure is located. The paper is focused on methods for the calculation of failure probabilities in structural failure and reliability analysis, with special attention to a newly developed probabilistic method: Direct Optimized Probabilistic Calculation (DOProC), which is highly efficient in terms of calculation time and the accuracy of the solution. The novelty of the proposed method lies in an optimized numerical integration that does not require any simulation technique. The algorithm has been implemented in software applications and has been used several times in probabilistic tasks and probabilistic reliability assessments.
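    The underlying failure-probability computation can be sketched by direct numerical integration of P(R − S < 0) for a normally distributed resistance R and load effect S. The parameters are illustrative assumptions; DOProC itself performs a more elaborate optimized integration over discretized distributions:

    ```python
    import numpy as np
    from math import erf, sqrt

    # Illustrative limit state g = R - S with independent normal R and S
    mu_R, sd_R = 100.0, 10.0         # resistance
    mu_S, sd_S = 70.0, 12.0          # load effect

    def norm_pdf(z, mu, sd):
        return np.exp(-0.5 * ((z - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

    def norm_cdf(z, mu, sd):
        return 0.5 * (1.0 + erf((z - mu) / (sd * sqrt(2.0))))

    # P(failure) = integral of f_S(s) * P(R < s) ds, on a trapezoid grid
    s = np.linspace(mu_S - 8 * sd_S, mu_S + 8 * sd_S, 4001)
    integrand = norm_pdf(s, mu_S, sd_S) * np.array([norm_cdf(v, mu_R, sd_R) for v in s])
    pf_numeric = float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(s)))

    # closed form for the normal-normal case, used as a check
    pf_exact = norm_cdf(0.0, mu_R - mu_S, sqrt(sd_R**2 + sd_S**2))
    print(f"P_f numeric {pf_numeric:.6f}, exact {pf_exact:.6f}")
    ```

    No sampling is involved: the failure probability comes straight from quadrature, which is the simulation-free character the abstract attributes to the method.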

  11. Graphical methods for the sensitivity analysis in discriminant analysis

    DOE PAGESBeta

    Kim, Youngil; Anderson-Cook, Christine M.; Dae-Heung, Jang

    2015-09-30

    Similar to regression, many measures to detect influential data points in discriminant analysis have been developed. Many follow similar principles as the diagnostic measures used in linear regression in the context of discriminant analysis. Here we focus on the impact on the predicted classification posterior probability when a data point is omitted. The new method is intuitive and easily interpretative compared to existing methods. We also propose a graphical display to show the individual movement of the posterior probability of other data points when a specific data point is omitted. This enables the summaries to capture the overall pattern of the change.

  12. Graphical methods for the sensitivity analysis in discriminant analysis

    SciTech Connect

    Kim, Youngil; Anderson-Cook, Christine M.; Dae-Heung, Jang

    2015-09-30

    Similar to regression, many measures to detect influential data points in discriminant analysis have been developed. Many follow similar principles as the diagnostic measures used in linear regression in the context of discriminant analysis. Here we focus on the impact on the predicted classification posterior probability when a data point is omitted. The new method is intuitive and easily interpretative compared to existing methods. We also propose a graphical display to show the individual movement of the posterior probability of other data points when a specific data point is omitted. This enables the summaries to capture the overall pattern of the change.
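    The leave-one-out influence idea can be sketched with a plain two-class Gaussian discriminant on synthetic data. This is a generic illustration of omitting a point and watching every posterior move, not the authors' implementation or graphical display:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal([0.0, 0.0], 1.0, (30, 2)),
                   rng.normal([2.5, 2.5], 1.0, (30, 2))])
    y = np.repeat([0, 1], 30)

    def posterior(Xq, Xtr, ytr):
        # LDA posterior P(class 1 | x): pooled covariance, empirical priors
        mus = [Xtr[ytr == k].mean(axis=0) for k in (0, 1)]
        centered = np.vstack([Xtr[ytr == k] - mus[k] for k in (0, 1)])
        icov = np.linalg.inv(centered.T @ centered / (len(Xtr) - 2))
        logps = []
        for k in (0, 1):
            d = Xq - mus[k]
            logps.append(np.log(np.mean(ytr == k))
                         - 0.5 * np.einsum('ij,jk,ik->i', d, icov, d))
        logps = np.array(logps)
        logps -= logps.max(axis=0)
        p = np.exp(logps)
        return p[1] / p.sum(axis=0)

    p_full = posterior(X, X, y)

    # influence of omitting observation i on every point's posterior
    i = 0
    mask = np.arange(len(X)) != i
    p_loo = posterior(X, X[mask], y[mask])
    influence = np.abs(p_loo - p_full)
    print(f"max posterior shift when point {i} is omitted: {influence.max():.4f}")
    ```

    Repeating the omission for each `i` and plotting the per-point shifts gives the kind of display the abstract proposes, with large shifts flagging influential observations.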

  13. Degradation of learned skills: Effectiveness of practice methods on visual approach and landing skill retention

    NASA Technical Reports Server (NTRS)

    Sitterley, T. E.; Zaitzeff, L. P.; Berge, W. A.

    1972-01-01

    Flight control and procedural task skill degradation, and the effectiveness of retraining methods were evaluated for a simulated space vehicle approach and landing under instrument and visual flight conditions. Fifteen experienced pilots were trained and then tested after 4 months either without the benefits of practice or with static rehearsal, dynamic rehearsal or with dynamic warmup practice. Performance on both the flight control and procedure tasks degraded significantly after 4 months. The rehearsal methods effectively countered procedure task skill degradation, while dynamic rehearsal or a combination of static rehearsal and dynamic warmup practice was required for the flight control tasks. The quality of the retraining methods appeared to be primarily dependent on the efficiency of visual cue reinforcement.

  14. Understanding Online Teacher Best Practices: A Thematic Analysis to Improve Learning

    ERIC Educational Resources Information Center

    Corry, Michael; Ianacone, Robert; Stella, Julie

    2014-01-01

    The purpose of this study was to examine brick-and-mortar and online teacher best practice themes using thematic analysis and a newly developed theory-based analytic process entitled Synthesized Thematic Analysis Criteria (STAC). The STAC was developed to facilitate the meaningful thematic analysis of research based best practices of K-12…

  15. Spectroscopic Chemical Analysis Methods and Apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F.; Reid, Ray D.

    2012-01-01

    This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and the ideal wavelengths and sources needed for this analysis. It employs deep ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three achieved goals for this innovation are to reduce the size (under 20 L), the weight [under 100 lb (45 kg)], and the power consumption (under 100 W). This method can be used in a microscope or macroscope to provide measurement of Raman and/or native fluorescence emission spectra either by point-by-point measurement, or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations, the method is to produce incoherent radiation, while in other implementations it produces laser radiation. In some variations, this object is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. This instrument irradiates a sample with deep UV radiation, and then uses an improved filter for separating wavelengths to be detected. This provides a multi-stage analysis of the sample. To avoid the difficulties related to producing deep UV semiconductor sources, a pumping approach has been developed that uses

  16. Comparison of analysis methods for airway quantification

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.

    2012-03-01

    Diseased airways have been known for several years as a possible contributing factor to airflow limitation in Chronic Obstructive Pulmonary Disease (COPD). Quantification of disease severity through the evaluation of airway dimensions - wall thickness and lumen diameter - has gained increased attention, thanks to the availability of multi-slice computed tomography (CT). Novel approaches have focused on automated methods of measurement as a faster and more objective means than the visual assessment routinely employed in the clinic. Since the Full-Width Half-Maximum (FWHM) method of airway measurement was introduced two decades ago [1], several new techniques for quantifying airways have been detailed in the literature, but no approach has truly become a standard for such analysis. Our own research group has presented two alternative approaches for determining airway dimensions, one involving a minimum path and the other active contours [2, 3]. With an increasing number of techniques dedicated to the same goal, we decided to take a step back and analyze the differences among these methods. We consequently put to the test our two methods of analysis and the FWHM approach. We first measured a set of 5 airways from a phantom of known dimensions. Then we compared measurements from the three methods to those of two independent readers, performed on 35 airways in 5 patients. We elaborate on the differences of each approach and suggest conclusions on which could be defined as the best one.
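    The FWHM principle referenced above can be sketched on a one-dimensional synthetic intensity profile. The Gaussian profile and its width are illustrative assumptions; real CT airway profiles are noisier and partial-volume-blurred, which is what motivates the alternative methods compared in the paper:

    ```python
    import numpy as np

    # Synthetic 1D intensity profile across a structure (Gaussian, sigma = 1.2)
    x = np.linspace(-5.0, 5.0, 201)
    profile = np.exp(-x**2 / (2 * 1.2**2))

    half = profile.max() / 2.0
    above = np.where(profile >= half)[0]
    i0, i1 = above[0], above[-1]

    def cross(i_lo, i_hi):
        # linear interpolation of the half-maximum crossing between two samples
        x0, x1 = x[i_lo], x[i_hi]
        y0, y1 = profile[i_lo], profile[i_hi]
        return x0 + (half - y0) * (x1 - x0) / (y1 - y0)

    fwhm = cross(i1, i1 + 1) - cross(i0 - 1, i0)
    print(f"measured FWHM {fwhm:.3f}, theory {2 * np.sqrt(2 * np.log(2)) * 1.2:.3f}")
    ```

    For a Gaussian the FWHM is 2·sqrt(2·ln 2)·sigma, so the interpolated measurement can be checked against a known answer before applying the same logic to noisy image data.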

  17. Measurement methods for human exposure analysis.

    PubMed Central

    Lioy, P J

    1995-01-01

    The general methods used to complete measurements of human exposures are identified and illustrations are provided for the cases of indirect and direct methods used for exposure analysis. The application of the techniques for external measurements of exposure, microenvironmental and personal monitors, are placed in the context of the need to test hypotheses concerning the biological effects of concern. The linkage of external measurements to measurements made in biological fluids is explored for a suite of contaminants. This information is placed in the context of the scientific framework used to conduct exposure assessment. Examples are taken from research on volatile organics and for a large scale problem: hazardous waste sites. PMID:7635110

  18. Text analysis devices, articles of manufacture, and text analysis methods

    DOEpatents

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

    2013-05-28

    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.

  19. Integrated Data Collection Analysis (IDCA) Program - SSST Testing Methods

    SciTech Connect

    Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.; Pollard, Colin J.; Warner, Kirstin F.; Remmers, Daniel L.; Sorensen, Daniel N.; Whinnery, LeRoy L.; Phillips, Jason J.; Shelley, Timothy J.; Reyes, Jose A.; Hsu, Peter C.; Reynolds, John G.

    2013-03-25

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the methods used for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis during the IDCA program. These methods changed throughout the Proficiency Test, and the reasons for these changes are documented in this report. The most significant modifications to standard testing methods are: 1) including one specified sandpaper in impact testing among all the participants, 2) diversifying liquid test methods for selected participants, and 3) including sealed sample holders for thermal testing by at least one participant. This effort, funded by the Department of Homeland Security (DHS), is putting the issues of safe handling of these materials in perspective with standard military explosives. The study is adding SSST testing results for a broad suite of different HMEs to the literature. Ultimately the study will suggest new guidelines and methods and possibly establish the SSST testing accuracies needed to develop safe handling practices for HMEs. Each participating testing laboratory uses identical test materials and preparation methods wherever possible. The testing performers involved are Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Indian Head Division, Naval Surface Warfare Center (NSWC IHD), Sandia National Laboratories (SNL), and Air Force Research Laboratory (AFRL/RXQL). These tests are conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to compare results when these testing variables cannot be made consistent.

  20. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    SciTech Connect

    Ronald Laurids Boring

    2010-11-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  1. Digital dream analysis: a revised method.

    PubMed

    Bulkeley, Kelly

    2014-10-01

    This article demonstrates the use of a digital word search method designed to provide greater accuracy, objectivity, and speed in the study of dreams. A revised template of 40 word search categories, built into the website of the Sleep and Dream Database (SDDb), is applied to four "classic" sets of dreams: The male and female "Norm" dreams of Hall and Van de Castle (1966), the "Engine Man" dreams discussed by Hobson (1988), and the "Barb Sanders Baseline 250" dreams examined by Domhoff (2003). A word search analysis of these original dream reports shows that a digital approach can accurately identify many of the same distinctive patterns of content found by previous investigators using much more laborious and time-consuming methods. The results of this study emphasize the compatibility of word search technologies with traditional approaches to dream content analysis.
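    The word-search idea can be sketched with a few invented categories and reports. Everything below is illustrative; the real SDDb template has 40 categories with much larger vocabularies:

    ```python
    import re
    from collections import Counter

    # Toy category word lists (the SDDb uses 40 curated categories)
    categories = {
        "emotion": {"afraid", "happy", "angry", "sad", "scared"},
        "movement": {"running", "flying", "falling", "walking"},
        "characters": {"mother", "father", "friend", "stranger"},
    }

    # Invented dream reports for illustration
    reports = [
        "I was running from a stranger and felt so scared.",
        "My mother and I were flying over the city, I was happy.",
    ]

    counts = Counter()
    for report in reports:
        words = re.findall(r"[a-z']+", report.lower())
        for cat, vocab in categories.items():
            counts[cat] += sum(w in vocab for w in words)

    print(dict(counts))
    ```

    Dividing each category count by the total word count per report then yields the kind of normalized content frequencies that can be compared against Hall and Van de Castle norms.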

  2. Quantitative gold nanoparticle analysis methods: A review.

    PubMed

    Yu, Lei; Andriola, Angelo

    2010-08-15

    Research and development in the area of gold nanoparticles' (AuNPs) preparation, characterization, and applications have burgeoned in recent years. Many of the techniques and protocols are very mature, but two major concerns remain with the mass domestic production and consumption of AuNP-based products. First, how many AuNPs exist in a dispersion? Second, where are the AuNPs after digestion by the environment, and how many are there? To answer these two questions, reliable and reproducible methods are needed to analyze the existence and the population of AuNPs in samples. This review summarizes the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in number of moles of gold per liter) or population (in number of particles per mL) of AuNPs. The methods summarized in this review include mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count the number of AuNPs directly or analyze the total concentration of elemental gold in an AuNP dispersion.
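    The two quantities the review contrasts, total gold concentration (mol Au/L) and particle population (particles/mL), are linked by simple geometry for monodisperse spheres. The sketch below assumes an idealized uniform particle size; real dispersions are polydisperse, which is part of why dedicated counting methods exist:

    ```python
    import math

    d_nm = 20.0                      # assumed particle diameter, nm
    rho = 19.3                       # bulk gold density, g/cm^3
    M_au = 196.97                    # molar mass of gold, g/mol
    N_A = 6.022e23                   # Avogadro's number

    v_cm3 = (math.pi / 6) * (d_nm * 1e-7) ** 3      # sphere volume in cm^3
    atoms_per_np = rho * v_cm3 / M_au * N_A         # gold atoms per particle

    c_au = 1e-4                                     # assumed mol Au per liter
    particles_per_mL = c_au * N_A / atoms_per_np / 1000.0
    print(f"{atoms_per_np:.3g} atoms/particle, {particles_per_mL:.3g} particles/mL")
    ```

    A 20-nm AuNP contains roughly a quarter of a million gold atoms, so an elemental-gold assay and a particle count differ by about five orders of magnitude in the number they report.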

  3. Provider practice models in ambulatory oncology practice: analysis of productivity, revenue, and provider and patient satisfaction.

    PubMed

    Buswell, Lori A; Ponte, Patricia Reid; Shulman, Lawrence N

    2009-07-01

    Physicians, nurse practitioners, and physician assistants often work in teams to deliver cancer care in ambulatory oncology practices. This is likely to become more prevalent as the demand for oncology services rises, and the number of providers increases only slightly.

  4. Impoverishment of practice: analysis of effects of economic discourses in home care case management practice.

    PubMed

    Ceci, Christine

    2006-03-01

    Home care is a health sector under increasing pressure. Demand is often said to be outstripping capacity, with constant change and retrenchment distinguishing features of the current context. This paper takes a reading of the current conditions of home care using data gathered during a field study of home care case management practices conducted in 2004. As economic discourses become increasingly influential in determining responses to client situations, case managers (and their managers) find themselves with limited capacity to exercise control over their practices. A growing gap between professionally influenced discourses--those presumably intended to guide practice--and organizational priorities creates a dissonance for case managers as the political-ethical dimensions of their practices are displaced by budget "realities." For front-line workers, such displacement cannot be sustained in their face-to-face encounters with clients, leading to a growing sense of frustration and powerlessness among these highly skilled practitioners.

  5. Finite Volume Methods: Foundation and Analysis

    NASA Technical Reports Server (NTRS)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semi-conductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the obtention of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
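    A minimal instance of the ingredients named above, a locally conservative update with a monotone (local Lax-Friedrichs) numerical flux, can be sketched for Burgers' equation; the grid, initial data, and CFL number are illustrative choices:

    ```python
    import numpy as np

    # First-order finite volume scheme for u_t + (u^2/2)_x = 0
    N, L, T = 200, 1.0, 0.3
    dx = L / N
    x = (np.arange(N) + 0.5) * dx
    u = np.where(x < 0.5, 1.0, 0.0)       # Riemann data -> right-moving shock

    def flux(u):
        return 0.5 * u ** 2

    t = 0.0
    while t < T - 1e-12:
        a = np.abs(u).max() + 1e-12       # local wave-speed bound
        dt = min(0.9 * dx / a, T - t)     # CFL-limited time step
        ue = np.concatenate(([u[0]], u, [u[-1]]))   # extend with boundary states
        # local Lax-Friedrichs flux at each of the N+1 cell interfaces
        F = 0.5 * (flux(ue[:-1]) + flux(ue[1:])) - 0.5 * a * (ue[1:] - ue[:-1])
        u = u - dt / dx * (F[1:] - F[:-1])          # conservative update
        t += dt

    # Rankine-Hugoniot: the shock from u=1 into u=0 travels at speed 1/2
    shock_pos = x[np.argmin(np.abs(u - 0.5))]
    print(f"shock near x = {shock_pos:.3f} (expected ~{0.5 + 0.5 * T:.2f})")
    ```

    Because each cell update is a flux difference, the scheme is conservative by construction, and the monotone flux keeps the solution within the bounds of the initial data, a small-scale instance of the discrete maximum principle discussed in the article.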

  6. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)

    2013-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  7. Probabilistic methods in fire-risk analysis

    SciTech Connect

    Brandyberry, M.D.

    1989-01-01

    The first part of this work outlines a method for assessing the frequency of ignition of a consumer product in a building and shows how the method would be used in an example scenario utilizing upholstered furniture as the product and radiant auxiliary heating devices (electric heaters, wood stoves) as the ignition source. Deterministic thermal models of the heat-transport processes are coupled with parameter uncertainty analysis of the models and with a probabilistic analysis of the events involved in a typical scenario. This leads to a distribution for the frequency of ignition for the product. In the second part, fire-risk analysis as currently used in nuclear plants is outlined, along with a discussion of the relevant uncertainties. The use of the computer code COMPBRN in the fire-growth analysis is discussed, along with the use of response-surface methodology to quantify uncertainties in the code's use. Generalized response surfaces are developed for temperature versus time for a cable tray, as well as a surface for the hot gas layer temperature and depth for a room of arbitrary geometry within a typical nuclear power plant compartment. These surfaces are then used to simulate the cable tray damage time in a compartment fire experiment.

  8. Method of analysis and quality-assurance practices for determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry at the U.S. Geological Survey California District Organic Chemistry Laboratory, 1996-99

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Baker, Lucian M.; Kuivila, Kathryn M.

    2000-01-01

    A method of analysis and quality-assurance practices were developed to study the fate and transport of pesticides in the San Francisco Bay-Estuary by the U.S. Geological Survey. Water samples were filtered to remove suspended-particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide and the pesticides were eluted with three cartridge volumes of hexane:diethyl ether (1:1) solution. The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for pesticides ranged from 0.002 to 0.025 microgram per liter for 1-liter samples. Recoveries ranged from 44 to 140 percent for 25 pesticides in samples of organic-free reagent water and Sacramento-San Joaquin Delta and Suisun Bay water fortified at 0.05 and 0.50 microgram per liter. The estimated holding time for pesticides after extraction on C-8 solid-phase extraction cartridges ranged from 10 to 257 days.
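
    The recovery figures quoted above are computed as the measured concentration of a fortified (spiked) sample relative to the known spike level. A minimal sketch of that quality-assurance calculation (the numbers below are hypothetical, not data from the study):

```python
def percent_recovery(measured_ug_per_l, spiked_ug_per_l):
    """Recovery of a fortified (spiked) sample: measured / spiked * 100 %."""
    return 100.0 * measured_ug_per_l / spiked_ug_per_l

# Hypothetical result: 0.044 ug/L measured in a sample fortified at 0.05 ug/L.
print(round(percent_recovery(0.044, 0.05), 1))  # 88.0
```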

  9. Translating Evidence Into Practice via Social Media: A Mixed-Methods Study

    PubMed Central

    Tunnecliff, Jacqueline; Morgan, Prue; Gaida, Jamie E; Clearihan, Lyn; Sadasivan, Sivalal; Davies, David; Ganesh, Shankar; Mohanty, Patitapaban; Weiner, John; Reynolds, John; Ilic, Dragan

    2015-01-01

    Background Approximately 80% of research evidence relevant to clinical practice never reaches the clinicians delivering patient care. A key barrier for the translation of evidence into practice is the limited time and skills clinicians have to find and appraise emerging evidence. Social media may provide a bridge between health researchers and health service providers. Objective The aim of this study was to determine the efficacy of social media as an educational medium to effectively translate emerging research evidence into clinical practice. Methods The study used a mixed-methods approach. Evidence-based practice points were delivered via social media platforms. The primary outcomes of attitude, knowledge, and behavior change were assessed using a preintervention/postintervention evaluation, with qualitative data gathered to contextualize the findings. Results Data were obtained from 317 clinicians from multiple health disciplines, predominantly from the United Kingdom, Australia, the United States, India, and Malaysia. The participants reported an overall improvement in attitudes toward social media for professional development (P<.001). The knowledge evaluation demonstrated a significant increase in knowledge after the training (P<.001). The majority of respondents (136/194, 70.1%) indicated that the education they had received via social media had changed the way they practice, or intended to practice. Similarly, a large proportion of respondents (135/193, 69.9%) indicated that the education they had received via social media had increased their use of research evidence within their clinical practice. Conclusions Social media may be an effective educational medium for improving knowledge of health professionals, fostering their use of research evidence, and changing their clinical behaviors by translating new research evidence into clinical practice. PMID:26503129

  10. Putting social impact assessment to the test as a method for implementing responsible tourism practice

    SciTech Connect

    McCombes, Lucy; Vanclay, Frank; Evers, Yvette

    2015-11-15

    The discourse on the social impacts of tourism needs to shift from the current descriptive critique of tourism to considering what can be done in actual practice to embed the management of tourism's social impacts into the existing planning, product development and operational processes of tourism businesses. A pragmatic approach for designing research methodologies, social management systems and initial actions, which is shaped by the real-world operational constraints and existing systems used in the tourism industry, is needed. Our pilot study with a small Bulgarian travel company put social impact assessment (SIA) to the test to see if it could provide this desired approach and assist in implementing responsible tourism development practice, especially in small tourism businesses. Our findings showed that our adapted SIA method has value as a practical method for embedding a responsible tourism approach. While there were some challenges, SIA proved to be effective in assisting the staff of our test case tourism business to better understand their social impacts on their local communities and to identify actions to take. - Highlights: • A pragmatic approach is needed for the responsible management of the social impacts of tourism. • Our adapted social impact assessment (SIA) method has value as a practical method. • SIA can be embedded into tourism businesses' existing ‘ways of doing things’. • We identified challenges and ways to improve our method to better suit the small tourism business context.

  11. Comparison between Two Practical Methods of Light Source Monitoring in Quantum Key Distribution

    NASA Astrophysics Data System (ADS)

    Wang, Gan; Chen, Ziyang; Xu, Bingjie; Li, Zhengyu; Peng, Xiang; Guo, Hong

    2016-05-01

    The practical security of a quantum key distribution (QKD) is a critical issue due to the loopholes opened by the imperfections of practical devices. The untrusted source problem is a fundamental issue that exists in almost every protocol, including the loss-tolerant protocol and the measurement-device-independent protocol. Two practical light source monitoring methods were proposed, i.e., the two-threshold detector scheme and the photon-number-resolving (PNR) detector scheme. In this work, we test the fluctuation level of different gain-switched pulsed lasers, i.e., the ratio between the standard deviation and the mean of the pulse energy (denoted γ), which changes from 1% to 7%. Moreover, we propose an improved practical PNR detector scheme, and discuss in which circumstances one should use which light source monitoring method; generally speaking, when the fluctuation is large the PNR detector method performs better. This provides guidance for selecting the proper monitoring module for different practical systems. This work is supported by the National Science Fund for Distinguished Young Scholars of China (Grant No. 61225003), the State Key Project of National Natural Science Foundation of China (Grant No. 61531003).
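
    The fluctuation level γ described above is simply the standard deviation of the measured pulse energies divided by their mean. A minimal sketch of that calculation (the sample energies are hypothetical, and the population standard deviation is used as an assumption):

```python
import statistics

def fluctuation_ratio(pulse_energies):
    """Gamma: standard deviation of the pulse energy divided by its mean."""
    return statistics.pstdev(pulse_energies) / statistics.fmean(pulse_energies)

# Hypothetical pulse-energy samples from a gain-switched laser (arb. units).
energies = [1.00, 1.02, 0.98, 1.01, 0.99]
print(round(fluctuation_ratio(energies), 3))  # 0.014, i.e. gamma ~ 1.4%
```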

  12. Review of Computational Stirling Analysis Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal to electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in its current designs could be better understood. However, they are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of Stirling loss understanding may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra HI-Fl technique, is presented in detail.

  13. Data Analysis Methods for Library Marketing

    NASA Astrophysics Data System (ADS)

    Minami, Toshiro; Kim, Eunja

    Our society is rapidly changing into an information society, in which people's needs and requests for information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information. Libraries have to know the profiles of their patrons in order to achieve such a role. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. We then demonstrate its usefulness through some examples of analysis methods applied to the circulation records in Kyushu University and Guacheon Library, and some implications obtained from the results of these methods. Our research is a beginning towards a future in which library marketing is an indispensable tool.
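
    As a flavor of the circulation-record analysis the paper describes, a minimal sketch that profiles borrowing by subject category (the records and categories below are invented for illustration; the actual Kyushu University and Guacheon Library analyses are more involved):

```python
from collections import Counter

# Invented circulation records: one (patron_id, subject category) pair per loan.
records = [
    ("p1", "science"), ("p1", "science"), ("p2", "history"),
    ("p3", "science"), ("p2", "fiction"), ("p1", "fiction"),
]

def borrows_by_category(records):
    """Count loans per subject category -- a basic patron-profile statistic."""
    return Counter(category for _, category in records)

print(borrows_by_category(records).most_common(1))  # [('science', 3)]
```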

  14. Optical methods for the analysis of dermatopharmacokinetics

    NASA Astrophysics Data System (ADS)

    Lademann, Juergen; Weigmann, Hans-Juergen; von Pelchrzim, R.; Sterry, Wolfram

    2002-07-01

    The method of tape stripping in combination with spectroscopic measurements is a simple and noninvasive method for the analysis of dermatopharmacokinetics of cosmetic products and topically applied drugs. The absorbance at 430 nm was used for the characterization of the amount of corneocytes on the tape strips. It was compared to the increase in weight of the tapes after removing them from the skin surface. The penetration profiles of two UV filter substances used in sunscreens were determined. The combined method of tape stripping and spectroscopic measurements can also be used for the investigation of the dermatopharmacokinetics of topically applied drugs passing through the skin. Differences in the penetration profiles of the steroid compound clobetasol, applied in the same concentration in different formulations on the skin, are presented.

  15. Effectiveness of a Motivation and Practical Skills Development Methods on the Oral Hygiene of Orphans Children in Kaunas, Lithuania

    PubMed Central

    Narbutaite, Julija

    2015-01-01

    ABSTRACT Objectives The aim of this study was to evaluate the effect of motivation and practical skills development methods on the oral hygiene of orphans. Material and Methods Sixty-eight orphans aged between 7 and 17 years from two orphanages in Kaunas were divided into two groups: a practical application group and a motivation group. Children were clinically examined by determining their oral hygiene status using the Silness-Löe plaque index. A questionnaire was used to estimate oral hygiene knowledge and practices at baseline and after 3 months. Statistical analysis included: Chi-square test (χ2), Fisher's exact test, Student's t-test, nonparametric Mann-Whitney test, Spearman's rho correlation coefficient and Kappa coefficient. Results All children had plaque on at least one tooth in both groups: motivation 1.14 (SD 0.51), practical application 1.08 (SD 0.4) (P = 0.58). Girls in both groups showed significantly better oral hygiene than boys (P < 0.001). After the 3-month educational program, oral hygiene status improved significantly in both groups: 0.4 (SD 0.35) (P < 0.001). Significantly better oral hygiene was found in the practical application group, 0.19 (SD 0.27), in comparison with the motivation group, 0.55 (SD 0.32) (P < 0.001). Comparing the results of the first and second questionnaire surveys on the use of soft drinks, a statistically significant decline in their use was found in both groups (P = 0.004). Conclusions Educational programs are effective in improving oral hygiene, especially when they are based on practical skills training. PMID:26539284

  16. Impact of pedagogical method on Brazilian dental students' waste management practice.

    PubMed

    Victorelli, Gabriela; Flório, Flávia Martão; Ramacciato, Juliana Cama; Motta, Rogério Heládio Lopes; de Souza Fonseca Silva, Almenara

    2014-11-01

    The purpose of this study was to conduct a qualitative analysis of waste management practices among a group of Brazilian dental students (n=64) before and after implementing two different pedagogical methods: 1) the students attended a two-hour lecture based on World Health Organization standards; and 2) the students applied the lessons learned in an organized group setting aimed toward raising their awareness about socioenvironmental issues related to waste. All eligible students participated, and the students' learning was evaluated through their answers to a series of essay questions, which were quantitatively measured. Afterwards, the impact of the pedagogical approaches was compared by means of qualitative categorization of wastes generated in clinical activities. Waste categorization was performed for a period of eight consecutive days, both before and thirty days after the pedagogical strategies. In the written evaluation, 80 to 90 percent of the students' answers were correct. The qualitative assessment revealed a high frequency of incorrect waste disposal with a significant increase of incorrect disposal inside general and infectious waste containers (p<0.05). Although the students' theoretical learning improved, it was not enough to change behaviors established by cultural values or to encourage the students to adequately segregate and package waste material. PMID:25362694

  17. Moving environmental DNA methods from concept to practice for monitoring aquatic macroorganisms

    USGS Publications Warehouse

    Goldberg, Caren S.; Strickler, Katherine M.; Pilliod, David S.

    2015-01-01

    The discovery that macroorganisms can be detected from their environmental DNA (eDNA) in aquatic systems has immense potential for the conservation of biological diversity. This special issue contains 11 papers that review and advance the field of eDNA detection of vertebrates and other macroorganisms, including studies of eDNA production, transport, and degradation; sample collection and processing to maximize detection rates; and applications of eDNA for conservation using citizen scientists. This body of work is an important contribution to the ongoing efforts to take eDNA detection of macroorganisms from technical breakthrough to established, reliable method that can be used in survey, monitoring, and research applications worldwide. While the rapid advances in this field are remarkable, important challenges remain, including consensus on best practices for collection and analysis, understanding of eDNA diffusion and transport, and avoidance of inhibition in sample collection and processing. Nonetheless, as demonstrated in this special issue, eDNA techniques for research and monitoring are beginning to realize their potential for contributing to the conservation of biodiversity globally.

  18. BAROS METHOD CRITICAL ANALYSIS (BARIATRIC ANALYSIS AND REPORTING SYSTEM)

    PubMed Central

    NICARETA, Jean Ricardo; de FREITAS, Alexandre Coutinho Teixeira; NICARETA, Sheyla Maris; NICARETA, Cleiton; CAMPOS, Antonio Carlos Ligocki; NASSIF, Paulo Afonso Nunes; MARCHESINI, João Batista

    2015-01-01

    Introduction: Although it has received several criticisms, BAROS, which is considered the most effective method for the global assessment of the surgical treatment of morbid obesity, still needs to be updated. Objective: Critical analysis of the BAROS constitution and method. Method: BAROS was searched as a heading in a literature review using data from the main bariatric surgery journals until 2009. Results: 121 papers containing criticisms of the BAROS constitution and methodology were found and assessed. It has some failures, and few studies report results on the use of this instrument, although it is still considered a standard method. Several authors that used it found imperfections in its methodology and suggested changes aimed at improving its acceptance, showing the need to develop new methods to qualify bariatric surgery results. Conclusion: The BAROS constitution has failures and its methodology needs to be updated. PMID:26537280

  19. Developing Mentors: An Analysis of Shared Mentoring Practices

    ERIC Educational Resources Information Center

    Bower-Phipps, Laura; Klecka, Cari Van Senus; Sature, Amanda L.

    2016-01-01

    Understanding how experienced teachers share and articulate effective mentoring practices can guide efforts to prepare quality mentors. This qualitative study focused on mentoring practices within a teacher-designed student-teaching program conceptualized while the mentor teachers within the program were students in a graduate-level mentoring…

  20. Researching "Practiced Language Policies": Insights from Conversation Analysis

    ERIC Educational Resources Information Center

    Bonacina-Pugh, Florence

    2012-01-01

    In language policy research, "policy" has traditionally been conceptualised as a notion separate from that of "practice". In fact, language practices were usually analysed with a view to evaluating whether a policy is being implemented or resisted. Recently, however, Spolsky in ("Language policy". Cambridge University press, Cambridge, 2004;…

  1. Situational Analysis: Centerless Systems and Human Service Practices

    ERIC Educational Resources Information Center

    Newbury, Janet

    2011-01-01

    Bronfenbrenner's ecological model is a conceptual framework that continues to contribute to human service practices. In the current article, the author describes the possibilities for practice made intelligible by drawing from this framework. She then explores White's "Web of Praxis" model as an important extension of this approach, and proceeds…

  2. Practical Implementation of New Particle Tracking Method to the Real Field of Groundwater Flow and Transport

    PubMed Central

    Suk, Heejun

    2012-01-01

    Abstract In articles published in 2009 and 2010, Suk and Yeh reported the development of an accurate and efficient particle tracking algorithm for simulating a path line under complicated unsteady flow conditions, using a range of elements within finite elements in multidimensions. Here two examples, an aquifer storage and recovery (ASR) example and a landfill leachate migration example, are examined to enhance the practical implementation of the proposed particle tracking method, known as Suk's method, to a real field of groundwater flow and transport. Results obtained by Suk's method are compared with those obtained by Pollock's method. Suk's method produces superior tracking accuracy, which suggests that Suk's method can describe more accurately various advection-dominated transport problems in a real field than existing popular particle tracking methods, such as Pollock's method. To illustrate the wide and practical applicability of Suk's method to random-walk particle tracking (RWPT), the original RWPT has been modified to incorporate Suk's method. Performance of the modified RWPT using Suk's method is compared with the original RWPT scheme by examining the concentration distributions obtained by the modified RWPT and the original RWPT under complicated transient flow systems. PMID:22476629
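
    For context, Pollock's method, the baseline against which Suk's method is compared above, computes a semi-analytical travel time through a cell from linearly interpolated face velocities. A one-dimensional sketch under simplifying assumptions (positive flow toward the downstream face; this is generic illustration, not the authors' code):

```python
import math

def pollock_exit_time_1d(x1, x2, v1, v2, xp):
    """Semi-analytical travel time (Pollock's scheme) for a particle at xp to
    reach the downstream face x2 of a cell [x1, x2], where the velocity is
    linearly interpolated between the face values v1 and v2 (both > 0):
        v(x) = v1 + A * (x - x1),  A = (v2 - v1) / (x2 - x1)."""
    A = (v2 - v1) / (x2 - x1)
    vp = v1 + A * (xp - x1)           # velocity at the particle's position
    if abs(A) < 1e-15:                # uniform velocity within the cell
        return (x2 - xp) / vp
    return math.log(v2 / vp) / A      # t = (1/A) * ln(v_exit / v_p)

print(round(pollock_exit_time_1d(0.0, 10.0, 1.0, 2.0, 0.0), 3))  # 6.931
```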

  3. Mixed-methods research in pharmacy practice: basics and beyond (part 1).

    PubMed

    Hadi, Muhammad Abdul; Alldred, David Phillip; Closs, S José; Briggs, Michelle

    2013-10-01

    This is the first of two papers which explore the use of mixed-methods research in pharmacy practice. In an era of evidence-based medicine and policy, high-quality research evidence is essential for the development of effective pharmacist-led services. Over the past decade, the use of mixed-methods research has become increasingly common in healthcare, although to date its use has been relatively limited in pharmacy practice research. In this article, the basic concepts of mixed-methods research including its definition, typologies and advantages in relation to pharmacy practice research are discussed. Mixed-methods research brings together qualitative and quantitative methodologies within a single study to answer or understand a research problem. There are a number of mixed-methods designs available, but the selection of an appropriate design must always be dictated by the research question. Importantly, mixed-methods research should not be seen as a 'tool' to collect qualitative and quantitative data, rather there should be some degree of 'integration' between the two data sets. If conducted appropriately, mixed-methods research has the potential to generate quality research evidence by combining strengths and overcoming the respective limitations of qualitative and quantitative methodologies.

  4. A numerical method for the stress analysis of stiffened-shell structures under nonuniform temperature distributions

    NASA Technical Reports Server (NTRS)

    Heldenfels, Richard R

    1951-01-01

    A numerical method is presented for the stress analysis of stiffened-shell structures of arbitrary cross section under nonuniform temperature distributions. The method is based on a previously published procedure that is extended to include temperature effects and multicell construction. The application of the method to practical problems is discussed and an illustrative analysis is presented of a two-cell box beam under the combined action of vertical loads and a nonuniform temperature distribution.

  5. Portraits of Practice: A Cross-Case Analysis of Two First-Grade Teachers and Their Grouping Practices

    ERIC Educational Resources Information Center

    Maloch, Beth; Worthy, Jo; Hampton, Angela; Jordan, Michelle; Hungerford-Kresser, Holly; Semingson, Peggy

    2013-01-01

    This interpretive study provides a cross-case analysis of the literacy instruction of two first-grade teachers, with a particular focus on their grouping practices. One key finding was the way in which these teachers drew upon a district-advocated approach for instruction--an approach to guided reading articulated by Fountas and Pinnell (1996) in…

  6. A new method for designing dual foil electron beam forming systems. II. Feasibility of practical implementation of the method

    NASA Astrophysics Data System (ADS)

    Adrich, Przemysław

    2016-05-01

    In Part I of this work a new method for designing dual foil electron beam forming systems was introduced. In this method, an optimal configuration of the dual foil system is found by means of a systematic, automatized scan of system performance as a function of its parameters. At each point of the scan, the Monte Carlo method is used to calculate the off-axis dose profile in water, taking into account the detailed and complete geometry of the system. The new method, while being computationally intensive, minimizes the involvement of the designer. In this Part II paper, the feasibility of practical implementation of the new method is demonstrated. For this, prototype software tools were developed and applied to solve a real-life design problem. It is demonstrated that system optimization can be completed within a few hours using rather moderate computing resources. It is also demonstrated that, perhaps for the first time, the designer can gain deep insight into system behavior, such that the construction can be simultaneously optimized with respect to a number of functional characteristics besides the flatness of the off-axis dose profile. In the presented example, the system is optimized with respect to both the flatness of the off-axis dose profile and the beam transmission. A number of practical issues related to application of the new method as well as its possible extensions are discussed.
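
    The systematic scan described above amounts to a grid search that evaluates a dose-profile flatness metric for each candidate configuration. A schematic sketch, in which `toy_simulate` is a made-up stand-in for the Monte Carlo dose calculation (not real physics, and not the authors' tools):

```python
import statistics

def flatness(profile):
    """Relative spread of an off-axis dose profile: std / mean (lower = flatter)."""
    return statistics.pstdev(profile) / statistics.fmean(profile)

def scan(candidates, simulate):
    """Systematic scan: evaluate every candidate configuration with a
    user-supplied simulator and keep the one giving the flattest profile."""
    return min(candidates, key=lambda p: flatness(simulate(p)))

# Toy stand-in for the Monte Carlo dose calculation (made up, NOT real physics):
# in this fake model a 1.0 mm second foil yields a perfectly flat profile.
def toy_simulate(foil_thickness_mm):
    return [1.0 + (1.0 - foil_thickness_mm) * abs(r) for r in (-2, -1, 0, 1, 2)]

print(scan([0.2, 0.5, 0.8, 1.0], toy_simulate))  # 1.0
```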

  7. A Practical Test Method for Mode I Fracture Toughness of Adhesive Joints with Dissimilar Substrates

    SciTech Connect

    Boeman, R.G.; Erdman, D.L.; Klett, L.B.; Lomax, R.D.

    1999-09-27

    A practical test method for determining the mode I fracture toughness of adhesive joints with dissimilar substrates will be discussed. The test method is based on the familiar Double Cantilever Beam (DCB) specimen geometry, but overcomes limitations in existing techniques that preclude their use when testing joints with dissimilar substrates. The test method is applicable to adhesive joints where the two bonded substrates have different flexural rigidities due to geometric and/or material considerations. Two specific features discussed are the use of backing beams to prevent substrate damage and a compliance matching scheme to achieve symmetric loading conditions. The procedure is demonstrated on a modified DCB specimen comprised of SRIM composite and thin-section, e-coat steel substrates bonded with an epoxy adhesive. Results indicate that the test method provides a practical means of characterizing the mode I fracture toughness of joints with dissimilar substrates.
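
    Mode I fracture toughness from a DCB-type test is commonly extracted via the Irwin-Kies relation, G_I = P^2/(2b) dC/da, where C is the specimen compliance and b the bond width. A minimal sketch with the compliance derivative taken as a finite difference (the numbers are hypothetical, not data from this work):

```python
def mode_i_toughness(load_n, width_m, crack_lengths_m, compliances_m_per_n):
    """Irwin-Kies relation G_I = P^2 / (2*b) * dC/da, with the compliance
    derivative dC/da estimated by a finite difference between two measured
    (crack length, compliance) points. Returns G_I in J/m^2."""
    (a1, a2) = crack_lengths_m
    (c1, c2) = compliances_m_per_n
    dc_da = (c2 - c1) / (a2 - a1)
    return load_n ** 2 / (2.0 * width_m) * dc_da

# Hypothetical data: P = 100 N, b = 25 mm, compliance grows from 1.0e-5 to
# 1.6e-5 m/N as the crack extends from 50 mm to 60 mm.
g = mode_i_toughness(100.0, 0.025, (0.05, 0.06), (1.0e-5, 1.6e-5))
print(round(g, 1))  # 120.0 (J/m^2)
```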

  8. Bifurcation analysis method of nonlinear traffic phenomena

    NASA Astrophysics Data System (ADS)

    Ai, Wenhuan; Shi, Zhongke; Liu, Dawei

    2015-03-01

    A new bifurcation analysis method for analyzing and predicting the complex nonlinear traffic phenomena based on the macroscopic traffic flow model is presented in this paper. This method makes use of variable substitution to transform a traditional traffic flow model into a new model which is suitable for the stability analysis. Although the substitution seems to be simple, it can extend the range of the variable to infinity and build a relationship between the traffic congestion and the unstable system in the phase plane. So the problem of traffic flow could be converted into that of system stability. The analysis identifies the types and stabilities of the equilibrium solutions of the new model and gives the overall distribution structure of the nearby equilibrium solutions in the phase plane. Then we deduce the existence conditions of the model's Hopf bifurcation and saddle-node bifurcation and find bifurcations such as the Hopf bifurcation, saddle-node bifurcation, limit point bifurcation of cycles and Bogdanov-Takens bifurcation. Furthermore, the Hopf bifurcation and saddle-node bifurcation are selected as the starting points of density temporal evolution, which will be helpful for improving our understanding of the stop-and-go waves and local cluster effects observed in freeway traffic.
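
    The equilibrium typing described above rests on the standard trace-determinant classification of a planar system's Jacobian; the saddle-node and Hopf bifurcations occur where the determinant and the trace, respectively, cross zero. A minimal generic sketch (not the authors' traffic model):

```python
def classify_equilibrium(jacobian):
    """Classify an equilibrium of a planar system from its 2x2 Jacobian,
    using the standard trace-determinant criteria (det < 0: saddle;
    det = 0: saddle-node boundary; tr = 0 with det > 0: Hopf boundary)."""
    (a, b), (c, d) = jacobian
    tr, det = a + d, a * d - b * c
    if det < 0:
        return "saddle"
    if det == 0:
        return "non-hyperbolic (saddle-node boundary)"
    disc = tr * tr - 4.0 * det
    kind = "node" if disc >= 0 else ("focus" if tr != 0 else "center")
    stability = "stable" if tr < 0 else ("unstable" if tr > 0 else "neutral")
    return f"{stability} {kind}"

print(classify_equilibrium([[0.0, 1.0], [-1.0, -0.5]]))  # stable focus
```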

  9. Passive Sampling Methods for Contaminated Sediments: Practical Guidance for Selection, Calibration, and Implementation

    EPA Science Inventory

    This article provides practical guidance on the use of passive sampling methods(PSMs) that target the freely dissolved concentration (Cfree) for improved exposure assessment of hydrophobic organic chemicals in sediments. Primary considerations for selecting a PSM for a specific a...

  10. Self-Assessment Methods in Writing Instruction: A Conceptual Framework, Successful Practices and Essential Strategies

    ERIC Educational Resources Information Center

    Nielsen, Kristen

    2014-01-01

    Student writing achievement is essential to lifelong learner success, but supporting writing can be challenging for teachers. Several large-scale analyses of publications on writing have called for further study of instructional methods, as the current literature does not sufficiently address the need to support best teaching practices.…

  11. Connecting Practice, Theory and Method: Supporting Professional Doctoral Students in Developing Conceptual Frameworks

    ERIC Educational Resources Information Center

    Kumar, Swapna; Antonenko, Pavlo

    2014-01-01

    From an instrumental view, conceptual frameworks that are carefully assembled from existing literature in Educational Technology and related disciplines can help students structure all aspects of inquiry. In this article we detail how the development of a conceptual framework that connects theory, practice and method is scaffolded and facilitated…

  12. The Delphi Method: An Approach for Facilitating Evidence Based Practice in Athletic Training

    ERIC Educational Resources Information Center

    Sandrey, Michelle A.; Bulger, Sean M.

    2008-01-01

    Objective: The growing importance of evidence based practice in athletic training is necessitating academics and clinicians to be able to make judgments about the quality or lack of the body of research evidence and peer-reviewed standards pertaining to clinical questions. To assist in the judgment process, consensus methods, namely brainstorming,…

  13. Econometric Methods for Causal Evaluation of Education Policies and Practices: A Non-Technical Guide

    ERIC Educational Resources Information Center

    Schlotter, Martin; Schwerdt, Guido; Woessmann, Ludger

    2011-01-01

    Education policy-makers and practitioners want to know which policies and practices can best achieve their goals. But research that can inform evidence-based policy often requires complex methods to distinguish causation from accidental association. Avoiding econometric jargon and technical detail, this paper explains the main idea and intuition…

  14. Examination of Quantitative Methods Used in Early Intervention Research: Linkages with Recommended Practices.

    ERIC Educational Resources Information Center

    Snyder, Patricia; Thompson, Bruce; McLean, Mary E.; Smith, Barbara J.

    2002-01-01

    Findings are reported related to the research methods and statistical techniques used in 450 group quantitative studies examined by the Council for Exceptional Children's Division for Early Childhood Recommended Practices Project. Studies were analyzed across seven dimensions including sampling procedures, variable selection, variable definition,…

  15. A high-efficiency aerothermoelastic analysis method

    NASA Astrophysics Data System (ADS)

    Wan, ZhiQiang; Wang, YaoKun; Liu, YunZhen; Yang, Chao

    2014-06-01

    In this paper, a high-efficiency aerothermoelastic analysis method based on unified hypersonic lifting surface theory is established. The method adopts a two-way coupling form that couples the structure, aerodynamic force, aerodynamic heating, and heat conduction. The aerodynamic force is first calculated based on unified hypersonic lifting surface theory, and then the Eckert reference temperature method is used to solve the temperature field, where the transient heat conduction is solved using Fourier's law, and the modal method is used for the aeroelastic correction. Finally, flutter is analyzed based on the p-k method. The aerothermoelastic behavior of a typical hypersonic low-aspect-ratio wing is then analyzed, and the results indicate the following: (1) the combined effects of the aerodynamic load and thermal load both deform the wing, and this deformation would increase if the flexibility, size, and flight time of the hypersonic aircraft increase; (2) the effect of heat accumulation should be noted, and therefore, trajectory parameters should be considered in the design of hypersonic flight vehicles to avoid hazardous conditions such as flutter.

  16. Methods for Proteomic Analysis of Transcription Factors

    PubMed Central

    Jiang, Daifeng; Jarrett, Harry W.; Haskins, William E.

    2009-01-01

    Investigation of the transcription factor (TF) proteome presents challenges including the large number of low abundance and post-translationally modified proteins involved. Specialized purification and analysis methods have been developed over the last decades which facilitate the study of the TF proteome and these are reviewed here. Generally applicable proteomics methods that have been successfully applied are also discussed. TFs are selectively purified by affinity techniques using the DNA response element (RE) as the basis for highly specific binding, and several agents have been discovered that either enhance binding or diminish non-specific binding. One such affinity method called “trapping” enables purification of TFs bound to nM concentrations and recovery of TF complexes in a highly purified state. The electrophoretic mobility shift assay (EMSA) is the most important assay of TFs because it provides both measures of the affinity and amount of the TF present. Southwestern (SW) blotting and DNA-protein crosslinking (DPC) allow in vitro estimates of DNA-binding-protein mass, while chromatin immunoprecipitation (ChIP) allows confirmation of promoter binding in vivo. Two-dimensional gel electrophoresis methods (2-DE), and 3-DE methods which combines EMSA with 2-DE, allow further resolution of TFs. The synergy of highly selective purification and analytical strategies has led to an explosion of knowledge about the TF proteome and the proteomes of other DNA- and RNA-binding proteins. PMID:19726046

  17. Foundational methods for model verification and uncertainty analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Jakeman, A. J.; Croke, B. F.; Guillaume, J. H.; Jakeman, J. D.; Shin, M.

    2013-12-01

    Before embarking on formal methods of uncertainty analysis that may entail unnecessarily restrictive assumptions and sophisticated treatment, prudence dictates exploring one's data, model candidates and applicable objective functions with a mixture of methods as a first step. It seems that there are several foundational methods that warrant more attention in practice and that there is scope for the development of new ones. Ensuing results from a selection of foundational methods may well inform the choice of formal methods and assumptions, or suffice in themselves as an effective appreciation of uncertainty. Using four lumped rainfall-runoff models of varying complexity across several watersheds, we illustrate that there are valuable methods, many already available in open-source software and others we have recently developed, that can be invoked to yield valuable insights into model veracity and uncertainty. We show results of using methods of global sensitivity analysis that help: determine whether insensitive parameters impact predictions and therefore cannot be fixed; and identify which combinations of objective function, dataset and model structure allow insensitive parameters to be estimated. We apply response surface and polynomial chaos methods to yield knowledge of the models' response surfaces and parameter interactions, thereby informing model redesign. A new approach to model structure discrimination is presented based on Pareto methods and cross-validation. It reveals which model structures are acceptable in the sense that they are non-dominated by other structures across calibration and validation periods and across catchments according to specified performance criteria. Finally we present and demonstrate a falsification approach that shows the value of examining scenarios of model structures and parameters to identify any change that might have a specified effect on a prediction.
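The sensitivity-screening idea above can be illustrated with a minimal one-at-a-time sweep: vary each parameter across its range with the others held at nominal values and compare output spreads. The two-parameter "model" below is a toy stand-in, not one of the four rainfall-runoff models in the study.

```python
import numpy as np

# Toy model: output responds strongly to parameter a, barely to parameter b.
def model(a, b):
    return a ** 2 + 0.01 * b

rng = np.random.default_rng(0)

def output_spread(which, n=2000):
    # Sweep one parameter over [0, 1], holding the other at its nominal 0.5
    swept = rng.uniform(0.0, 1.0, n)
    out = model(swept, 0.5) if which == "a" else model(0.5, swept)
    return float(np.std(out))

# An insensitive parameter (b) contributes little output spread, so it cannot
# be reliably estimated from calibration data.
print(output_spread("a") > 10 * output_spread("b"))  # → True
```

Variance-based (Sobol-type) indices refine this screening by also capturing parameter interactions.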

  18. Intercomparison of two nowcasting methods: preliminary analysis

    NASA Astrophysics Data System (ADS)

    Poli, V.; Alberoni, P. P.; Cesari, D.

    2008-10-01

    The term nowcasting refers to the description of the current weather situation and its extrapolation a few hours into the future. This work gives a brief description of current nowcasting methods, with emphasis on those developed at ARPA-SIM (Emilia-Romagna region, Italy). The methodology rests on an extrapolation technique that analyses a series of radar reflectivity fields in order to identify areas of precipitation and determine the motion field, which allows the tracking of coherent structures from one image to the next. Motion of individual rainfall structures is extrapolated using two different methods: a linear translation and a semi-Lagrangian advection scheme. In particular, the semi-Lagrangian advection method is based on a multi-scale recursive cross-correlation analysis, where different targets are tracked at the different scales examined. This means that the motion of precipitation parcels is a function of scale. A description of the selected validation tools introduces the numerical analysis of the results obtained, pointing out the limits and the limited successes of the algorithms.
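At a single scale, the cross-correlation tracking described above reduces to finding the shift that maximises the correlation between two consecutive reflectivity fields; a hedged sketch with synthetic data (the fields and lag range are invented for illustration):

```python
import numpy as np

# Exhaustive-search cross-correlation tracker for one scale: try every shift
# within +/- max_lag pixels and keep the one maximising the correlation sum.
def best_shift(prev, curr, max_lag=5):
    best, best_corr = (0, 0), -np.inf
    for dy in range(-max_lag, max_lag + 1):
        for dx in range(-max_lag, max_lag + 1):
            shifted = np.roll(np.roll(prev, dy, axis=0), dx, axis=1)
            corr = np.sum(shifted * curr)
            if corr > best_corr:
                best_corr, best = corr, (dy, dx)
    return best

rng = np.random.default_rng(0)
field = rng.random((40, 40))                          # synthetic reflectivity
moved = np.roll(np.roll(field, 3, axis=0), -2, axis=1)  # "advect" by (3, -2)
print(best_shift(field, moved))  # → (3, -2)
```

The multi-scale scheme applies this recursively on successively smaller windows, so each precipitation parcel gets a scale-dependent motion vector.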

  19. An Analysis of State Autism Educational Assessment Practices and Requirements.

    PubMed

    Barton, Erin E; Harris, Bryn; Leech, Nancy; Stiff, Lillian; Choi, Gounah; Joel, Tiffany

    2016-03-01

    States differ in the procedures and criteria used to identify ASD. These differences are likely to impact the prevalence and age of identification for children with ASD. The purpose of the current study was to examine the specific state variations in ASD identification and eligibility criteria requirements. We examined variations by state in autism assessment practices and the proportion of children eligible for special education services under the autism category. Overall, our findings suggest that ASD identification practices vary across states, but most states use federal guidelines, at least in part, to set their requirements. Implications and recommendations for policy and practice are discussed.

  20. A practical algorithm for static analysis of parallel programs

    SciTech Connect

    McDowell, C.E.

    1989-06-01

    One approach to analyzing the behavior of a concurrent program requires determining the reachable program states. A program state consists of a set of task states, the values of shared variables used for synchronization, and local variables whose values derive directly from synchronization operations. However, the number of reachable states rises exponentially with the number of tasks and becomes intractable for many concurrent programs. A variation of this approach merges a set of related states into a single virtual state. With this approach, the analysis of concurrent programs becomes feasible, as the number of virtual states is often orders of magnitude smaller than the number of reachable states. This paper presents a method for determining the virtual states that describe the reachable program states, and the resulting reduction in the number of states is analyzed. The algorithms given have been implemented in a static program analyzer for multitasking Fortran, and the results obtained are discussed.
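The reachable-state idea can be made concrete with a tiny example: two tasks each acquiring and releasing a binary semaphore, with a state being (program counter of task 1, program counter of task 2, lock value). The program and encoding below are hypothetical illustrations, not the paper's virtual-state algorithm.

```python
from collections import deque

PROGRAM = ["acquire", "release"]  # each task executes P(s) then V(s)

def step(pc, lock):
    """Advance one task by one operation; None if it is blocked or finished."""
    if pc >= len(PROGRAM):
        return None
    if PROGRAM[pc] == "acquire" and lock == 0:
        return pc + 1, 1
    if PROGRAM[pc] == "release" and lock == 1:
        return pc + 1, 0
    return None

def reachable_states():
    start = (0, 0, 0)                       # (pc of task 1, pc of task 2, lock)
    seen, frontier = {start}, deque([start])
    while frontier:                          # breadth-first state exploration
        pc1, pc2, lock = frontier.popleft()
        successors = []
        s1 = step(pc1, lock)
        if s1 is not None:
            successors.append((s1[0], pc2, s1[1]))
        s2 = step(pc2, lock)
        if s2 is not None:
            successors.append((pc1, s2[0], s2[1]))
        for state in successors:
            if state not in seen:
                seen.add(state)
                frontier.append(state)
    return seen

states = reachable_states()
print(len(states))                 # → 8
print((1, 1, 1) in states)         # → False: mutual exclusion holds
```

Even this 2-task example has 8 reachable states; the exponential growth with task count is what motivates merging related states into virtual states.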

  1. Test versus analysis: A discussion of methods

    NASA Technical Reports Server (NTRS)

    Butler, T. G.

    1986-01-01

    Some techniques for comparing structural vibration data determined from test and analysis are discussed. Orthogonality is one general category, correlation is a second, synthesis a third, and matrix improvement a fourth. Advantages and shortcomings of the methods are explored, with suggestions as to how they can complement one another. The purpose of comparing vibration data from test and analysis for a given structure is to find out whether each represents the dynamic properties of the structure in the same way; specifically, whether the mode shapes are alike, the frequencies of the modes are alike, the modes appear in the same frequency sequence, and, if they are not alike, how to judge which to believe.
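A widely used quantitative form of the mode-shape comparison is the Modal Assurance Criterion (MAC); the sketch below uses the standard MAC formula with invented mode vectors, and is offered as a generic illustration rather than the specific techniques surveyed in this paper.

```python
import numpy as np

# MAC(phi_t, phi_a) = (phi_t . phi_a)^2 / ((phi_t . phi_t)(phi_a . phi_a));
# values near 1 mean the test and analysis mode shapes are alike.
def mac(phi_test, phi_fem):
    return (np.dot(phi_test, phi_fem) ** 2) / (
        np.dot(phi_test, phi_test) * np.dot(phi_fem, phi_fem))

# Hypothetical first bending mode measured at 5 points (made-up numbers)
test_mode = np.array([0.0, 0.31, 0.59, 0.81, 1.00])
fem_mode  = np.array([0.0, 0.30, 0.60, 0.80, 1.00])
print(round(mac(test_mode, fem_mode), 3))  # close to 1.0: shapes agree
```

A full test/analysis correlation builds the MAC matrix over all mode pairs, which also exposes frequency-sequence swaps as off-diagonal peaks.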

  2. Optimisation of Lime-Soda process parameters for reduction of hardness in aqua-hatchery practices using Taguchi methods.

    PubMed

    Yavalkar, S P; Bhole, A G; Babu, P V Vijay; Prakash, Chandra

    2012-04-01

    This paper presents the optimisation of Lime-Soda process parameters for the reduction of hardness in aqua-hatchery practices in the context of M. rosenbergii. Fresh water used in the development of fisheries needs to be of suitable quality, and the lack of desirable quality in available fresh water is generally the confronting restraint. On the Indian subcontinent, groundwater is the only source of raw water, having varying degrees of hardness, and is thus unsuitable for fresh water prawn hatchery practices (M. rosenbergii). In order to make use of hard water in the context of aqua-hatchery, the Lime-Soda process has been recommended. The efficacy of the various process parameters, such as lime, soda ash, and detention time, on the reduction of hardness needs to be examined. This paper proposes to determine the parameter settings for the CIFE well water, which is quite hard, by using the Taguchi experimental design method. Taguchi orthogonal arrays, the signal-to-noise ratio, and analysis of variance (ANOVA) have been applied to determine the dosages and analyse their effect on hardness reduction. The tests carried out with optimal levels of Lime-Soda process parameters confirmed the efficacy of the Taguchi optimisation method. Emphasis has been placed on optimisation of the chemical doses required to reduce total hardness using the Taguchi method and ANOVA, to suit the available raw water quality for aqua-hatchery practices, especially for the fresh water prawn M. rosenbergii.
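The signal-to-noise ratio used to rank Taguchi parameter levels has a standard "smaller-the-better" form that fits residual hardness; a minimal sketch follows (the replicate hardness values are hypothetical, not the CIFE well-water data).

```python
import math

# Taguchi "smaller-the-better" S/N ratio: -10 * log10(mean(y^2)).
# Higher S/N means lower (better) residual hardness across replicates.
def sn_smaller_the_better(values):
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

trial_a = [120.0, 130.0, 125.0]   # residual hardness, mg/L as CaCO3 (invented)
trial_b = [80.0, 85.0, 90.0]
print(sn_smaller_the_better(trial_a) < sn_smaller_the_better(trial_b))  # → True
```

In a Taguchi study, the S/N ratio is averaged per factor level across the orthogonal array, and ANOVA then apportions each factor's contribution.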

  3. Internet Practices of Certified Rehabilitation Counselors and Analysis of Guidelines for Ethical Internet Practices

    ERIC Educational Resources Information Center

    Lehmann, Ilana S.; Crimando, William

    2011-01-01

    The Internet has become an integral part of the practice of rehabilitation counseling. To identify potential ethical issues regarding the use of the Internet by counselors, two studies were conducted. In Study 1, we surveyed a national sample of rehabilitation counselors regarding their use of technology in their work and home settings. Results…

  4. Thermal Analysis Methods for Aerobraking Heating

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.; Gasbarre, Joseph F.; Dec, John A.

    2005-01-01

    As NASA begins exploration of other planets, a method of non-propulsively slowing vehicles at the planet, aerobraking, may become a valuable technique for managing vehicle design mass and propellant. An example of this is Mars Reconnaissance Orbiter (MRO), which will launch in late 2005 and reach Mars in March of 2006. In order to save propellant, MRO will use aerobraking to modify the initial orbit at Mars. The spacecraft will dip into the atmosphere briefly on each orbit, and during the drag pass, the atmospheric drag on the spacecraft will slow it, thus lowering the orbit apoapsis. The largest area on the spacecraft, and that most affected by the heat generated during the aerobraking process, is the solar arrays. A thermal analysis of the solar arrays was conducted at NASA Langley, to simulate their performance throughout the entire roughly 6-month period of aerobraking. Several interesting methods were used to make this analysis more rapid and robust. Two separate models were built for this analysis, one in Thermal Desktop for radiation and orbital heating analysis, and one in MSC.Patran for thermal analysis. The results from the radiation model were mapped in an automated fashion to the Patran thermal model that was used to analyze the thermal behavior during the drag pass. A high degree of automation in file manipulation as well as other methods for reducing run time were employed, since toward the end of the aerobraking period the orbit period is short, and in order to support flight operations the runs must be computed rapidly. All heating within the Patran Thermal model was combined in one section of logic, such that data mapped from the radiation model and aeroheating model, as well as skin temperature effects on the aeroheating and surface radiation, could be incorporated easily. This approach calculates the aeroheating at any given node, based on its position and temperature as well as the density and velocity at that trajectory point. Run times on

  5. Practical post-calibration uncertainty analysis: Yucca Mountain, Nevada, USA

    NASA Astrophysics Data System (ADS)

    James, S. C.; Doherty, J.; Eddebbarh, A.

    2009-12-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is “calibrated.” Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the US’s proposed site for disposal of high-level radioactive waste. Both types of uncertainty analysis are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields the contributions made by each parameter to a prediction’s uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This paper applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. Furthermore, Monte Carlo simulations confirm that hydrogeologic units thought to be flow barriers have probability distributions skewed toward lower permeabilities.
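The linear analysis mentioned above rests on first-order propagation of post-calibration parameter covariance through the prediction's sensitivities; a minimal sketch, with all numbers invented rather than taken from the Yucca Mountain model:

```python
import numpy as np

# First-order (linear) predictive uncertainty: var(s) = y^T C y, where
# y holds d(prediction)/d(parameter_i) and C is the post-calibration
# parameter covariance. All values below are illustrative assumptions.
y = np.array([0.8, -0.2, 0.05])     # prediction sensitivities to 3 parameters
C = np.diag([0.04, 0.25, 1.0])      # parameter variances (uncorrelated here)

var_pred = y @ C @ y
print(round(float(np.sqrt(var_pred)), 4))  # predictive standard deviation
```

The same machinery ranks observation worth: recomputing C with a candidate observation added shows how much var_pred would shrink.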

  6. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  7. Method and apparatus for simultaneous spectroelectrochemical analysis

    DOEpatents

    Chatterjee, Sayandev; Bryan, Samuel A; Schroll, Cynthia A; Heineman, William R

    2013-11-19

    An apparatus and method of simultaneous spectroelectrochemical analysis is disclosed. A transparent surface is provided. An analyte solution on the transparent surface is contacted with a working electrode and at least one other electrode. Light from a light source is focused on either a surface of the working electrode or the analyte solution. The light reflected from either the surface of the working electrode or the analyte solution is detected. The potential of the working electrode is adjusted, and spectroscopic changes of the analyte solution that occur with changes in thermodynamic potentials are monitored.

  8. Numerical analysis method for linear induction machines.

    NASA Technical Reports Server (NTRS)

    Elliott, D. G.

    1972-01-01

    A numerical analysis method has been developed for linear induction machines such as liquid metal MHD pumps and generators and linear motors. Arbitrary phase currents or voltages can be specified and the moving conductor can have arbitrary velocity and conductivity variations from point to point. The moving conductor is divided into a mesh and coefficients are calculated for the voltage induced at each mesh point by unit current at every other mesh point. Combining the coefficients with the mesh resistances yields a set of simultaneous equations which are solved for the unknown currents.
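The final step described above, combining induced-voltage coefficients with mesh resistances into simultaneous equations for the unknown currents, can be sketched generically. The 3-node mesh, coefficient values, and driving voltages below are invented for illustration and are not from the pump/generator analysis.

```python
import numpy as np

# Mutual coefficients: volts induced at node i per unit current at node j
# (hypothetical symmetric values for a 3-node mesh).
Z_mutual = np.array([[0.0, 0.3, 0.1],
                     [0.3, 0.0, 0.3],
                     [0.1, 0.3, 0.0]])
R = np.diag([1.2, 1.0, 1.2])           # mesh resistances (ohms, invented)
V_applied = np.array([1.0, 0.5, 0.0])  # driving voltages from the winding

# Simultaneous equations (R + Z) i = V solved for the unknown mesh currents
currents = np.linalg.solve(R + Z_mutual, V_applied)
print(np.allclose((R + Z_mutual) @ currents, V_applied))  # → True
```

In the actual method the coefficients additionally depend on the local conductor velocity and conductivity at each mesh point.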

  9. Scanning methods applied to bitemark analysis

    NASA Astrophysics Data System (ADS)

    Bush, Peter J.; Bush, Mary A.

    2010-06-01

    The 2009 National Academy of Sciences report on forensics focused criticism on pattern evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified based on dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutia and to investigate replication of detail in human skin. Results indicated that being a visco-elastic substrate, skin effectively reduces the resolution of measurement of dental detail. Conclusions indicate caution in individualization statements.

  10. Blood proteins analysis by Raman spectroscopy method

    NASA Astrophysics Data System (ADS)

    Artemyev, D. N.; Bratchenko, I. A.; Khristoforova, Yu. A.; Lykina, A. A.; Myakinin, O. O.; Kuzmina, T. P.; Davydkin, I. L.; Zakharov, V. P.

    2016-04-01

    This work is devoted to studying the possibility of measuring plasma protein (albumin, globulin) concentrations using a Raman spectroscopy setup. Blood plasma and whole blood were studied in this research. The obtained Raman spectra showed significant variation in the intensities of certain spectral bands at 940, 1005, 1330, 1450 and 1650 cm-1 for different protein fractions. Partial least squares regression analysis was used to determine correlation coefficients. We have shown that the proposed method reflects the structure and biochemical composition of the major blood proteins.
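A one-component partial least squares (PLS) calibration can be sketched in a few lines; the study used PLS on real Raman band intensities, whereas the synthetic "spectra" and concentrations below are invented to show the mechanics only.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 5))     # 30 synthetic spectra, 5 band intensities
# Hypothetical "concentration" driven by bands 0 and 2 plus noise
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(scale=0.1, size=30)

# One PLS component (NIPALS-style): weight vector along max covariance
w = X.T @ y
w = w / np.linalg.norm(w)
t = X @ w                        # scores
b = (t @ y) / (t @ t)            # inner regression coefficient
y_hat = b * t                    # calibrated prediction

r = np.corrcoef(y, y_hat)[0, 1]
print(round(float(r), 3))        # correlation between measured and predicted
```

Practical calibrations extract several components and validate the correlation coefficient on held-out spectra.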

  11. Apparatus and method for fluid analysis

    DOEpatents

    Wilson, Bary W.; Peters, Timothy J.; Shepard, Chester L.; Reeves, James H.

    2004-11-02

    The present invention is an apparatus and method for analyzing a fluid used in a machine or in an industrial process line. The apparatus has at least one meter placed proximate the machine or process line and in contact with the machine or process fluid for measuring at least one parameter related to the fluid. The at least one parameter is a standard laboratory analysis parameter. The at least one meter includes but is not limited to viscometer, element meter, optical meter, particulate meter, and combinations thereof.

  12. Apparatus And Method For Fluid Analysis

    DOEpatents

    Wilson, Bary W.; Peters, Timothy J.; Shepard, Chester L.; Reeves, James H.

    2003-05-13

    The present invention is an apparatus and method for analyzing a fluid used in a machine or in an industrial process line. The apparatus has at least one meter placed proximate the machine or process line and in contact with the machine or process fluid for measuring at least one parameter related to the fluid. The at least one parameter is a standard laboratory analysis parameter. The at least one meter includes but is not limited to viscometer, element meter, optical meter, particulate meter, and combinations thereof.

  13. On exploratory factor analysis: a review of recent evidence, an assessment of current practice, and recommendations for future use.

    PubMed

    Gaskin, Cadeyrn J; Happell, Brenda

    2014-03-01

    Exploratory factor analysis (hereafter, factor analysis) is a complex statistical method that is integral to many fields of research. Using factor analysis requires researchers to make several decisions, each of which affects the solutions generated. In this paper, we focus on five major decisions that are made in conducting factor analysis: (i) establishing how large the sample needs to be, (ii) choosing between factor analysis and principal components analysis, (iii) determining the number of factors to retain, (iv) selecting a method of data extraction, and (v) deciding upon the methods of factor rotation. The purpose of this paper is threefold: (i) to review the literature with respect to these five decisions, (ii) to assess current practices in nursing research, and (iii) to offer recommendations for future use. The literature reviews illustrate that factor analysis remains a dynamic field of study, with recent research having practical implications for those who use this statistical method. The assessment was conducted on 54 factor analysis (and principal components analysis) solutions presented in the results sections of 28 papers published in the 2012 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. The main findings from the assessment were that researchers commonly used (a) participants-to-items ratios for determining sample sizes (used for 43% of solutions), (b) principal components analysis (61%) rather than factor analysis (39%), (c) the eigenvalues greater than one rule and scree tests to decide upon the numbers of factors/components to retain (61% and 46%, respectively), (d) principal components analysis and unweighted least squares as methods of data extraction (61% and 19%, respectively), and (e) the Varimax method of rotation (44%). In general, well-established, but out-dated, heuristics and practices informed decision making with respect to the performance of factor analysis in nursing studies. Based on
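The "eigenvalues greater than one" (Kaiser) rule criticised above is simple to state: retain as many factors as there are eigenvalues of the correlation matrix exceeding 1. A sketch on a small synthetic correlation matrix (illustrative, not a nursing dataset):

```python
import numpy as np

# Correlation matrix with two clear pairs of correlated items (invented)
R = np.array([[1.0, 0.7, 0.1, 0.1],
              [0.7, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.7],
              [0.1, 0.1, 0.7, 1.0]])

eigvals = np.linalg.eigvalsh(R)          # eigenvalues in ascending order
n_retain = int(np.sum(eigvals > 1.0))    # Kaiser rule: keep eigenvalues > 1
print(n_retain)  # → 2
```

Parallel analysis, which the recent literature favours, replaces the fixed threshold of 1 with eigenvalue quantiles from random data of the same size.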

  14. Mixed-methods research in pharmacy practice: recommendations for quality reporting. Part 2.

    PubMed

    Hadi, Muhammad Abdul; Alldred, David Phillip; Closs, S José; Briggs, Michelle

    2014-02-01

    This is the second of two papers that explore the use of mixed-methods research in pharmacy practice. This paper discusses the rationale, applications, limitations and challenges of conducting mixed-methods research. As with other research methods, the choice of mixed-methods should always be justified because not all research questions require a mixed-methods approach. Mixed-methods research is particularly suitable when one dataset may be inadequate in answering the research question, an explanation of initial results is required, generalizability of qualitative findings is desired or broader and deeper understanding of a research problem is necessary. Mixed-methods research has its own challenges and limitations, which should be considered carefully while designing the study. There is a need to improve the quality of reporting of mixed-methods research. A framework for reporting mixed-methods research is proposed, for researchers and reviewers, with the intention of improving its quality. Pharmacy practice research can benefit from research that uses both 'numbers' (quantitative) and 'words' (qualitative) to develop a strong evidence base to support pharmacy-led services.

  15. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 1 2012-04-01 2012-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in...

  16. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 1 2014-04-01 2014-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in...

  17. A Simplified Mesh Deformation Method Using Commercial Structural Analysis Software

    NASA Technical Reports Server (NTRS)

    Hsu, Su-Yuen; Chang, Chau-Lyan; Samareh, Jamshid

    2004-01-01

    Mesh deformation in response to redefined or moving aerodynamic surface geometries is a frequently encountered task in many applications. Most existing methods are either mathematically too complex or computationally too expensive for use in practical design and optimization. We propose a simplified mesh deformation method based on linear elastic finite element analyses that can be easily implemented by using commercially available structural analysis software. Using a prescribed displacement at the mesh boundaries, a simple structural analysis is constructed based on a spatially varying Young's modulus to move the entire mesh in accordance with the surface geometry redefinitions. A variety of surface movements, such as translation, rotation, or incremental surface reshaping that often takes place in an optimization procedure, may be handled by the present method. We describe the numerical formulation and implementation using the NASTRAN software in this paper. The use of commercial software bypasses tedious reimplementation and takes advantage of the computational efficiency offered by the vendor. A two-dimensional airfoil mesh and a three-dimensional aircraft mesh were used as test cases to demonstrate the effectiveness of the proposed method. Euler and Navier-Stokes calculations were performed for the deformed two-dimensional meshes.
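A one-dimensional toy version conveys the idea of a spatially varying Young's modulus: a line of mesh nodes between a moving surface (left) and a fixed outer boundary (right), with elements stiffened near the surface so they translate almost rigidly. The stiffness law and numbers are illustrative assumptions, not the paper's NASTRAN formulation.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 6)        # initial node positions, surface at x = 0
surface_disp = 0.1                  # prescribed boundary displacement
k = 1.0 / (x[:-1] + 0.1)            # element stiffness: large near the surface

# Springs in series all carry the same force F; total compliance fixes F.
F = surface_disp / np.sum(1.0 / k)
elem_stretch = F / k                # each element's change in length
# Node displacements decay from surface_disp at the wall to 0 at the far end
disp = surface_disp - np.concatenate(([0.0], np.cumsum(elem_stretch)))
print(disp.round(3))
```

Because near-surface elements are stiff, they absorb little of the deformation, preserving boundary-layer cell quality, which is the same effect the spatially varying modulus achieves in the full finite element setting.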

  18. A Mixed Methods Content Analysis of the Research Literature in Science Education

    ERIC Educational Resources Information Center

    Schram, Asta B.

    2014-01-01

    In recent years, more and more researchers in science education have been turning to the practice of combining qualitative and quantitative methods in the same study. This approach of using mixed methods creates possibilities to study the various issues that science educators encounter in more depth. In this content analysis, I evaluated 18…

  19. Stirling Analysis Comparison of Commercial vs. High-Order Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako

    2007-01-01

    Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model, although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.

  20. Stirling Analysis Comparison of Commercial Versus High-Order Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako

    2005-01-01

    Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model, although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.

  1. Text analysis devices, articles of manufacture, and text analysis methods

    DOEpatents

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

    2015-03-31

    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes a display configured to depict visible images, and processing circuitry coupled with the display and wherein the processing circuitry is configured to access a first vector of a text item and which comprises a plurality of components, to access a second vector of the text item and which comprises a plurality of components, to weight the components of the first vector providing a plurality of weighted values, to weight the components of the second vector providing a plurality of weighted values, and to combine the weighted values of the first vector with the weighted values of the second vector to provide a third vector.
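The claimed combination step, weighting the components of two vector representations of a text item and summing them, can be sketched directly; the feature names and weights below are hypothetical illustrations of the claim language, not the patented implementation.

```python
# Weight each component of two vectors for the same text item, then combine
# the weighted values into a third vector (per the claim language above).
def combine(vec_a, vec_b, weights_a, weights_b):
    weighted_a = [w * v for w, v in zip(weights_a, vec_a)]
    weighted_b = [w * v for w, v in zip(weights_b, vec_b)]
    return [a + b for a, b in zip(weighted_a, weighted_b)]

first  = [0.2, 0.7, 0.1]   # e.g. topic weights for the text item (invented)
second = [0.5, 0.1, 0.4]   # e.g. term-frequency features (invented)
third = combine(first, second, [1.0, 1.0, 1.0], [0.5, 0.5, 0.5])
print([round(v, 2) for v in third])  # → [0.45, 0.75, 0.3]
```

The resulting third vector would then drive the visible depiction on the display described in the claims.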

  2. Analysis of nonstandard and home-made explosives and post-blast residues in forensic practice

    NASA Astrophysics Data System (ADS)

    Kotrlý, Marek; Turková, Ivana

    2014-05-01

    Nonstandard and home-made explosives may constitute a considerable threat, as well as a potential material for terrorist activities. Mobile analytical devices, particularly Raman and FTIR spectrometers, are used for the initial detection. Various sorts of phlegmatizers (moderants) to decrease the sensitivity of explosives were tested; some kinds of low-viscosity lubricants yielded very good results. If the character of the substance allows it, phlegmatized samples are taken in an amount of approx. 0.3 g for laboratory analysis. Various methods for separating and concentrating samples from post-blast scenes were also tested. A wide range of methods is used for the laboratory analysis. XRD techniques, capable of direct phase identification of crystalline substances, even in mixtures, have proved highly effective in practice for inorganic and organic phases. SEM-EDS/WDS methods are employed as standard for the inorganic phase. In analysing post-blast residues, techniques allowing analysis at the level of individual particles, rather than the overall composition of a mixed sample, are very important.

  3. Research methods to change clinical practice for patients with rare cancers.

    PubMed

    Billingham, Lucinda; Malottki, Kinga; Steven, Neil

    2016-02-01

    Rare cancers are a growing group as a result of reclassification of common cancers by molecular markers. There is therefore an increasing need to identify methods to assess interventions that are sufficiently robust to potentially affect clinical practice in this setting. Methods advocated for clinical trials in rare diseases are not necessarily applicable in rare cancers. This Series paper describes research methods that are relevant for rare cancers in relation to the range of incidence levels. Strategies that maximise recruitment, minimise sample size, or maximise the usefulness of the evidence could enable the application of conventional clinical trial design to rare cancer populations. Alternative designs that address specific challenges for rare cancers with the aim of potentially changing clinical practice include Bayesian designs, uncontrolled n-of-1 trials, and umbrella and basket trials. Pragmatic solutions must be sought to enable some level of evidence-based health care for patients with rare cancers.

  4. Analysis of newly proposed setpoint methods

    SciTech Connect

    Hines, J. W.; Miller, D. W.; Arndt, S. A.

    2006-07-01

    A new methodology for evaluating the operability of safety-critical instrumentation has been proposed. As in the prior method, a limiting trip setpoint (LSP) is determined to protect the analytical limit by considering uncertainties inherent in the measurement process. Channel operability is assured by periodically performing a channel operability test (COT), which compares the as-found trip point to the previous as-left trip point and evaluates the deviation. Licensees can include an additional conservative margin, which results in a nominal trip setpoint (NSP) rather than the LSP. If the setting tolerance is small compared with the deviation limit, an alternate operability test can be applied that compares the as-found trip point to the LSP (or NSP, as applicable) rather than to the as-left trip setpoint. This method does not provide the actual channel deviation for the operability determination, so a penalty term may be appropriate. This paper provides an analysis of the alternate channel operability test and recommends how to set a penalty term that reduces the non-conservatism of the alternate channel operability test to a pre-defined value, so as to preserve the required confidence level of the uncertainty analysis. (authors)
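The two operability checks contrasted above can be sketched as simple comparisons; the setpoint values, deviation limit, and penalty below are invented for illustration and are not from any plant's setpoint analysis.

```python
# Standard COT: compare as-found to the previous as-left trip point.
def standard_cot(as_found, as_left, deviation_limit):
    return abs(as_found - as_left) <= deviation_limit

# Alternate COT: compare as-found to the LSP instead. Because the actual
# channel deviation is not observed, the allowed band is tightened by a
# penalty term (a hypothetical way to apply the paper's recommendation).
def alternate_cot(as_found, lsp, deviation_limit, penalty):
    return abs(as_found - lsp) <= deviation_limit - penalty

print(standard_cot(101.8, 102.0, 0.5),
      alternate_cot(101.8, 102.0, 0.5, 0.2))  # → True True
```

The size of the penalty is what the paper proposes to calibrate so the alternate test's non-conservatism stays below a pre-defined value.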

  5. Quantitative mass spectrometry methods for pharmaceutical analysis.

    PubMed

    Loos, Glenn; Van Schepdael, Ann; Cabooter, Deirdre

    2016-10-28

    Quantitative pharmaceutical analysis is now frequently performed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhance ionization efficiency using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With a focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods are generally pursued to provide automated multi-residue analysis, (less sensitive) miniaturized set-ups have great potential owing to their suitability for in-field use. This article is part of the themed issue 'Quantitative mass spectrometry'.
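The isotope-labelled internal standard correction mentioned above amounts to quantifying against the analyte/standard peak-area ratio rather than the raw analyte signal. A minimal sketch with invented calibration numbers, assuming the ratio-vs-concentration response is linear through the origin:

```python
def response_ratio(analyte_area, is_area):
    """Analyte peak area normalized to the isotope-labelled internal standard (IS)."""
    return analyte_area / is_area

def calibrate(standards):
    """Least-squares fit of ratio = slope * concentration through the origin.
    standards: list of (known concentration, measured area ratio) pairs."""
    num = sum(c * r for c, r in standards)
    den = sum(c * c for c, _ in standards)
    return num / den

def quantify(analyte_area, is_area, slope):
    """Back-calculate concentration from a sample's measured areas."""
    return response_ratio(analyte_area, is_area) / slope

# Invented calibration standards: (concentration in ng/mL, area ratio)
standards = [(1.0, 0.11), (5.0, 0.52), (10.0, 1.05)]
slope = calibrate(standards)
conc = quantify(analyte_area=42000, is_area=80000, slope=slope)  # ~5 ng/mL
```

Because the labelled standard co-elutes with the analyte, any ion suppression scales both areas similarly and largely cancels in the ratio.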

  6. Physical activity assessment in practice: a mixed methods study of GPPAQ use in primary care

    PubMed Central

    2014-01-01

    Background Insufficient physical activity (PA) levels, which increase the risk of chronic disease, are reported by almost two-thirds of the population. More evidence is needed about how PA promotion can be effectively implemented in general practice (GP), particularly in socio-economically disadvantaged communities. One tool recommended for the assessment of PA in GP, and supported by NICE (National Institute for Health and Care Excellence), is the General Practice Physical Activity Questionnaire (GPPAQ), but details of how it may be used and of its acceptability to practitioners and patients are limited. This study aims to examine aspects of GPPAQ administration in non-urgent patient contacts using different primary care electronic recording systems and to explore the views of health professionals regarding its use. Methods Four general practices, selected because of their location within socio-economically disadvantaged areas, were invited to administer GPPAQs to patients aged 35-75 years attending non-urgent consultations over two-week periods. They used different methods of administration and different electronic medical record systems (EMIS, Premiere, Vision). Participants’ (general practitioners (GPs), nurses and receptionists) views regarding GPPAQ use were explored via questionnaires and focus groups. Results Of 2,154 eligible consultations, 192 (8.9%) completed GPPAQs; of these, 83 (43%) were categorised as inactive. All practices were located within areas ranked in the tertile of greatest socio-economic deprivation in Northern Ireland. GPs/nurses invited completion of the GPPAQ in two practices; receptionists did so in the other two. One practice used an electronic template; three used paper copies of the questionnaires. End-of-study questionnaires (completed by 11 GPs, 3 nurses and 2 receptionists) and two focus groups (GPs, n = 8; nurses, n = 4) indicated that practitioners considered the GPPAQ easy to use but not in every consultation

  7. Green analytical method development for statin analysis.

    PubMed

    Assassi, Amira Louiza; Roy, Claude-Eric; Perovitch, Philippe; Auzerie, Jack; Hamon, Tiphaine; Gaudin, Karen

    2015-02-01

    A green analytical chemistry method was developed for pravastatin, fluvastatin and atorvastatin analysis. An HPLC/DAD method using an ethanol-based mobile phase was studied with octadecyl-grafted silica of various grafting and related column parameters such as particle size, core-shell and monolith formats. Retention, efficiency and detector linearity were optimized. Even for columns with particle sizes under 2 μm, the benefit of maintaining efficiency over a large range of flow rates was not obtained with the ethanol-based mobile phase compared to an acetonitrile-based one. Therefore, the strategy of shortening analysis by increasing the flow rate decreased efficiency with the ethanol-based mobile phase. An ODS-AQ YMC column, 50 mm × 4.6 mm, 3 μm, was selected as the best compromise between analysis time, statin separation, and efficiency. HPLC conditions were 1 mL/min, ethanol/formic acid (pH 2.5, 25 mM) (50:50, v/v), thermostated at 40°C. To reduce solvent consumption for sample preparation, 0.5 mg/mL of each statin was found to be the highest concentration that respected detector linearity. These conditions were validated for each statin for content determination in highly concentrated hydro-alcoholic solutions. Solubility higher than 100 mg/mL was found for pravastatin and fluvastatin, whereas for atorvastatin calcium salt the maximum concentration was 2 mg/mL for hydro-alcoholic binary mixtures between 35% and 55% ethanol in water. Using atorvastatin instead of its calcium salt improved solubility. Highly concentrated solutions of statins offer a potential fluid for per Buccal Per-Mucous(®) administration, with the advantages of rapid and easy passage of drugs. PMID:25582487

  8. Chapter 11. Community analysis-based methods

    SciTech Connect

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  9. A concise method for mine soils analysis

    SciTech Connect

    Winkler, S.; Wildeman, T.; Robinson, R.; Herron, J.

    1999-07-01

    A large number of abandoned hard rock mines exist in Colorado and other mountain west states, many on public property. Public pressure and resulting policy changes have become a driving force in the reclamation of these sites. Two of the key reclamation issues for these sites are the occurrence of acid forming materials (AFMs) in mine soils, and acid mine drainage (AMD) issuing from mine adits. An AMD treatment system design project for the Forest Queen mine in Colorado's San Juan mountains raised the need for a simple, usable method for analysis of mine land soils, both for suitability as a construction material, and to determine the AFM content and potential for acid release. The authors have developed a simple, stepwise, go/no-go test for the analysis of mine soils. Samples were collected from a variety of sites in the Silverton, CO area, and subjected to three tiers of tests including: paste pH, Eh, and 10% HCl fizz test; then total digestion in HNO{sub 3}/HCl, neutralization potential, exposure to meteoric water, and toxicity characteristic leaching procedure (TCLP). All elemental analyses were performed with an inductively-coupled plasma (ICP) spectrometer. Elimination of samples via the first two testing tiers left two remaining samples, which were subsequently subjected to column and sequential batch tests, with further elemental analysis by ICP. Based on these tests, one sample was chosen for suitability as a construction material for the Forest Queen treatment system basins. Further simplification, and testing on two pairs of independent soil samples, has resulted in a final analytical method suitable for general use.
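A tiered go/no-go screen like the one described can be sketched as successive filters, where each tier eliminates samples before the more expensive tests run. The field names and thresholds below are illustrative assumptions, not the published criteria.

```python
def tier1_pass(sample):
    """Tier 1 (cheap bench tests): screen out strongly acidic or fizzing soils."""
    return sample["paste_pH"] >= 5.0 and not sample["fizz"]

def tier2_pass(sample):
    """Tier 2: require a safety margin of neutralization potential (NP) over
    acid-generating potential (AP), in t CaCO3 per kt (illustrative units)."""
    return sample["np"] - sample["ap"] >= 20.0

def screen(samples):
    """Apply the tiers in order; only survivors of tier 1 reach tier 2."""
    survivors = [s for s in samples if tier1_pass(s)]
    return [s for s in survivors if tier2_pass(s)]

samples = [
    {"id": "A", "paste_pH": 3.9, "fizz": False, "np": 40.0, "ap": 10.0},  # fails tier 1 (acidic)
    {"id": "B", "paste_pH": 6.5, "fizz": False, "np": 55.0, "ap": 12.0},  # passes both
    {"id": "C", "paste_pH": 7.1, "fizz": True,  "np": 90.0, "ap": 5.0},   # fails tier 1 (fizz)
]
usable = screen(samples)  # only "B" survives for column/batch testing
```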

  10. Analysis of methods. [information systems evolution environment

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment, a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of model development, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose of the analysis of the modeling methods included in this document is to examine these methods with the goal of including them in an integrated development support environment. To accomplish this, and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity

  11. Meaning and challenges in the practice of multiple therapeutic massage modalities: a combined methods study

    PubMed Central

    2011-01-01

    Background Therapeutic massage and bodywork (TMB) practitioners are predominantly trained in programs that are not uniformly standardized, and in variable combinations of therapies. To date, no studies have explored this variability in training and how it affects clinical practice. Methods Combined methods, consisting of a quantitative, population-based survey and qualitative interviews with practitioners trained in multiple therapies, were used to explore the training and practice of TMB practitioners in Alberta, Canada. Results Of the 5242 distributed surveys, 791 were returned (15.1%). Practitioners were predominantly female (91.7%), worked in a range of environments, primarily private (44.4%) and home clinics (35.4%), and were not significantly different from other surveyed massage therapist populations. Seventy-seven distinct TMB therapies were identified. Most practitioners were trained in two or more therapies (94.4%), with a median of 8 and a range of 40 therapies. Training programs varied widely in the number and type of TMB components, in training length, or both. Nineteen interviews were conducted. Participants described highly variable training backgrounds, resulting in practitioners learning unique combinations of therapy techniques. All practitioners reported providing individualized patient treatment based on a responsive feedback process throughout practice, which they described as critical to appropriately addressing the needs of patients. They also felt that research treatment protocols differ from clinical practice because researchers do not usually sufficiently acknowledge the individualized nature of TMB care provision. Conclusions The training received, the number of therapies trained in, and the practice descriptors of TMB practitioners are all highly variable. In addition, clinical experience and continuing education may further alter or enhance treatment techniques. Practitioners individualize each patient's treatment through a highly

  12. Statistical methods for the forensic analysis of striated tool marks

    SciTech Connect

    Hoeksema, Amy Beth

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts, which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research into more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and to allow angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
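The proposed comparison can be illustrated with a generic likelihood ratio test on two samples of similarity scores, asking whether field-to-lab comparisons look like lab-to-lab comparisons. The normal model and the scores below are assumptions for illustration, not the authors' actual statistic.

```python
import math
import statistics

def loglik(xs, mu, sigma):
    """Log-likelihood of data xs under a Normal(mu, sigma) model."""
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (x - mu) ** 2 / (2 * sigma ** 2) for x in xs)

def lrt_statistic(field_vs_lab, lab_vs_lab):
    """-2 log LR for H0: both score samples share one normal distribution,
    vs H1: each sample has its own mean and variance. Large values suggest
    the field mark does not behave like a lab mark from the same tool."""
    pooled = field_vs_lab + lab_vs_lab
    l0 = loglik(pooled, statistics.fmean(pooled), statistics.pstdev(pooled) or 1e-12)
    l1 = (loglik(field_vs_lab, statistics.fmean(field_vs_lab),
                 statistics.pstdev(field_vs_lab) or 1e-12)
          + loglik(lab_vs_lab, statistics.fmean(lab_vs_lab),
                   statistics.pstdev(lab_vs_lab) or 1e-12))
    return -2.0 * (l0 - l1)

# Invented similarity scores in [0, 1]
match_stat = lrt_statistic([0.82, 0.79, 0.85], [0.84, 0.80, 0.83])       # small
mismatch_stat = lrt_statistic([0.40, 0.45, 0.38], [0.84, 0.80, 0.83])    # large
```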

  13. Practical hyperdynamics method for systems with large changes in potential energy

    NASA Astrophysics Data System (ADS)

    Hirai, Hirotoshi

    2014-12-01

    A practical hyperdynamics method is proposed to accelerate systems with highly endothermic and exothermic reactions such as hydrocarbon pyrolysis and oxidation reactions. In this method, referred to as the "adaptive hyperdynamics (AHD) method," the bias potential parameters are adaptively updated according to the change in potential energy. The approach is intensively examined for JP-10 (exo-tetrahydrodicyclopentadiene) pyrolysis simulations using the ReaxFF reactive force field. Valid boost parameter ranges are clarified as a result. It is shown that AHD can be used to model pyrolysis at temperatures as low as 1000 K while achieving a boost factor of around 10^5.
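In standard hyperdynamics, the time boost is the trajectory average of exp(ΔV/kT), where ΔV is the bias potential evaluated at the sampled configurations; this is what a boost factor of around 10^5 refers to. The sketch below shows that average plus a toy stand-in for an adaptive bias update; the update rule and all numbers are invented for illustration and are not the AHD scheme itself.

```python
import math

K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def boost_factor(bias_energies_eV, temperature_K):
    """Hyperdynamics time boost: <exp(dV/kT)> averaged over biased-MD samples."""
    beta = 1.0 / (K_B_EV * temperature_K)
    return sum(math.exp(beta * dv) for dv in bias_energies_eV) / len(bias_energies_eV)

def adapt_bias_height(height, pe_change, threshold, factor=0.5):
    """Toy adaptive rule (illustrative only): shrink the bias when the potential
    energy changes sharply, e.g. across a strongly endo/exothermic reaction,
    so the bias stays within a valid range; otherwise keep it unchanged."""
    return height * factor if abs(pe_change) > threshold else height

samples = [1.0] * 100                 # a flat 1.0 eV bias, sampled 100 times
bf = boost_factor(samples, 1000.0)    # on the order of 1e5 at 1000 K
```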

  14. Kaizen practice in healthcare: a qualitative analysis of hospital employees' suggestions for improvement

    PubMed Central

    Mazzocato, Pamela; Stenfors-Hayes, Terese; von Thiele Schwarz, Ulrica; Hasson, Henna

    2016-01-01

    Objectives Kaizen, or continuous improvement, lies at the core of lean. Kaizen is implemented through practices that enable employees to propose ideas for improvement and solve problems. The aim of this study is to describe the types of issues and improvement suggestions that hospital employees feel empowered to address through kaizen practices in order to understand when and how kaizen is used in healthcare. Methods We analysed 186 structured kaizen documents containing improvement suggestions that were produced by 165 employees at a Swedish hospital. Directed content analysis was used to categorise the suggestions into the following categories: type of situation (proactive or reactive) triggering an action; type of process addressed (technical/administrative, support and clinical); complexity level (simple or complex); and type of outcomes aimed for (operational or sociotechnical). Compliance to the kaizen template was calculated. Results 72% of the improvement suggestions were reactions to a perceived problem. Support, technical and administrative, and primary clinical processes were involved in 47%, 38% and 16% of the suggestions, respectively. The majority of the kaizen documents addressed simple situations and focused on operational outcomes. The degree of compliance to the kaizen template was high for several items concerning the identification of problems and the proposed solutions, and low for items related to the test and implementation of solutions. Conclusions There is a need to combine kaizen practices with improvement and innovation practices that help staff and managers to address complex issues, such as the improvement of clinical care processes. The limited focus on sociotechnical aspects and the partial compliance to kaizen templates may indicate a limited understanding of the entire kaizen process and of how it relates to the overall organisational goals. This in turn can hamper the sustainability of kaizen practices and results. PMID:27473953

  15. The practice patterns of second trimester fetal ultrasonography: A questionnaire survey and an analysis of checklists

    PubMed Central

    Park, Hyun Soo; Hong, Joon Seok; Seol, Hyun-Joo; Hwang, Han Sung; Kim, Kunwoo; Ko, Hyun Sun; Kwak, Dong-Wook; Oh, Soo-young; Kim, Moon Young; Kim, Sa Jin

    2015-01-01

    Objective To analyze practice patterns and checklists of second trimester ultrasonography, and to investigate management plans when soft markers are detected, among Korean Society of Ultrasound in Obstetrics and Gynecology (KSUOG) members. Methods An internet-based self-administered questionnaire survey was designed, and KSUOG members were invited to participate. Checklists for second trimester ultrasonography were also requested. The questionnaire survey asked about general practice patterns of second trimester ultrasonography and management schemes for soft markers. In the checklist analysis, the number of items was counted and compared with those recommended by other medical societies. Results A total of 101 members responded. Eighty-seven percent routinely recommended second trimester fetal anatomic surveillance. Most (91.1%) performed it between 20+0 and 23+6 weeks of gestation. Written informed consent was obtained by 15.8% of respondents. Nearly 60% recommended genetic counseling when multiple soft markers and/or advanced maternal age were found. Similar tendencies were found in the management of individual soft markers; however, practice patterns were very diverse and sometimes conflicting. Forty-eight checklists were analyzed for the number and content of their items. The median item number was 46.5 (range, 17 to 109). Of 49 checklist items recommended by the International Society of Ultrasound in Obstetrics and Gynecology and/or the American Congress of Obstetricians and Gynecologists, 14 items (28.6%) were found in less than 50% of the checklists analyzed in this study. Conclusion Although general practice patterns were broadly similar among KSUOG members, some were conflicting, indicating a need to standardize the practice patterns and checklists of second trimester ultrasonography, which currently span a very wide spectrum. PMID:26623407

  16. International Commercial Remote Sensing Practices and Policies: A Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Stryker, Timothy

    In recent years, there has been much discussion about U.S. commercial remote sensing policies and how effectively they address U.S. national security, foreign policy, commercial, and public interests. This paper will provide an overview of U.S. commercial remote sensing laws, regulations, and policies, and describe recent NOAA initiatives. It will also address related foreign practices, and the overall legal context for trade and investment in this critical industry. Licensing and Regulation: The 1992 Land Remote Sensing Policy Act ("the Act"), and the 1994 policy on Foreign Access to Remote Sensing Space Capabilities (known as Presidential Decision Directive-23, or PDD-23) put into place an ambitious legal and policy framework for the U.S. Government's licensing of privately-owned, high-resolution satellite systems. Under the Act, the Secretary of Commerce licenses the operations of private U.S. remote sensing satellite systems, in consultation with the Secretaries of Defense, State, and Interior. PDD-23 provided further details concerning the operation of advanced systems, as well as criteria for the export of turnkey systems and/or components. In July 2000, pursuant to the authority delegated to it by the Secretary of Commerce, NOAA issued new regulations for the industry. Among other conditions, licensees must operate their systems in a manner that preserves national security and observes the international obligations of the United States; maintain positive control of spacecraft operations; maintain a tasking record in conjunction with other record-keeping requirements; provide U.S. Government access to and use of data when required for national security or foreign policy purposes; provide for U.S. Government review of all significant foreign agreements; obtain U.S. Government approval for any encryption devices used; make available unenhanced data to a "sensed state" as soon as such data are available and on reasonable cost terms and conditions; make available unenhanced data as requested

  17. Analysis Method for Quantifying Vehicle Design Goals

    NASA Technical Reports Server (NTRS)

    Fimognari, Peter; Eskridge, Richard; Martin, Adam; Lee, Michael

    2007-01-01

    A document discusses a method for using Design Structure Matrices (DSM), coupled with high-level tools representing important life-cycle parameters, to comprehensively conceptualize a flight/ground space transportation system design by dealing with such variables as performance, up-front costs, downstream operations costs, and reliability. This approach also weighs operational approaches based on their effect on upstream design variables, so that it is possible to readily, yet defensibly, establish linkages between operations and these upstream variables. To avoid the large range of problems that have defeated previous methods of dealing with the complex problems of transportation design, and to cut down on the inefficient use of resources, the method described in the document identifies those areas that are of sufficient promise and provides a higher grade of analysis for those issues, as well as for the linkages between operations and other factors. Ultimately, the system is designed to save resources and time, and allows for the evolution of operable space transportation system technology, design, and conceptual system approach targets.
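A DSM can be sketched as a dependency matrix in which entries above the diagonal (a task depending on one scheduled later) are feedback couplings, and sequencing methods try to order tasks to minimize them. The tasks and matrix below are invented for illustration; this is not the document's actual model.

```python
def feedback_count(dsm, order):
    """Count feedback couplings for a given task ordering.
    dsm[i][j] = 1 means task i depends on the output of task j; a coupling is
    feedback when task i is scheduled before the task j it depends on."""
    pos = {task: k for k, task in enumerate(order)}
    n = len(dsm)
    return sum(1 for i in range(n) for j in range(n)
               if dsm[i][j] and pos[i] < pos[j])

# Illustrative tasks: 0=requirements, 1=structure, 2=propulsion, 3=operations
dsm = [[0, 0, 0, 1],   # requirements depends on operations feedback
       [1, 0, 1, 0],   # structure depends on requirements and propulsion
       [1, 0, 0, 0],   # propulsion depends on requirements
       [1, 1, 1, 0]]   # operations depends on everything upstream

baseline = feedback_count(dsm, [0, 1, 2, 3])   # 2 feedback couplings
resequenced = feedback_count(dsm, [0, 2, 1, 3])  # 1 feedback coupling
```

Reordering propulsion before structure removes one feedback loop, which is the kind of operations-to-design linkage the method is meant to surface early.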

  18. A cross-sectional mixed methods study protocol to generate learning from patient safety incidents reported from general practice

    PubMed Central

    Carson-Stevens, Andrew; Hibbert, Peter; Avery, Anthony; Butlin, Amy; Carter, Ben; Cooper, Alison; Evans, Huw Prosser; Gibson, Russell; Luff, Donna; Makeham, Meredith; McEnhill, Paul; Panesar, Sukhmeet S; Parry, Gareth; Rees, Philippa; Shiels, Emma; Sheikh, Aziz; Ward, Hope Olivia; Williams, Huw; Wood, Fiona; Donaldson, Liam; Edwards, Adrian

    2015-01-01

    Introduction Incident reports contain descriptions of errors and harms that occurred during clinical care delivery. Few observational studies have characterised incidents from general practice, and none of these have been from the England and Wales National Reporting and Learning System. This study aims to describe incidents reported from a general practice care setting. Methods and analysis A general practice patient safety incident classification will be developed to characterise patient safety incidents. A weighted-random sample of 12 500 incidents describing no harm, low harm and moderate harm of patients, and all incidents describing severe harm and death of patients, will be classified. Insights from exploratory descriptive statistics and thematic analysis will be combined to identify priority areas for future interventions. Ethics and dissemination The need for ethical approval was waived by the Aneurin Bevan University Health Board research risk review committee given the anonymised nature of the data (ABHB R&D Ref number: SA/410/13). The authors will submit the results of the study to relevant journals and undertake national and international oral presentations to researchers, clinicians and policymakers. PMID:26628526

  19. Influence of Analysis Methods on Interpretation of Hazard Maps

    PubMed Central

    Koehler, Kirsten A.

    2013-01-01

    Exposure or hazard mapping is becoming increasingly popular among industrial hygienists. Direct-reading instruments used for hazard mapping data collection are steadily increasing in reliability and portability while decreasing in cost. Exposure measurements made with these instruments generally require no laboratory analysis, although hazard mapping can be a time-consuming process. To inform decision making by industrial hygienists and management, it is crucial that the maps generated from mapping data are as accurate and representative as possible. Currently, it is unclear how many sampling locations are necessary to produce a representative hazard map. As such, researchers typically collect as many points as can be sampled in several hours, and interpolation methods are used to produce higher resolution maps. We have reanalyzed hazard-mapping data sets from three industrial settings to determine which interpolation methods yield the most accurate results. The goal is to provide practicing industrial hygienists with practical guidelines for generating accurate hazard maps with ‘off-the-shelf’ mapping software. Visually verifying the fit of the variogram model is crucial for accurate interpolation. Exponential and spherical variogram models performed better than Gaussian models. For some data sets it was also necessary to diverge from default interpolation parameters, such as the number of bins used for the experimental variogram and whether or not to allow for a nugget effect, to achieve reasonable interpolation accuracy. PMID:23258453
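The exponential and spherical variogram models that performed well in the study have standard closed forms describing how semivariance grows with lag distance. A minimal sketch, with illustrative sill, range, and nugget values (one common parameterization; software packages differ in details such as the "practical range" factor):

```python
import math

def exponential_variogram(h, sill, rng, nugget=0.0):
    """Exponential model: approaches the sill asymptotically; the factor of 3
    makes `rng` the practical range (~95% of the sill)."""
    return nugget + sill * (1.0 - math.exp(-3.0 * h / rng))

def spherical_variogram(h, sill, rng, nugget=0.0):
    """Spherical model: reaches the sill exactly at the range `rng`."""
    if h >= rng:
        return nugget + sill
    x = h / rng
    return nugget + sill * (1.5 * x - 0.5 * x ** 3)

# Semivariance rises monotonically toward the sill as lag distance grows:
gammas = [exponential_variogram(h, sill=1.0, rng=10.0) for h in (1, 5, 10, 50)]
```

A kriging package fits such a model to the experimental variogram; visually checking that fit, as the study stresses, catches poor bin counts or a missing nugget term.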

  20. Multi-Spacecraft Turbulence Analysis Methods

    NASA Astrophysics Data System (ADS)

    Horbury, Tim S.; Osman, Kareem T.

    Turbulence is ubiquitous in space plasmas, from the solar wind to supernova remnants, and on scales from the electron gyroradius to interstellar separations. Turbulence is responsible for transporting energy across space and between scales and plays a key role in plasma heating, particle acceleration and thermalisation downstream of shocks. Just as with other plasma processes such as shocks or reconnection, turbulence results in complex, structured and time-varying behaviour which is hard to measure with a single spacecraft. However, turbulence is a particularly hard phenomenon to study because it is usually broadband in nature: it covers many scales simultaneously. One must therefore use techniques to extract information on multiple scales in order to quantify plasma turbulence and its effects. The Cluster orbit takes the spacecraft through turbulent regions with a range of characteristics: the solar wind, magnetosheath, cusp and magnetosphere. In each, the nature of the turbulence (strongly driven or fully evolved; dominated by kinetic effects or largely on fluid scales), as well as characteristics of the medium (thermalised or not; high or low plasma beta; sub- or super-Alfvenic) mean that particular techniques are better suited to the analysis of Cluster data in different locations. In this chapter, we consider a range of methods and how they are best applied to these different regions. Perhaps the most studied turbulent space plasma environment is the solar wind, see Bruno and Carbone [2005]; Goldstein et al. [2005] for recent reviews. This is the case for a number of reasons: it is scientifically important for cosmic ray and solar energetic particle scattering and propagation, for example.
However, perhaps the most significant motivations for studying solar wind turbulence are pragmatic: large volumes of high quality measurements are available; the stability of the solar wind on the scales of hours makes it possible to identify statistically stationary intervals to

  1. Comparing the Effect of Concept Mapping and Conventional Methods on Nursing Students’ Practical Skill Score

    PubMed Central

    Rasoul Zadeh, Nasrin; Sadeghi Gandomani, Hamidreza; Delaram, Masoumeh; Parsa Yekta, Zohre

    2015-01-01

    Background: The development of practical skills remains a serious and considerable challenge in nursing education. Moreover, newly graduated nurses may have weak practical skills, which can be a threat to patients’ safety. Objectives: The present study was conducted to compare the effect of concept mapping and conventional methods on nursing students’ practical skills. Patients and Methods: This quasi-experimental study was conducted on 70 nursing students randomly assigned into two groups of 35 people. The intervention group was taught through the concept mapping method, while the control group was taught using the conventional method. A two-part instrument was used, including a demographic information form and a checklist for direct observation of procedural skills. Descriptive statistics, chi-square, independent samples t-tests and paired t-tests were used to analyze the data. Results: Before education, no significant differences were observed between the two groups in the three skills of cleaning (P = 0.251), injection (P = 0.185) and sterilizing (P = 0.568). The students’ mean scores increased significantly after the education, and the differences between pre- and post-intervention mean scores were significant in both groups (P < 0.001). Moreover, after education, the mean scores of the intervention group were significantly higher than those of the control group in all three skills (P < 0.001). Conclusions: Concept mapping was superior to conventional skill-teaching methods. It is suggested to use concept mapping in teaching practical courses such as fundamentals of nursing. PMID:26576441

  2. The development of a 3D risk analysis method.

    PubMed

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to its computational complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on flat terrain with no obstructions or concentration fluctuations, which is quite different from the real situation of a chemical process plant. These models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve the previous limitations, but it cannot resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed in which the results of a series of CFD simulations are combined with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not be limited to risk analysis at ground level, but can also be extended to aerial, submarine, or space risk analyses in the near future.
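Conceptually, a 3D individual-risk field sums, over accident scenarios, each scenario's annual frequency times its CFD-derived fatality probability at every grid point; iso-surfaces are then contoured from that field. The toy sketch below uses invented numbers and a dictionary in place of a real CFD voxel grid.

```python
def individual_risk(scenarios):
    """Combine per-scenario consequence fields into an individual-risk field.
    scenarios: list of (frequency_per_year, {voxel: fatality_probability}).
    Returns {voxel: annual probability of fatality} (illustrative model)."""
    risk = {}
    for freq, p_fatality in scenarios:
        for voxel, p in p_fatality.items():
            risk[voxel] = risk.get(voxel, 0.0) + freq * p
    return risk

scenarios = [
    (1e-4, {(0, 0, 0): 0.9, (1, 0, 0): 0.3}),  # large release: rare, severe
    (1e-2, {(0, 0, 0): 0.05}),                 # small release: frequent, mild
]
risk = individual_risk(scenarios)
# risk[(0, 0, 0)] = 1e-4 * 0.9 + 1e-2 * 0.05 = 5.9e-4 per year
```

An iso-surface at, say, 1e-6 per year would then be extracted from such a field with a marching-cubes style contouring step.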

  3. /sup 252/Cf-source-driven neutron noise analysis method

    SciTech Connect

    Mihalczo, J.T.; King, W.T.; Blakeman, E.D.

    1985-01-01

    The /sup 252/Cf-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor k/sub eff/ has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank which is typical of a fuel processing or reprocessing plant, the k/sub eff/ values were satisfactorily determined for values between 0.92 and 0.5 using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. The further development of the method will require experiments oriented toward particular applications, including dynamic experiments and the development of theoretical methods to predict the experimental observables.

  4. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  5. Searching Usenet for Virtual Communities of Practice: Using Mixed Methods to Identify the Constructs of Wenger's Theory

    ERIC Educational Resources Information Center

    Murillo, Enrique

    2008-01-01

    Introduction: This research set out to determine whether communities of practice can be entirely Internet-based by formally applying Wenger's theoretical framework to Internet collectives. Method: A model of a virtual community of practice was developed which included the constructs Wenger identified in co-located communities of practice: mutual…

  6. Honesty in Critically Reflective Essays: An Analysis of Student Practice

    ERIC Educational Resources Information Center

    Maloney, Stephen; Tai, Joanna Hong-Meng; Lo, Kristin; Molloy, Elizabeth; Ilic, Dragan

    2013-01-01

    In health professional education, reflective practice is seen as a potential means for self-improvement from everyday clinical encounters. This study aims to examine the level of student honesty in critical reflection, and barriers and facilitators for students engaging in honest reflection. Third year physiotherapy students, completing summative…

  7. Incorporating Computer-Aided Language Sample Analysis into Clinical Practice

    ERIC Educational Resources Information Center

    Price, Lisa Hammett; Hendricks, Sean; Cook, Colleen

    2010-01-01

    Purpose: During the evaluation of language abilities, the needs of the child are best served when multiple types and sources of data are included in the evaluation process. Current educational policies and practice guidelines further dictate the use of authentic assessment data to inform diagnosis and treatment planning. Language sampling and…

  8. Professional Learning in Rural Practice: A Sociomaterial Analysis

    ERIC Educational Resources Information Center

    Slade, Bonnie

    2013-01-01

    Purpose: This paper aims to examine the professional learning of rural police officers. Design/methodology/approach: This qualitative case study involved interviews and focus groups with 34 police officers in Northern Scotland. The interviews and focus groups were transcribed and analysed, drawing on practice-based and sociomaterial learning…

  9. An Analysis of Inservice Education Practices for Hospital Laboratory Personnel.

    ERIC Educational Resources Information Center

    Bonke, Barbara A.; And Others

    1988-01-01

    A study looked at inservice practices in clinical laboratories and at managers' perceptions of the impact and cost effectiveness of those activities. Findings indicate that most do not have an inservice budget and that new employee orientation, policy and procedure discussion, and instrumentation instruction are most effective. (JOW)

  10. An Analysis of Teacher Practices with Toddlers during Social Conflicts

    ERIC Educational Resources Information Center

    Gloeckler, Lissy R.; Cassell, Jennifer M.; Malkus, Amy J.

    2014-01-01

    Employing a quasi-experimental design, this pilot study on teacher practices with toddlers during social conflicts was conducted in the southeastern USA. Four child-care classrooms, teachers (n = 8) and children (n = 51) were assessed with the Classroom Assessment Scoring System -- Toddler [CLASS-Toddler; La Paro, K., Hamre, B. K., & Pianta,…

  11. Practical guidelines to select and scale earthquake records for nonlinear response history analysis of structures

    USGS Publications Warehouse

    Kalkan, Erol; Chopra, Anil K.

    2010-01-01

    Earthquake engineering practice is increasingly using nonlinear response history analysis (RHA) to demonstrate performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. Presented herein is a modal-pushover-based scaling (MPS) method to scale ground motions for use in nonlinear RHA of buildings and bridges. In the MPS method, the ground motions are scaled to match (to a specified tolerance) a target value of the inelastic deformation of the first-'mode' inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-'mode' pushover analysis. Appropriate for first-'mode' dominated structures, this approach is extended for structures with significant contributions of higher modes by considering elastic deformation of the second-'mode' SDF system in selecting a subset of the scaled ground motions. Based on results presented for two bridges, covering single- and multi-span 'ordinary standard' bridge types, and six buildings, covering low-, mid-, and tall building types in California, the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
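
    The core operation, finding a scale factor so a record drives a first-'mode' inelastic SDF system to a target deformation, can be sketched with a toy oscillator and a bisection search. Everything here (the elastic-perfectly-plastic model, the symplectic-Euler integrator, the synthetic pulse) is an illustrative assumption, not the MPS procedure's actual implementation:

```python
import numpy as np

def epp_sdf_peak(ag, dt, T=1.0, zeta=0.05, uy=0.02):
    """Peak deformation of a unit-mass elastic-perfectly-plastic SDF
    oscillator under ground acceleration ag, via symplectic Euler."""
    w = 2 * np.pi / T
    k, c = w * w, 2 * zeta * w
    u = v = fs = peak = 0.0
    for a_g in ag:
        a = -a_g - c * v - fs                    # unit-mass equation of motion
        v += a * dt
        du = v * dt
        u += du
        fs = min(max(fs + k * du, -k * uy), k * uy)  # EPP hysteresis
        peak = max(peak, abs(u))
    return peak

def mps_scale(ag, dt, target, lo=0.1, hi=10.0, tol=1e-4):
    """Bisection for the scale factor whose scaled record drives the
    inelastic SDF system to the target peak deformation."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if epp_sdf_peak(mid * ag, dt) < target:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Synthetic pulse; asking for the deformation the doubled record produces
# should recover a scale factor near 2.
dt = 0.005
t = np.arange(0.0, 2.0, dt)
ag = 0.5 * np.sin(2 * np.pi * 2.0 * t)
target = epp_sdf_peak(2.0 * ag, dt)
scale = mps_scale(ag, dt, target)
```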

  12. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
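
    Identifying "high risk attack paths" reduces to shortest-path search over a weighted directed graph of attack states. A sketch with Dijkstra's algorithm (the states and attacker-effort weights are hypothetical; the patent's "epsilon optimal paths" would additionally keep every path within epsilon of this optimum):

```python
import heapq

def dijkstra(graph, start, goal):
    """Minimum-total-effort attack path in a weighted attack graph.
    graph: {node: [(neighbor, effort), ...]} with non-negative efforts."""
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float('inf')):
            continue                      # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    return [start] + path[::-1], dist[goal]

# Hypothetical attack states with attacker-effort edge weights.
attack_graph = {
    'outside':    [('dmz_shell', 3.0), ('phish_user', 1.0)],
    'phish_user': [('user_shell', 1.0)],
    'dmz_shell':  [('db_admin', 4.0)],
    'user_shell': [('db_admin', 2.0)],
}
path, effort = dijkstra(attack_graph, 'outside', 'db_admin')
```

    Here the phishing route (total effort 4.0) beats the DMZ route (7.0), so it is the path a defender would target with countermeasures first.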

  13. Accuracy Analysis of the PIC Method

    NASA Astrophysics Data System (ADS)

    Verboncoeur, J. P.; Cartwright, K. L.

    2000-10-01

    The discretization errors for many steps of the classical Particle-in-Cell (PIC) model have been well-studied (C. K. Birdsall and A. B. Langdon, Plasma Physics via Computer Simulation, McGraw-Hill, New York, NY (1985).) (R. W. Hockney and J. W. Eastwood, Computer Simulation Using Particles, McGraw-Hill, New York, NY (1981).). In this work, the errors in the interpolation algorithms, which provide the connection between continuum particles and discrete fields, are described in greater detail. In addition, the coupling of errors between steps in the method is derived. The analysis is carried out for both electrostatic and electromagnetic PIC models, and the results are demonstrated using a bounded one-dimensional electrostatic PIC code (J. P. Verboncoeur et al., J. Comput. Phys. 104, 321-328 (1993).), as well as a bounded two-dimensional electromagnetic PIC code (J. P. Verboncoeur et al., Comp. Phys. Comm. 87, 199-211 (1995).).
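
    The particle-grid interpolation whose errors the paper analyzes is, at first order, the familiar cloud-in-cell (linear) weighting. A minimal 1D sketch (the grid size, charge, and field values are made-up examples):

```python
import numpy as np

def deposit(xp, qp, ngrid, dx):
    """Cloud-in-cell (first-order) charge deposition: each particle shares
    its charge between the two nearest grid nodes, linearly in distance."""
    rho = np.zeros(ngrid)
    for x, q in zip(xp, qp):
        j = int(x // dx)              # left node of the particle's cell
        w = x / dx - j                # fractional position within the cell
        rho[j]     += q * (1 - w) / dx
        rho[j + 1] += q * w / dx
    return rho

def gather(xp, E, dx):
    """Field gathered back at particle positions with the same weights;
    using identical deposit/gather weights avoids a self-force error."""
    out = []
    for x in xp:
        j = int(x // dx)
        w = x / dx - j
        out.append((1 - w) * E[j] + w * E[j + 1])
    return np.array(out)

rho = deposit([0.25], [1.0], ngrid=3, dx=1.0)       # weights 0.75 / 0.25
Ep  = gather([0.25], np.array([0.0, 1.0, 2.0]), dx=1.0)
```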

  14. [Methods for mortality analysis in SENTIERI Project].

    PubMed

    De Santis, M; Pasetto, R; Minelli, G; Conti, S

    2011-01-01

    The methods of mortality analysis in Italian polluted sites (IPS) are described. The study concerned 44 IPSs, each comprising one or more municipalities. Mortality at the municipality level was studied for the period 1995-2002, using the following indicators: crude rate, standardized rate, standardized mortality ratio (SMR), and SMR adjusted for an ad hoc deprivation index. Regional populations were used as the reference for indirect standardization. The deprivation index was constructed using the 2001 national census variables representing the following socioeconomic domains: education, unemployment, dwelling ownership, and overcrowding. Mortality indicators were computed for 63 single or grouped causes. The results for all 63 analysed causes of death are available for each IPS; in this Chapter, results are presented for each IPS for causes selected on the basis of a priori evidence of risk from local sources of environmental pollution. The procedures and results of the evidence evaluation have been published in the 2010 Supplement of Epidemiology & Prevention devoted to SENTIERI.
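
    The indirect standardization behind the SMR indicator is compact enough to show. A sketch (the age bands, person-years, and reference rates are invented numbers, not SENTIERI data):

```python
def smr(observed_deaths, person_years, reference_rates):
    """Standardized mortality ratio by indirect standardization: expected
    deaths apply age-specific reference (regional) rates to the local
    population's age structure; SMR = observed / expected."""
    expected = sum(n * r for n, r in zip(person_years, reference_rates))
    return observed_deaths / expected

# Hypothetical municipality with two age bands.
py  = [1000, 2000]          # person-years at risk per band
ref = [0.01, 0.02]          # regional death rates per person-year
ratio = smr(60, py, ref)    # expected = 10 + 40 = 50, so SMR = 1.2
```

    An SMR above 1 indicates more deaths than the reference rates predict; SENTIERI further adjusts this ratio for its deprivation index.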

  15. Introduction of the carbon dioxide absorption method with closed circle breathing into anesthesia practice.

    PubMed

    Foregger, R

    2000-07-01

    The circle breathing CO2 absorption system for use during acetylene anesthesia was described by Carl Gauss in 1924/1925. The apparatus was manufactured by Drägerwerk of Lübeck. A considerable number of publications on the apparatus employing the closed circle method of CO2 absorption appeared in the medical press soon thereafter. Later apparatus models, also built by Drägerwerk, were adapted for nitrous oxide-oxygen-ether anesthesia and introduced into practice by Paul Sudeck and Helmut Schmidt. Information about all this was transmitted to America through the German medical press, including the Draeger-Hefte. American anesthesia machine manufacturers began to develop closed circle CO2 absorbers several years later. Claims that the circle breathing CO2 absorption method was introduced into anesthesia practice by Brian Sword are not valid. PMID:10969391

  16. Integrated force method versus displacement method for finite element analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Berke, Laszlo; Gallagher, Richard H.

    1990-01-01

    A novel formulation termed the integrated force method (IFM) has been developed in recent years for analyzing structures. In this method all the internal forces are taken as independent variables, and the system equilibrium equations (EE's) are integrated with the global compatibility conditions (CC's) to form the governing set of equations. In IFM the CC's are obtained from the strain formulation of St. Venant, and no choices of redundant load systems have to be made, in contrast to the standard force method (SFM). This property of IFM allows the generation of the governing equations to be automated straightforwardly, as it is in the popular stiffness method (SM). In this report IFM and SM are compared relative to the structure of their respective equations, their conditioning, required solution methods, overall computational requirements, and convergence properties as these factors influence the accuracy of the results. Overall, this new version of the force method produces more accurate results than the stiffness method for comparable computational cost.

  17. A comparative analysis of ethnomedicinal practices for treating gastrointestinal disorders used by communities living in three national parks (Korea).

    PubMed

    Kim, Hyun; Song, Mi-Jang; Brian, Heldenbrand; Choi, Kyoungho

    2014-01-01

    The purpose of this study is to comparatively analyze the ethnomedicinal practices on gastrointestinal disorders within communities in Jirisan National Park, Gayasan National Park, and Hallasan National Park of Korea. Data were collected through participant observations and in-depth interviews with semi-structured questionnaires. Comparative analysis was accomplished using the informant consensus factor, fidelity level, and internetwork analysis. A total of 490 ethnomedicinal practices recorded from the communities were classified into 110 families, 176 genera, and 220 species that included plants, animals, fungi, and algae. The informant consensus factor values were highest for the categories of enteritis and gastralgia (1.0), followed by indigestion (0.94), constipation (0.93), and abdominal pain and gastroenteric trouble (0.92). In terms of fidelity levels, 71 plant species showed fidelity levels of 100%. The internetwork analysis between disorders and all medicinal species is grouped in the center by the four categories of indigestion, diarrhea, abdominal pain, and gastroenteric trouble, respectively. Regarding the research method of this study, the comparative analysis methods will contribute to the availability of orally transmitted ethnomedicinal knowledge. Among the methods of analysis, the use of internetwork analysis as a tool in this study provides informative internetwork maps between gastrointestinal disorders and medicinal species.
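
    The two consensus measures used here have standard closed forms: ICF = (Nur - Nt)/(Nur - 1), where Nur is the number of use reports in a disorder category and Nt the number of taxa used for it, and FL(%) = Ip/Iu x 100, the share of a species' reports that cite one particular ailment. A sketch with made-up counts:

```python
def informant_consensus_factor(n_use_reports, n_taxa):
    """ICF = (Nur - Nt) / (Nur - 1); 1.0 means every informant relies on
    the same few species for the category, 0 means no agreement."""
    return (n_use_reports - n_taxa) / (n_use_reports - 1)

def fidelity_level(reports_for_ailment, reports_total):
    """FL(%) = Ip / Iu * 100 for one species and one ailment."""
    return 100.0 * reports_for_ailment / reports_total

icf = informant_consensus_factor(50, 1)   # all 50 reports name one taxon
fl  = fidelity_level(30, 30)              # species cited only for this ailment
```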

  18. A Comparative Analysis of Ethnomedicinal Practices for Treating Gastrointestinal Disorders Used by Communities Living in Three National Parks (Korea)

    PubMed Central

    Kim, Hyun; Song, Mi-Jang; Brian, Heldenbrand; Choi, Kyoungho

    2014-01-01

    The purpose of this study is to comparatively analyze the ethnomedicinal practices on gastrointestinal disorders within communities in Jirisan National Park, Gayasan National Park, and Hallasan National Park of Korea. Data were collected through participant observations and in-depth interviews with semi-structured questionnaires. Comparative analysis was accomplished using the informant consensus factor, fidelity level, and internetwork analysis. A total of 490 ethnomedicinal practices recorded from the communities were classified into 110 families, 176 genera, and 220 species that included plants, animals, fungi, and algae. The informant consensus factor values were highest for the categories of enteritis and gastralgia (1.0), followed by indigestion (0.94), constipation (0.93), and abdominal pain and gastroenteric trouble (0.92). In terms of fidelity levels, 71 plant species showed fidelity levels of 100%. The internetwork analysis between disorders and all medicinal species is grouped in the center by the four categories of indigestion, diarrhea, abdominal pain, and gastroenteric trouble, respectively. Regarding the research method of this study, the comparative analysis methods will contribute to the availability of orally transmitted ethnomedicinal knowledge. Among the methods of analysis, the use of internetwork analysis as a tool in this study provides informative internetwork maps between gastrointestinal disorders and medicinal species. PMID:25202330

  19. Gap analysis: Concepts, methods, and recent results

    USGS Publications Warehouse

    Jennings, M.D.

    2000-01-01

    Rapid progress is being made in the conceptual, technical, and organizational requirements for generating synoptic multi-scale views of the earth's surface and its biological content. Using the spatially comprehensive data that are now available, researchers, land managers, and land-use planners can, for the first time, quantitatively place landscape units - from general categories such as 'Forests' or 'Cold-Deciduous Shrubland Formation' to more specific categories such as 'Picea glauca-Abies balsamea-Populus spp. Forest Alliance' - in their large-area contexts. The National Gap Analysis Program (GAP) has developed the technical and organizational capabilities necessary for the regular production and analysis of such information. This paper provides a brief overview of concepts and methods as well as some recent results from the GAP projects. Clearly, new frameworks for biogeographic information and organizational cooperation are needed if we are to have any hope of documenting the full range of species occurrences and ecological processes in ways meaningful to their management. The GAP experience provides one model for achieving these new frameworks.

  20. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    NASA Technical Reports Server (NTRS)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.

  1. Reform-based science teaching: A mixed-methods approach to explaining variation in secondary science teacher practice

    NASA Astrophysics Data System (ADS)

    Jetty, Lauren E.

    The purpose of this two-phase, sequential explanatory mixed-methods study was to understand and explain the variation seen in secondary science teachers' enactment of reform-based instructional practices. Utilizing teacher socialization theory, this mixed-methods analysis was conducted to determine the relative influence of secondary science teachers' characteristics, backgrounds and experiences across their teacher development to explain the range of teaching practices exhibited by graduates of three reform-oriented teacher preparation programs. Data for this study were obtained from the Investigating the Meaningfulness of Preservice Programs Across the Continuum of Teaching (IMPPACT) Project, a multi-university, longitudinal study funded by NSF. In the first, quantitative phase of the study, data for the sample (N=120) were collected from three surveys in the IMPPACT Project database. Hierarchical multiple regression analysis was used to examine the separate as well as the combined influence of factors such as teachers' personal and professional background characteristics, beliefs about reform-based science teaching, feelings of preparedness to teach science, school context, school culture and climate of professional learning, and influences of the policy environment on the teachers' use of reform-based instructional practices. Findings indicate three blocks of variables, professional background, beliefs/efficacy, and local school context, added significant contribution to explaining nearly 38% of the variation in secondary science teachers' use of reform-based instructional practices. The five variables that significantly contributed to explaining variation in teachers' use of reform-based instructional practices in the full model were university of teacher preparation, sense of preparation for teaching science, the quality of professional development, science-content-focused professional development, and the perceived level of professional autonomy. Using the results
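
    Hierarchical multiple regression as described here, entering predictor blocks in order and attributing the R-squared increment to each block, can be sketched with ordinary least squares. The blocks and data below are synthetic stand-ins, not IMPPACT variables:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit with intercept (numpy least squares)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    ss_tot = (y - y.mean()) @ (y - y.mean())
    return 1.0 - (resid @ resid) / ss_tot

def hierarchical_r2(blocks, y):
    """Enter each named block of predictors in turn; report the cumulative
    R^2 and the increment (Delta R^2) the block contributes."""
    results, X, prev = [], np.empty((len(y), 0)), 0.0
    for name, Xb in blocks:
        X = np.column_stack([X, Xb])
        cur = r_squared(X, y)
        results.append((name, cur, cur - prev))
        prev = cur
    return results

# Synthetic example: the outcome depends only on the first block, so the
# second block should add essentially nothing.
x1 = np.arange(10.0)
x2 = x1 ** 2
y = 2.0 * x1
steps = hierarchical_r2([('background', x1[:, None]),
                         ('context',    x2[:, None])], y)
```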

  2. Method and apparatus for frequency spectrum analysis

    NASA Technical Reports Server (NTRS)

    Cole, Steven W. (Inventor)

    1992-01-01

    A method for frequency spectrum analysis of an unknown signal in real-time is discussed. The method is based upon integration of 1-bit samples of signal voltage amplitude corresponding to sine or cosine phases of a controlled center frequency clock which is changed after each integration interval to sweep the frequency range of interest in steps. Integration of samples during each interval is carried out over a number of cycles of the center frequency clock spanning a number of cycles of an input signal to be analyzed. The invention may be used to detect the frequency of at least two signals simultaneously. By using a reference signal of known frequency and voltage amplitude (added to the two signals for parallel processing in the same way, but in a different channel with a sampling at the known frequency and phases of the reference signal), the absolute voltage amplitude of the other two signals may be determined by squaring the sine and cosine integrals of each channel and summing the squares to obtain relative power measurements in all three channels and, from the known voltage amplitude of the reference signal, obtaining an absolute voltage measurement for the other two signals by multiplying the known voltage of the reference signal with the ratio of the relative power of each of the other two signals to the relative power of the reference signal.
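
    The core of the method, integrating 1-bit (sign) samples against the sine and cosine phases of a stepped center-frequency clock, can be sketched as a quadrature correlator. The choice of four phases per clock cycle and the half-sample offset are illustrative assumptions, not details from the patent:

```python
import numpy as np

def one_bit_power(signal_fn, f_clock, n_cycles, phases_per_cycle=4):
    """Integrate the sign of the input at sine/cosine phases of the clock;
    sin_sum^2 + cos_sum^2 estimates relative power near f_clock."""
    n = n_cycles * phases_per_cycle
    # Half-sample offset keeps samples away from reference zero crossings.
    t = (np.arange(n) + 0.5) / (phases_per_cycle * f_clock)
    bits = np.sign(signal_fn(t))                          # 1-bit samples
    i_sum = np.sum(bits * np.sign(np.sin(2 * np.pi * f_clock * t)))
    q_sum = np.sum(bits * np.sign(np.cos(2 * np.pi * f_clock * t)))
    return i_sum**2 + q_sum**2

# Sweeping the clock frequency in steps, a 10 Hz input stands out when
# f_clock matches it and correlates weakly otherwise.
tone = lambda t: np.sin(2 * np.pi * 10.0 * t)
p_match    = one_bit_power(tone, 10.0, n_cycles=64)
p_mismatch = one_bit_power(tone, 17.0, n_cycles=64)
```

    The abstract's absolute-voltage step would then ratio these relative powers against a reference channel of known amplitude.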

  3. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Service, Dairy Programs, or Official Methods of Analysis of the Association of Analytical Chemists or... 7 Agriculture 3 2011-01-01 2011-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods...

  4. A practical field extraction method for non-invasive monitoring of hormone activity in the black rhinoceros.

    PubMed

    Edwards, Katie L; McArthur, Hannah M; Liddicoat, Tim; Walker, Susan L

    2014-01-01

    Non-invasive hormone analysis is a vital tool in assessing an animal's adrenal and reproductive status, which can be beneficial to in situ and ex situ conservation. However, it can be difficult to employ these techniques when monitoring in situ populations away from controlled laboratory conditions, when electricity is not readily available. A practical method for processing faecal samples in the field, which enables samples to be extracted soon after defaecation and stored in field conditions for prolonged periods prior to hormone analysis, is therefore warranted. This study describes the development of an optimal field extraction method, which includes hand-shaking faecal material in 90% methanol, before loading this extract in a 40% solvent onto HyperSep™ C8 solid-phase extraction cartridges, stored at ambient temperatures. This method was successfully validated for measurement of adrenal and reproductive hormone metabolites in faeces of male and female black rhinoceros (Diceros bicornis) and was rigorously tested in controlled laboratory and simulated field conditions. All the hormones tested demonstrated between 83 and 94% and between 42 and 89% recovery of synthetic and endogenous hormone metabolites, respectively, with high precision of replication. Furthermore, results obtained following the developed optimal field extraction method were highly correlated with the control laboratory method. Cartridges can be stored at ambient (cool, dry or warm, humid) conditions for periods of up to 6 months without degradation, before re-extraction of hormone metabolites for analysis by enzyme immunoassay. The described method has great potential to be applied to monitor faecal reproductive and adrenal hormone metabolites in a wide variety of species and allows samples to be stored in the field for up to 6 months prior to analysis. This provides the opportunity to investigate hormone relationships within in situ populations, where equipment and facilities may

  5. A practical field extraction method for non-invasive monitoring of hormone activity in the black rhinoceros

    PubMed Central

    Edwards, Katie L.; McArthur, Hannah M.; Liddicoat, Tim; Walker, Susan L.

    2014-01-01

    Non-invasive hormone analysis is a vital tool in assessing an animal's adrenal and reproductive status, which can be beneficial to in situ and ex situ conservation. However, it can be difficult to employ these techniques when monitoring in situ populations away from controlled laboratory conditions, when electricity is not readily available. A practical method for processing faecal samples in the field, which enables samples to be extracted soon after defaecation and stored in field conditions for prolonged periods prior to hormone analysis, is therefore warranted. This study describes the development of an optimal field extraction method, which includes hand-shaking faecal material in 90% methanol, before loading this extract in a 40% solvent onto HyperSep™ C8 solid-phase extraction cartridges, stored at ambient temperatures. This method was successfully validated for measurement of adrenal and reproductive hormone metabolites in faeces of male and female black rhinoceros (Diceros bicornis) and was rigorously tested in controlled laboratory and simulated field conditions. All the hormones tested demonstrated between 83 and 94% and between 42 and 89% recovery of synthetic and endogenous hormone metabolites, respectively, with high precision of replication. Furthermore, results obtained following the developed optimal field extraction method were highly correlated with the control laboratory method. Cartridges can be stored at ambient (cool, dry or warm, humid) conditions for periods of up to 6 months without degradation, before re-extraction of hormone metabolites for analysis by enzyme immunoassay. The described method has great potential to be applied to monitor faecal reproductive and adrenal hormone metabolites in a wide variety of species and allows samples to be stored in the field for up to 6 months prior to analysis. This provides the opportunity to investigate hormone relationships within in situ populations, where equipment and facilities may

  6. A practical material decomposition method for x-ray dual spectral computed tomography.

    PubMed

    Hu, Jingjing; Zhao, Xing

    2016-03-17

    X-ray dual spectral CT (DSCT) scans the measured object with two different x-ray spectra, and the acquired rawdata can be used to perform the material decomposition of the object. Direct calibration methods allow a faster material decomposition for DSCT and can be separated in two groups: image-based and rawdata-based. The image-based method is an approximative method, and beam hardening artifacts remain in the resulting material-selective images. The rawdata-based method generally obtains better image quality than the image-based method, but this method requires geometrically consistent rawdata. However, today's clinical dual energy CT scanners usually measure different rays for different energy spectra and acquire geometrically inconsistent rawdata sets, and thus cannot meet the requirement. This paper proposes a practical material decomposition method to perform rawdata-based material decomposition in the case of inconsistent measurement. This method first yields the desired consistent rawdata sets from the measured inconsistent rawdata sets, and then employs rawdata-based technique to perform material decomposition and reconstruct material-selective images. The proposed method was evaluated by use of simulated FORBILD thorax phantom rawdata and dental CT rawdata, and simulation results indicate that this method can produce highly quantitative DSCT images in the case of inconsistent DSCT measurements. PMID:27257878
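
    While the paper's contribution is a rawdata-based decomposition for inconsistent measurements, the underlying two-material algebra is easiest to see in the simpler image-based setting the abstract contrasts it with: per pixel, invert a 2x2 basis matrix mapping material coefficients to low/high-energy attenuation. A sketch with invented attenuation values:

```python
import numpy as np

def decompose_two_materials(mu_low, mu_high, basis):
    """Per-pixel solve of
        [mu_low ]   [mu1_low  mu2_low ] [a1]
        [mu_high] = [mu1_high mu2_high] [a2]
    for material coefficients a1, a2 (image-based approximation)."""
    A_inv = np.linalg.inv(np.asarray(basis, dtype=float))
    stacked = np.stack([mu_low, mu_high])      # shape (2, ny, nx)
    return np.einsum('ij,j...->i...', A_inv, stacked)

# Invented basis: rows are (low, high) spectra, columns are materials.
basis = [[0.20, 0.30],
         [0.15, 0.10]]
# One-pixel "images" built from known coefficients a1=1, a2=2.
mu_low  = np.array([[0.20 * 1 + 0.30 * 2]])    # 0.80
mu_high = np.array([[0.15 * 1 + 0.10 * 2]])    # 0.35
a = decompose_two_materials(mu_low, mu_high, basis)
```

    The rawdata-based variant applies the same inversion in the projection domain, which is why it needs geometrically consistent low/high ray pairs in the first place.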

  7. A practical material decomposition method for x-ray dual spectral computed tomography.

    PubMed

    Hu, Jingjing; Zhao, Xing

    2016-03-17

    X-ray dual spectral CT (DSCT) scans the measured object with two different x-ray spectra, and the acquired rawdata can be used to perform the material decomposition of the object. Direct calibration methods allow a faster material decomposition for DSCT and can be separated in two groups: image-based and rawdata-based. The image-based method is an approximative method, and beam hardening artifacts remain in the resulting material-selective images. The rawdata-based method generally obtains better image quality than the image-based method, but this method requires geometrically consistent rawdata. However, today's clinical dual energy CT scanners usually measure different rays for different energy spectra and acquire geometrically inconsistent rawdata sets, and thus cannot meet the requirement. This paper proposes a practical material decomposition method to perform rawdata-based material decomposition in the case of inconsistent measurement. This method first yields the desired consistent rawdata sets from the measured inconsistent rawdata sets, and then employs rawdata-based technique to perform material decomposition and reconstruct material-selective images. The proposed method was evaluated by use of simulated FORBILD thorax phantom rawdata and dental CT rawdata, and simulation results indicate that this method can produce highly quantitative DSCT images in the case of inconsistent DSCT measurements.

  8. Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis

    PubMed Central

    Critchfield, Thomas S

    2011-01-01

    Neither practitioners nor scientists appear to be fully satisfied with the world's largest behavior-analytic membership organization. Each community appears to believe that initiatives that serve the other will undermine the association's capacity to serve their own needs. Historical examples suggest that such discord is predicted when practitioners and scientists cohabit the same association. This is true because all professional associations exist to address guild interests, and practice and science are different professions with different guild interests. No association, therefore, can succeed in being all things to all people. The solution is to assure that practice and science communities are well served by separate professional associations. I comment briefly on how this outcome might be promoted. PMID:22532750

  9. Instructional methods used by health sciences librarians to teach evidence-based practice (EBP): a systematic review*†‡

    PubMed Central

    Swanberg, Stephanie M.; Dennison, Carolyn Ching; Farrell, Alison; Machel, Viola; Marton, Christine; O'Brien, Kelly K.; Pannabecker, Virginia; Thuna, Mindy; Holyoke, Assako Nitta

    2016-01-01

    Background Librarians often teach evidence-based practice (EBP) within health sciences curricula. It is not known what teaching methods are most effective. Methods A systematic review of the literature was conducted searching CINAHL, EMBASE, ERIC, LISTA, PubMed, Scopus, and others. Searches were completed through December 2014. No limits were applied. Hand searching of Medical Library Association annual meeting abstracts from 2009–2014 was also completed. Studies must be about EBP instruction by a librarian within undergraduate or graduate health sciences curricula and include skills assessment. Studies with no assessment, letters and comments, and veterinary education studies were excluded. Data extraction and critical appraisal were performed to determine the risk of bias of each study. Results Twenty-seven studies were included for analysis. Studies occurred in the United States (20), Canada (3), the United Kingdom (1), and Italy (1), with 22 in medicine and 5 in allied health. Teaching methods included lecture (20), small group or one-on-one instruction (16), computer lab practice (15), and online learning (6). Assessments were quizzes or tests, pretests and posttests, peer-review, search strategy evaluations, clinical scenario assignments, or a hybrid. Due to large variability across studies, meta-analysis was not conducted. Discussion Findings were weakly significant for positive change in search performance for most studies. Only one study compared teaching methods, and no one teaching method proved more effective. Future studies could conduct multisite interventions using randomized or quasi-randomized controlled trial study design and standardized assessment tools to measure outcomes. PMID:27366120

  10. A shift in HIV reporting practices: a biopolitical analysis.

    PubMed

    Beckerman, N L; Gelman, S R

    2000-01-01

    The Centers for Disease Control and Prevention (CDC), mainstream medical journals and many state legislatures are calling for a drastic shift in what has become standard HIV reporting practice throughout the United States (Burke, 1997; Richardson, 1997a; Rotello, 1997). We are now experiencing a challenge to the long-held practice of confidential and anonymous HIV reporting (Grumman, 1997; Richardson, 1998). The federal government, the American Medical Association and several major AIDS organizations have supported state-level proposals that require public health officials to adopt name reporting, i.e., monitoring HIV by name rather than by number (Richardson, 1997b; Richardson, 1997c; Kong, 1997). With number identification, the practice commonly used, no link is made between the number and the patient's name, thus assuring anonymity. This article details this shift in public health policy and the biopolitical factors surrounding HIV testing and reporting. It is a shift in which the "duty to warn" takes precedence over the long-held professional value of confidentiality.

  11. Practice size and quality attainment under the new GMS contract: a cross-sectional analysis

    PubMed Central

    Wang, Yingying; O'Donnell, Catherine A; Mackay, Daniel F; Watt, Graham CM

    2006-01-01

    Background The Quality and Outcomes Framework (QOF) of the new General Medical Services contract, for the first time, incentivises certain areas of general practice workload over others. The ability of practices to deliver high quality care may be related to the size of the practice itself. Aim To explore the relationship between practice size and points attained in the QOF. Design of study Cross-sectional analyses of routinely available data. Setting Urban general practice in mainland Scotland. Method QOF points and disease prevalence were obtained for all urban general practices in Scotland (n = 638) and linked to data on the practice, GP and patient population. The relationship between QOF point attainment, disease prevalence and practice size was examined using univariate statistical analyses. Results Smaller practices were more likely to be located in areas of socioeconomic deprivation; had patients with poorer health; and were less likely to participate in voluntary practice-based quality schemes. Overall, smaller practices received fewer QOF points compared to larger practices (P = 0.003), due to lower point attainment in the organisational domain (P = 0.002). There were no differences across practice size in the other domains of the QOF, including clinical care. Smaller practices reported higher levels of chronic obstructive pulmonary disease (COPD) and mental health conditions and lower levels of asthma, epilepsy and hypothyroidism. There was no difference in the reported prevalence of hypertension or coronary heart disease (CHD) across practices, in contrast to CHD mortality for patients aged under 70 years, where the mortality rate was 40% greater for single-handed practices compared with large practices. Conclusions Although smaller practices obtained fewer points than larger practices under the QOF, this was due to lower scores in the organisational domain of the contract rather than to lower scores for clinical care. 
Single-handed practices, in common

  12. Health Education Specialist Practice Analysis 2015 (HESPA 2015): Process and Outcomes.

    PubMed

    McKenzie, James F; Dennis, Dixie; Auld, M Elaine; Lysoby, Linda; Doyle, Eva; Muenzen, Patricia M; Caro, Carla M; Kusorgbor-Narh, Cynthia S

    2016-06-01

    The Health Education Specialist Practice Analysis 2015 (HESPA 2015) was conducted to update and validate the Areas of Responsibilities, Competencies, and Sub-competencies for Entry- and Advanced-Level Health Education Specialists. Two data collection instruments were developed-one was focused on Sub-competencies and the other on knowledge items related to the practice of health education. Instruments were administered to health education specialists (N = 3,152) using online survey methods. A total of 2,508 survey participants used 4-point ordinal scales to rank Sub-competencies by frequency of use and importance. The other 644 participants used the same 4-point frequency scale to rank related knowledge items. Composite scores for Sub-competencies were calculated and subgroup comparisons were conducted that resulted in the validation of 7 Areas of Responsibilities, 36 Competencies, and 258 Sub-competencies. Of the Sub-competencies, 141 were identified as Entry-level, 76 Advanced 1-level, and 41 Advanced 2-level. In addition, 131 knowledge items were verified. The HESPA 2015 findings are compared with the results of the Health Education Job Analysis 2010 and will be useful to those involved in professional preparation, continuing education, and employment of health education specialists.

  13. Comparison and cost analysis of drinking water quality monitoring requirements versus practice in seven developing countries.

    PubMed

    Crocker, Jonny; Bartram, Jamie

    2014-07-18

    Drinking water quality monitoring programs aim to support provision of safe drinking water by informing water quality management. Little evidence or guidance exists on best monitoring practices for low resource settings. Lack of financial, human, and technological resources reduces a country's ability to monitor water supply. Monitoring activities were characterized in Cambodia, Colombia, India (three states), Jordan, Peru, South Africa, and Uganda according to water sector responsibilities, monitoring approaches, and marginal cost. The seven study countries were selected to represent a range of low resource settings. The focus was on monitoring of microbiological parameters, such as E. coli, coliforms, and H2S-producing microorganisms. Data collection involved qualitative and quantitative methods. Across the seven study countries, few distinct approaches to monitoring were observed, and in all but one country all monitoring relied on fixed laboratories for sample analysis. Compliance with monitoring requirements was highest for operational monitoring of large water supplies in urban areas. Sample transport and labor for sample collection and analysis together constitute approximately 75% of marginal costs, which exclude capital costs. There is potential for substantive optimization of monitoring programs by considering field-based testing and by fundamentally reconsidering monitoring approaches for non-piped supplies. This is the first study to look quantitatively at water quality monitoring practices in multiple developing countries.

  14. Comparison and Cost Analysis of Drinking Water Quality Monitoring Requirements versus Practice in Seven Developing Countries

    PubMed Central

    Crocker, Jonny; Bartram, Jamie

    2014-01-01

    Drinking water quality monitoring programs aim to support provision of safe drinking water by informing water quality management. Little evidence or guidance exists on best monitoring practices for low resource settings. Lack of financial, human, and technological resources reduces a country's ability to monitor water supply. Monitoring activities were characterized in Cambodia, Colombia, India (three states), Jordan, Peru, South Africa, and Uganda according to water sector responsibilities, monitoring approaches, and marginal cost. The seven study countries were selected to represent a range of low resource settings. The focus was on monitoring of microbiological parameters, such as E. coli, coliforms, and H2S-producing microorganisms. Data collection involved qualitative and quantitative methods. Across the seven study countries, few distinct approaches to monitoring were observed, and in all but one country all monitoring relied on fixed laboratories for sample analysis. Compliance with monitoring requirements was highest for operational monitoring of large water supplies in urban areas. Sample transport and labor for sample collection and analysis together constitute approximately 75% of marginal costs, which exclude capital costs. There is potential for substantive optimization of monitoring programs by considering field-based testing and by fundamentally reconsidering monitoring approaches for non-piped supplies. This is the first study to look quantitatively at water quality monitoring practices in multiple developing countries. PMID:25046632

  15. EMQN Best Practice Guidelines for molecular and haematology methods for carrier identification and prenatal diagnosis of the haemoglobinopathies.

    PubMed

    Traeger-Synodinos, Joanne; Harteveld, Cornelis L; Old, John M; Petrou, Mary; Galanello, Renzo; Giordano, Piero; Angastioniotis, Michael; De la Salle, Barbara; Henderson, Shirley; May, Alison

    2015-04-01

    Haemoglobinopathies constitute the commonest recessive monogenic disorders worldwide, and the treatment of affected individuals presents a substantial global disease burden. Carrier identification and prenatal diagnosis represent valuable procedures that identify couples at risk for having affected children, so that they can be offered options to have healthy offspring. Molecular diagnosis facilitates prenatal diagnosis and definitive diagnosis of carriers and patients (especially 'atypical' cases who often have complex genotype interactions). However, the haemoglobin disorders are unique among all genetic diseases in that identification of carriers is preferable by haematological (biochemical) tests rather than DNA analysis. These Best Practice guidelines offer an overview of recommended strategies and methods for carrier identification and prenatal diagnosis of haemoglobinopathies, and emphasize the importance of appropriately applying and interpreting haematological tests in supporting the optimum application and evaluation of globin gene DNA analysis.

  16. EMQN Best Practice Guidelines for molecular and haematology methods for carrier identification and prenatal diagnosis of the haemoglobinopathies

    PubMed Central

    Traeger-Synodinos, Joanne; Harteveld, Cornelis L; Old, John M; Petrou, Mary; Galanello, Renzo; Giordano, Piero; Angastioniotis, Michael; De la Salle, Barbara; Henderson, Shirley; May, Alison

    2015-01-01

    Haemoglobinopathies constitute the commonest recessive monogenic disorders worldwide, and the treatment of affected individuals presents a substantial global disease burden. Carrier identification and prenatal diagnosis represent valuable procedures that identify couples at risk for having affected children, so that they can be offered options to have healthy offspring. Molecular diagnosis facilitates prenatal diagnosis and definitive diagnosis of carriers and patients (especially ‘atypical' cases who often have complex genotype interactions). However, the haemoglobin disorders are unique among all genetic diseases in that identification of carriers is preferable by haematological (biochemical) tests rather than DNA analysis. These Best Practice guidelines offer an overview of recommended strategies and methods for carrier identification and prenatal diagnosis of haemoglobinopathies, and emphasize the importance of appropriately applying and interpreting haematological tests in supporting the optimum application and evaluation of globin gene DNA analysis. PMID:25052315

  17. Practical method using superposition of individual magnetic fields for initial arrangement of undulator magnets

    SciTech Connect

    Tsuchiya, K.; Shioya, T.

    2015-04-15

    We have developed a practical method for determining an excellent initial arrangement of magnetic arrays for a pure-magnet Halbach-type undulator. In this method, the longitudinal magnetic field distribution of each magnet is measured using a moving Hall probe system along the beam axis with a high positional resolution. The initial arrangement of magnetic arrays is optimized and selected by analyzing the superposition of all distribution data in order to achieve adequate spectral quality for the undulator. We applied this method to two elliptically polarizing undulators (EPUs), called U#16-2 and U#02-2, at the Photon Factory storage ring (PF ring) in the High Energy Accelerator Research Organization (KEK). The measured field distribution of the undulator was demonstrated to be excellent for the initial arrangement of the magnet array, and this method saved a great deal of effort in adjusting the magnetic fields of EPUs.
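The selection step described above can be sketched in code. This is an illustrative toy, not the authors' method as published: the Gaussian lobe shape, the magnet strengths, and the four-slot array are invented assumptions standing in for real Hall-probe measurements.

```python
# Illustrative sketch: pick a magnet ordering by superposing individually
# measured field profiles and scoring the result against an ideal field.
import itertools
import math

N_POINTS = 200                       # samples along the beam axis z
N_SLOTS = 4                          # magnet slots in the array
z = [i / N_POINTS * N_SLOTS * 2 * math.pi for i in range(N_POINTS)]

def magnet_profile(slot, strength):
    """Field contribution of one magnet placed at a given slot: a
    localized lobe whose amplitude is that magnet's measured strength."""
    center = (slot + 0.5) * 2 * math.pi
    return [strength * math.exp(-((zi - center) ** 2) / 2.0) * math.cos(zi)
            for zi in z]

# "Measured" strengths of four physical magnets (slightly non-uniform).
strengths = [1.00, 0.97, 1.05, 0.99]

# Ideal field: every slot filled by a perfectly uniform magnet.
ideal = [sum(col) for col in
         zip(*(magnet_profile(s, 1.0) for s in range(N_SLOTS)))]

def rms_error(order):
    """Superpose the measured profiles for one candidate ordering and
    score the RMS deviation from the ideal field."""
    total = [sum(col) for col in
             zip(*(magnet_profile(slot, strengths[m])
                   for slot, m in enumerate(order)))]
    return math.sqrt(sum((t - u) ** 2 for t, u in zip(total, ideal))
                     / N_POINTS)

# Exhaustive search is fine for 4 magnets; real arrays with hundreds of
# magnets would need a heuristic search over the same superposition score.
best_order = min(itertools.permutations(range(N_SLOTS)), key=rms_error)
```

For a production undulator the exhaustive permutation search above is infeasible; simulated annealing or pairwise-swap optimization over the same superposed-field score would be the natural substitute.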

  18. Flutter and Divergence Analysis using the Generalized Aeroelastic Analysis Method

    NASA Technical Reports Server (NTRS)

    Edwards, John W.; Wieseman, Carol D.

    2003-01-01

    The Generalized Aeroelastic Analysis Method (GAAM) is applied to the analysis of three well-studied checkcases: restrained and unrestrained airfoil models, and a wing model. An eigenvalue iteration procedure is used for converging upon roots of the complex stability matrix. For the airfoil models, exact root loci are given which clearly illustrate the nature of the flutter and divergence instabilities. The singularities involved are enumerated, including an additional pole at the origin for the unrestrained airfoil case and the emergence of an additional pole on the positive real axis at the divergence speed for the restrained airfoil case. Inconsistencies and differences among published aeroelastic root loci and the new, exact results are discussed and resolved. The generalization of a Doublet Lattice Method computer code is described and the code is applied to the calculation of root loci for the wing model for incompressible and for subsonic flow conditions. The error introduced in the reduction of the singular integral equation underlying the unsteady lifting surface theory to a linear algebraic equation is discussed. Acknowledging this inherent error, the solutions of the algebraic equation by GAAM are termed 'exact.' The singularities of the problem are discussed, and the exponential series approximations used in the evaluation of the kernel function are shown to introduce a dense collection of poles and zeroes on the negative real axis. Again, inconsistencies and differences among published aeroelastic root loci and the new 'exact' results are discussed and resolved. In all cases, aeroelastic flutter and divergence speeds and frequencies are in good agreement with published results. The GAAM solution procedure allows complete control over Mach number, velocity, density, and complex frequency. 
Thus all points on the computed root loci can be matched-point, consistent solutions without recourse to complex mode tracking logic or dataset interpolation, as in the k and p
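The idea of iterating on roots of a complex stability determinant can be illustrated with a minimal stand-in (this is not GAAM itself): Newton's method with a numeric derivative applied to det(p²M + pC + K) for a made-up two-degree-of-freedom system with light damping and weak stiffness coupling.

```python
# Minimal stand-in for an eigenvalue iteration on a complex stability
# matrix: Newton's method on the determinant of p^2*M + p*C + K.
# The matrices below are invented for illustration.
M = [[1.0, 0.0], [0.0, 1.0]]   # mass
C = [[0.1, 0.0], [0.0, 0.2]]   # damping
K = [[4.0, 0.5], [0.5, 9.0]]   # stiffness (weakly coupled)

def det_stab(p):
    """Determinant of the 2x2 stability matrix at complex frequency p."""
    a = [[p * p * M[i][j] + p * C[i][j] + K[i][j] for j in range(2)]
         for i in range(2)]
    return a[0][0] * a[1][1] - a[0][1] * a[1][0]

def newton_root(p, steps=50, h=1e-6):
    """Converge on a root of det_stab from a starting guess, using a
    central-difference derivative (det_stab is holomorphic in p)."""
    for _ in range(steps):
        dp = (det_stab(p + h) - det_stab(p - h)) / (2 * h)
        p = p - det_stab(p) / dp
    return p

# Start near the first structural mode (sqrt(4) = 2 rad/s).
root = newton_root(0.0 + 2.0j)
```

Starting guesses near each undamped natural frequency let the iteration walk out each branch of the root locus as the flow parameters are varied, which is the spirit of the procedure described in the abstract.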

  19. Ad hoc supervision of general practice registrars as a 'community of practice': analysis, interpretation and re-presentation.

    PubMed

    Clement, T; Brown, J; Morrison, J; Nestel, D

    2016-05-01

    General practice registrars in Australia undertake most of their vocational training in accredited general practices. They typically see patients alone from the start of their community-based training and are expected to seek timely ad hoc support from their supervisor. Such ad hoc encounters are a mechanism for ensuring patient safety, but also provide an opportunity for learning and teaching. Wenger's (Communities of practice: learning, meaning, and identity. Cambridge University Press, New York, 1998) social theory of learning ('communities of practice') guided a secondary analysis of audio-recordings of ad hoc encounters. Data from one encounter is re-presented as an extended sequence to maintain congruence with the theoretical perspective and enhance vicariousness. An interpretive commentary communicates key features of Wenger's theory and highlights the researchers' interpretations. We argue that one encounter can reveal universal understandings of clinical supervision and that the process of naturalistic generalisation allows readers to transfer others' experiences to their own contexts. The paper raises significant analytic, interpretive, and representational issues. We highlight that report writing is an important, but infrequently discussed, part of research design. We discuss the challenges of supporting the learning and teaching that arises from adopting a socio-cultural lens and argue that such a perspective importantly captures the complex range of issues that work-based practitioners have to grapple with. This offers a challenge to how we research and seek to influence work-based learning and teaching in health care settings.

  20. Exploring the Current Landscape of Intravenous Infusion Practices and Errors (ECLIPSE): protocol for a mixed-methods observational study

    PubMed Central

    Blandford, Ann; Furniss, Dominic; Chumbley, Gill; Iacovides, Ioanna; Wei, Li; Cox, Anna; Mayer, Astrid; Schnock, Kumiko; Bates, David Westfall; Dykes, Patricia C; Bell, Helen; Dean Franklin, Bryony

    2016-01-01

    Introduction Intravenous medication is essential for many hospital inpatients. However, providing intravenous therapy is complex and errors are common. ‘Smart pumps’ incorporating dose error reduction software have been widely advocated to reduce error. However, little is known about their effect on patient safety, how they are used or their likely impact. This study will explore the landscape of intravenous medication infusion practices and errors in English hospitals and how smart pumps may relate to the prevalence of medication administration errors. Methods and analysis This is a mixed-methods study involving an observational quantitative point prevalence study to determine the frequency and types of errors that occur in the infusion of intravenous medication, and qualitative interviews with hospital staff to better understand infusion practices and the contexts in which errors occur. The study will involve 5 clinical areas (critical care, general medicine, general surgery, paediatrics and oncology), across 14 purposively sampled acute hospitals and 2 paediatric hospitals to cover a range of intravenous infusion practices. Data collectors will compare each infusion running at the time of data collection against the patient's medication orders to identify any discrepancies. The potential clinical importance of errors will be assessed. Quantitative data will be analysed descriptively; interviews will be analysed using thematic analysis. Ethics and dissemination Ethical approval has been obtained from an NHS Research Ethics Committee (14/SC/0290); local approvals will be sought from each participating organisation. Findings will be published in peer-reviewed journals and presented at conferences for academic and health professional audiences. Results will also be fed back to participating organisations to inform local policy, training and procurement. Aggregated findings will inform the debate on costs and benefits of the NHS investing in smart pump technology

  1. A Novel Method for Dissolved Phosphorus Analysis

    NASA Astrophysics Data System (ADS)

    Berry, J. M.; Spiese, C. E.

    2012-12-01

    High phosphorus loading is a major problem in the Great Lakes watershed. Phosphate enters waterways via both point and non-point sources (e.g., runoff, tile drainage, etc.), promoting eutrophication, and ultimately leading to algal blooms, hypoxia and loss of aquatic life. Quantification of phosphorus loading is typically done using the molybdenum blue method, which is known to have significant drawbacks. The molybdenum blue method requires strict control on time, involves toxic reagents that have limited shelf-life, and is generally unable to accurately measure sub-micromolar concentrations. This study aims to develop a novel reagent that will overcome many of these problems. Ethanolic europium(III) chloride and 8-hydroxyquinoline-5-sulfonic acid (hqs) were combined to form the bis-hqs complex (Eu-hqs). Eu-hqs was synthesized as the dipotassium salt via a simple one-pot procedure. This complex was found to be highly fluorescent (λex = 360 nm, λem = 510 nm) and exhibited a linear response upon addition of monohydrogen phosphate. The linear response ranged from 0.5 to 25 μM HPO₄²⁻ (15.5 to 775 μg P L⁻¹). It was also determined that Eu-hqs formed a 1:1 complex with phosphate. Maximum fluorescence was found at a pH of 8.50, and few interferences from other ions were found. Shelf-life of the reagent was at least one month, twice as long as most of the molybdenum blue reagent formulations. In the future, field tests will be undertaken in local rivers, lakes, and wetlands to determine the applicability of the complex to real-world analysis.
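A hedged sketch of how such a reagent is used quantitatively: fit a linear calibration curve to fluorescence standards spanning the reported linear range, then invert it for an unknown sample. The standard concentrations and intensity readings below are invented for illustration, not data from the study.

```python
# Linear calibration sketch for a fluorescent phosphate probe.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Phosphate standards (uM) spanning the reported 0.5-25 uM linear range,
# with hypothetical fluorescence readings (arbitrary units).
standards = [0.5, 5.0, 10.0, 15.0, 20.0, 25.0]
intensity = [12.0, 101.0, 205.0, 298.0, 404.0, 499.0]

slope, intercept = fit_line(standards, intensity)

def phosphate_uM(reading):
    """Invert the calibration to estimate phosphate concentration."""
    return (reading - intercept) / slope
```

In practice the calibration would be re-run per batch of reagent, and readings outside the fitted standard range should be diluted into it rather than extrapolated.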

  2. Empirical Analysis of Green Supply Chain Management Practices in Indian Automobile Industry

    NASA Astrophysics Data System (ADS)

    Luthra, S.; Garg, D.; Haleem, A.

    2014-04-01

    Environmental sustainability and green environmental issues are increasingly popular among researchers and supply chain practitioners. An attempt has been made to identify and empirically analyze green supply chain management (GSCM) practices in the Indian automobile industry. Six main GSCM practices (comprising 37 sub-practices) and four expected performance outcomes (comprising 16 performance measures) were identified from a literature review. A questionnaire-based survey was conducted to validate these practices and performance outcomes. 123 complete questionnaires were collected from Indian automobile organizations and used for empirical analysis of GSCM practices in the Indian automobile industry. Descriptive statistics were used to assess the current implementation status of GSCM practices, and multiple regression analysis was carried out to estimate the impact of currently adopted GSCM practices on expected organizational performance outcomes. The results suggest that environmental, economic, social and operational performances improve with the implementation of GSCM practices. This paper may help readers understand various GSCM implementation issues and help practicing managers improve their supply chain performance.

  3. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-05-25

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.

  4. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-11-23

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.
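The factor-analysis starting point of the two patent records above can be illustrated with a toy example. This sketch is not the patented method; it shows only the leading-component extraction (PCA via power iteration) that the spatial-simplicity constraint builds on, and the two pure spectra and pixel data are synthetic assumptions.

```python
# Toy spectral image: 20 pixels x 4 channels, each pixel dominated by
# one of two chemical phases -- the "spatially simple" case described
# in the patent abstract.
import math
import random

random.seed(0)

A = [1.0, 0.8, 0.1, 0.0]   # pure spectrum of phase A
B = [0.0, 0.1, 0.9, 1.0]   # pure spectrum of phase B
pixels = [[a + 0.01 * random.random() for a in (A if i < 10 else B)]
          for i in range(20)]

# Mean-center each channel, then form the 4x4 covariance matrix.
n, d = len(pixels), len(pixels[0])
mean = [sum(p[j] for p in pixels) / n for j in range(d)]
X = [[p[j] - mean[j] for j in range(d)] for p in pixels]
cov = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
        for b in range(d)] for a in range(d)]

# Power iteration for the dominant eigenvector: the leading spectral
# component, which here captures the A-vs-B contrast.
v = [1.0] * d
for _ in range(100):
    w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]
```

Projecting each pixel onto `v` separates the two phases; the patent's contribution is to constrain the subsequent factor model so that each pixel loads on only one or a few components, which this sketch does not attempt.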

  5. Unsaturated Shear Strength and Numerical Analysis Methods for Unsaturated Soils

    NASA Astrophysics Data System (ADS)

    Kim, D.; Kim, G.; Kim, D.; Baek, H.; Kang, S.

    2011-12-01

    The angles of shearing resistance (φb) and internal friction (φ') appear to be identical in the low suction range, but the angle of shearing resistance shows non-linearity as suction increases. In most numerical analyses, however, a fixed value for the angle of shearing resistance is applied even in the low suction range for practical reasons, often leading to a false conclusion. In this study, a numerical analysis has been undertaken employing the shear strength curve of unsaturated soils estimated from the residual water content of the SWCC, as proposed by Vanapalli et al. (1996). The result was also compared with that from a fixed value of φb. It is suggested that, in cases where it is difficult to measure the unsaturated shear strength curve through triaxial soil tests, the estimated shear strength curve using the residual water content can be a useful alternative. This result was applied to analyzing the slope stability of unsaturated soils. The effects of a continuous rainfall on slope stability were analyzed using the commercial program "SLOPE/W", with the coupled infiltration analysis program "SEEP/W", from GEO-SLOPE International Ltd. The results show that, prior to infiltration by the intensive rainfall, the safety factors using the estimated shear strength curve were substantially higher than those from the fixed value of φb at all time points. After the intensive infiltration, both methods showed similar behavior.
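The estimate the study applies can be written compactly. The sketch below is a hedged reading of the Vanapalli et al. (1996) form: extended Mohr-Coulomb strength with the suction term scaled by the normalized volumetric water content taken from the SWCC. All parameter values are illustrative, not from the study.

```python
# Unsaturated shear strength estimated from the SWCC (after Vanapalli
# et al., 1996): tau = c' + sigma_net*tan(phi') + suction*Theta*tan(phi'),
# where Theta = (theta - theta_r) / (theta_s - theta_r).
import math

def unsat_shear_strength(c_eff, sigma_net, suction, phi_eff_deg,
                         theta, theta_r, theta_s):
    """c_eff: effective cohesion (kPa); sigma_net: net normal stress
    (kPa); suction: matric suction ua - uw (kPa); theta, theta_r,
    theta_s: current, residual, and saturated volumetric water content."""
    tan_phi = math.tan(math.radians(phi_eff_deg))
    Theta = (theta - theta_r) / (theta_s - theta_r)
    return c_eff + sigma_net * tan_phi + suction * Theta * tan_phi

# At saturation (Theta = 1) the suction term acts at the full phi', so
# phi_b = phi'; as the soil dries, Theta falls and the suction
# contribution flattens -- the non-linearity noted in the abstract.
tau_wet = unsat_shear_strength(5.0, 50.0, 20.0, 30.0, 0.40, 0.05, 0.40)
tau_dry = unsat_shear_strength(5.0, 50.0, 200.0, 30.0, 0.10, 0.05, 0.40)
```

A fixed φb, by contrast, would keep the suction term growing linearly with suction, overstating strength at high suction and understating it near saturation.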

  6. Introducing and Integrating Gifted Education into an Existing Independent School: An Analysis of Practice

    ERIC Educational Resources Information Center

    McKibben, Stephen

    2013-01-01

    In this analysis of practice, I conduct a combination formative and summative program evaluation of an initiative introduced to serve gifted learners at The Ocean School (TOS), an independent, Pre-K-grade 8 day school located in a rural area of the West Coast. Using the best practices as articulated by the National Association of Gifted Children…

  7. Reporting Practices in Confirmatory Factor Analysis: An Overview and Some Recommendations

    ERIC Educational Resources Information Center

    Jackson, Dennis L.; Gillaspy, J. Arthur, Jr.; Purc-Stephenson, Rebecca

    2009-01-01

    Reporting practices in 194 confirmatory factor analysis studies (1,409 factor models) published in American Psychological Association journals from 1998 to 2006 were reviewed and compared with established reporting guidelines. Three research questions were addressed: (a) how do actual reporting practices compare with published guidelines? (b) how…

  8. Common Goals for the Science and Practice of Behavior Analysis: A Response to Critchfield

    ERIC Educational Resources Information Center

    Schneider, Susan M.

    2012-01-01

    In his scholarly and thoughtful article, "Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis," Critchfield (2011) discussed the science-practice frictions to be expected in any professional organization that attempts to combine these interests. He suggested that the Association for Behavior Analysis…

  9. Nursing Faculty Decision Making about Best Practices in Test Construction, Item Analysis, and Revision

    ERIC Educational Resources Information Center

    Killingsworth, Erin Elizabeth

    2013-01-01

    With the widespread use of classroom exams in nursing education there is a great need for research on current practices in nursing education regarding this form of assessment. The purpose of this study was to explore how nursing faculty members make decisions about using best practices in classroom test construction, item analysis, and revision in…

  10. Effect of practice management softwares among physicians of developing countries with special reference to Indian scenario by Mixed Method Technique

    PubMed Central

    Davey, Sanjeev; Davey, Anuradha

    2015-01-01

    Introduction: Currently, many cheaper "practice management software" (PMS) packages are available in developing countries, including India; despite their availability and benefits, their penetration and usage vary from low to moderate, justifying the importance of this study area. Materials and Methods: First, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (2009) guidelines were considered, followed by an extensive systematic review of available studies in the literature related to developing countries, using key search terms in the main abstracting databases: PubMed, EMBASE, EBSCO, BioMed Central, Cochrane Library and the WorldCat library, up to 15 June 2014; any kind of article, whether published or unpublished, in any form or language, indicating software usage was included. Thereafter, a meta-analysis of the Indian studies revealing the magnitude of usage in the Indian scenario was performed with OpenMeta[Analyst] software using a binary random-effects (RE) model. Studies from developed countries were excluded. Results: Of 57 studies included in the systematic review from developing countries, only 4 Indian studies were found eligible for meta-analysis. The RE model revealed non-significant results (total participants = 243,526; range: 100–226,228; overall odds ratio = 2.85; 95% confidence interval = P < 0.05; tests for heterogeneity: Q [df = 3] = 0.8, Het. P = 0.85). The overall magnitude of usage of PMS in Indian physicians' practice was, however, found to be between 10% and 45%. Conclusion: Although a variable and non-significant effect of PM software usage on the practice of physicians in developing countries like India was found, there is a need to recognize the hidden potential of this system. Hence, more in-depth research is needed in the future in order to find the real impact of this system. PMID:25949969
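The binary random-effects pooling reported above can be sketched with the standard DerSimonian-Laird estimator, the usual default in tools like OpenMeta[Analyst]. The study's data are not reproduced in the abstract, so the log odds ratios and variances below are hypothetical.

```python
# DerSimonian-Laird random-effects pooling of study odds ratios.
import math

# (log odds ratio, variance of log OR) for four hypothetical studies.
studies = [(math.log(2.0), 0.30), (math.log(3.5), 0.40),
           (math.log(2.5), 0.25), (math.log(3.0), 0.35)]

# Fixed-effect (inverse-variance) weights and pooled estimate.
w = [1.0 / v for _, v in studies]
pooled_fe = sum(wi * y for wi, (y, _) in zip(w, studies)) / sum(w)

# Cochran's Q heterogeneity statistic and between-study variance tau^2.
Q = sum(wi * (y - pooled_fe) ** 2 for wi, (y, _) in zip(w, studies))
df = len(studies) - 1
C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - df) / C)

# Random-effects weights fold tau^2 into each study's variance.
w_re = [1.0 / (v + tau2) for _, v in studies]
pooled_re = sum(wi * y for wi, (y, _) in zip(w_re, studies)) / sum(w_re)
pooled_or = math.exp(pooled_re)
```

With only four studies, as in the abstract, Q has just 3 degrees of freedom and the heterogeneity test has little power, which is one reason confidence in such a pooled estimate stays modest.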

  11. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture, milkfat, and phosphatase levels in cheeses will be determined by the following methods of analysis...

  12. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture, milkfat, and phosphatase levels in cheeses will be determined by the following methods of analysis...

  13. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture, milkfat, and phosphatase levels in cheeses will be determined by the following methods of analysis...

  14. Factor Analysis in Counseling Psychology Research, Training, and Practice: Principles, Advances, and Applications

    ERIC Educational Resources Information Center

    Kahn, Jeffrey H.

    2006-01-01

    Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) have contributed to test development and validation in counseling psychology, but additional applications have not been fully realized. The author presents an overview of the goals, terminology, and procedures of factor analysis; reviews best practices for extracting,…

  15. Language Ideology or Language Practice? An Analysis of Language Policy Documents at Swedish Universities

    ERIC Educational Resources Information Center

    Björkman, Beyza

    2014-01-01

    This article presents an analysis and interpretation of language policy documents from eight Swedish universities with regard to intertextuality, authorship and content analysis of the notions of language practices and English as a lingua franca (ELF). The analysis is then linked to Spolsky's framework of language policy, namely language…

  16. Visual cluster analysis and pattern recognition methods

    DOEpatents

    Osbourn, Gordon Cecil; Martinez, Rubel Francisco

    2001-01-01

    A method of clustering that uses a novel template to define a region of influence. Computation times can be significantly reduced by using neighboring approximation methods. The template and method are applicable to, and improve, pattern recognition techniques.
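
    The patented template itself is not detailed in this record. As a gauge of the general idea, here is a minimal sketch of region-of-influence clustering using a Gabriel-graph test (the circle on the segment joining two points) plus a link-length cap; both choices are my simplifications, not the patent's template.

    ```python
    import numpy as np

    def region_of_influence_clusters(pts, max_link=1.0):
        """Link two points when no third point lies strictly inside the circle
        whose diameter is the segment joining them (Gabriel test) and the pair
        is closer than max_link; clusters are connected components."""
        pts = np.asarray(pts, float)
        n = len(pts)
        parent = list(range(n))
        def find(i):                      # union-find with path compression
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i
        for i in range(n):
            for j in range(i + 1, n):
                d2 = np.sum((pts[i] - pts[j]) ** 2)
                if d2 > max_link ** 2:
                    continue              # simplification: cap link length
                mid = (pts[i] + pts[j]) / 2
                r2 = d2 / 4
                blocked = any(k not in (i, j)
                              and np.sum((pts[k] - mid) ** 2) < r2
                              for k in range(n))
                if not blocked:
                    parent[find(i)] = find(j)
        roots = [find(i) for i in range(n)]
        remap = {r: c for c, r in enumerate(dict.fromkeys(roots))}
        return [remap[r] for r in roots]

    labels = region_of_influence_clusters(
        [[0, 0], [0.1, 0], [0, 0.1], [5, 5], [5.1, 5]])
    print(labels)
    ```

    The two tight groups of points come out as separate clusters; the "neighboring approximation" mentioned in the record would replace the O(n^3) blocking test with a search over nearby points only.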

  17. Echinacea purpurea: Pharmacology, phytochemistry and analysis methods.

    PubMed

    Manayi, Azadeh; Vazirian, Mahdi; Saeidnia, Soodabeh

    2015-01-01

    Echinacea purpurea (Asteraceae) is a perennial medicinal herb with important immunostimulatory and anti-inflammatory properties, especially the alleviation of cold symptoms. The plant has also attracted scientists' attention to assess other aspects of its beneficial effects. For instance, antianxiety, antidepression, cytotoxicity, and antimutagenicity induced by the plant have been revealed in various studies. The findings of the clinical trials are controversial in terms of side effects. While some studies revealed beneficial effects of the plant on patients and no severe adverse effects, others have reported serious side effects including abdominal pain, angioedema, dyspnea, nausea, pruritus, rash, erythema, and urticaria. Other biological activities of the plant, such as antioxidant, antibacterial, antiviral, and larvicidal activities, have been reported in previous experimental studies. Different classes of secondary metabolites of the plant, such as alkamides, caffeic acid derivatives, polysaccharides, and glycoproteins, are believed to be biologically and pharmacologically active. Concurrent determination and single-compound analysis of cichoric acid and alkamides have been successfully developed, mainly using high-performance liquid chromatography (HPLC) coupled with different detectors, including UV spectrophotometric, coulometric electrochemical, and electrospray ionization mass spectrometric detectors. Despite the many experiments successfully accomplished using E. purpurea, the controversial results show that many questions remain unanswered, and future investigations may aim for complete recognition of the plant's mechanism of action using new, complementary methods. PMID:26009695

  18. Echinacea purpurea: Pharmacology, phytochemistry and analysis methods

    PubMed Central

    Manayi, Azadeh; Vazirian, Mahdi; Saeidnia, Soodabeh

    2015-01-01

    Echinacea purpurea (Asteraceae) is a perennial medicinal herb with important immunostimulatory and anti-inflammatory properties, especially the alleviation of cold symptoms. The plant has also attracted scientists’ attention to assess other aspects of its beneficial effects. For instance, antianxiety, antidepression, cytotoxicity, and antimutagenicity induced by the plant have been revealed in various studies. The findings of the clinical trials are controversial in terms of side effects. While some studies revealed beneficial effects of the plant on patients and no severe adverse effects, others have reported serious side effects including abdominal pain, angioedema, dyspnea, nausea, pruritus, rash, erythema, and urticaria. Other biological activities of the plant, such as antioxidant, antibacterial, antiviral, and larvicidal activities, have been reported in previous experimental studies. Different classes of secondary metabolites of the plant, such as alkamides, caffeic acid derivatives, polysaccharides, and glycoproteins, are believed to be biologically and pharmacologically active. Concurrent determination and single-compound analysis of cichoric acid and alkamides have been successfully developed, mainly using high-performance liquid chromatography (HPLC) coupled with different detectors, including UV spectrophotometric, coulometric electrochemical, and electrospray ionization mass spectrometric detectors. Despite the many experiments successfully accomplished using E. purpurea, the controversial results show that many questions remain unanswered, and future investigations may aim for complete recognition of the plant's mechanism of action using new, complementary methods. PMID:26009695

  19. Knowledge, attitude and practice related to infant feeding among women in rural Papua New Guinea: a descriptive, mixed method study

    PubMed Central

    2013-01-01

    Background Despite the well-recognized effectiveness of exclusive breastfeeding for the first six months of an infant's life in reducing infant mortality, adherence to this practice is not widespread in the developing world. Although several studies on infant nutrition practices have been conducted in urban settings of Papua New Guinea (PNG), there is only scant information on infant feeding practices in rural settings. Therefore, this study aimed to investigate knowledge, attitude and practice associated with exclusive breastfeeding in various locations in rural PNG. Methods A mixed method study using interviews based on a semi-structured questionnaire (n = 140) and Focus Group Discussions (FGDs) was conducted among mothers in rural PNG between August and September 2012. Participants were selected using convenience sampling. Included in the study were both primiparous and multiparous mothers with a child below the age of two years. Content analysis was used for qualitative data and descriptive statistics were used for quantitative data. Results Whereas most women indicated breastfeeding as a better way to feed babies, knowledge of the reasons for its superiority over infant formula was generally poor. Only 17% of mothers practiced exclusive breastfeeding for the first six months postpartum. Our study showed that the size of the gap between exclusive breastfeeding practice and global recommendations was striking. Taking into account the low educational profile of the participants, the disparity may be explained by the fact that most of the mothers in this study had no formal education on infant feeding. Conclusions This study showed a lack of understanding of the importance of, and poor adherence to, exclusive breastfeeding for the first six months postpartum among rural mothers. As exclusive breastfeeding promotion has been proved to be one of the most effective ways to improve infant survival, more attention should be given to it, especially targeting the large

  20. Benthic macroinvertebrates in lake ecological assessment: A review of methods, intercalibration and practical recommendations.

    PubMed

    Poikane, Sandra; Johnson, Richard K; Sandin, Leonard; Schartau, Ann Kristin; Solimini, Angelo G; Urbanič, Gorazd; Arbačiauskas, Kęstutis; Aroviita, Jukka; Gabriels, Wim; Miler, Oliver; Pusch, Martin T; Timm, Henn; Böhmer, Jürgen

    2016-02-01

    Legislation in Europe has been adopted to determine and improve the ecological integrity of inland and coastal waters. Assessment is based on four biotic groups, including benthic macroinvertebrate communities. For lakes, benthic invertebrates have been recognized as one of the most difficult organism groups to use in ecological assessment, and hitherto their use in ecological assessment has been limited. In this study, we review and intercalibrate 13 benthic invertebrate-based tools across Europe. These assessment tools address different human impacts: acidification (3 methods), eutrophication (3 methods), morphological alterations (2 methods), and a combination of the last two (5 methods). For intercalibration, the methods were grouped into four intercalibration groups, according to the habitat sampled and putative pressure. Boundaries of the 'good ecological status' were compared and harmonized using direct or indirect comparison approaches. To enable indirect comparison of the methods, three common pressure indices and two common biological multimetric indices were developed for larger geographical areas. Additionally, we identified the best-performing methods based on their responsiveness to different human impacts. Based on these experiences, we provide practical recommendations for the development and harmonization of benthic invertebrate assessment methods in lakes and similar habitats. PMID:26580734

  2. Methods of the computer-aided statistical analysis of microcircuits

    NASA Astrophysics Data System (ADS)

    Beliakov, Iu. N.; Kurmaev, F. A.; Batalov, B. V.

    Methods that are currently used for the computer-aided statistical analysis of microcircuits at the design stage are summarized. In particular, attention is given to methods for solving problems in statistical analysis, statistical planning, and factorial model synthesis by means of irregular experimental design. Efficient ways of reducing the computer time required for statistical analysis and numerical methods of microcircuit analysis are proposed. The discussion also covers various aspects of the organization of computer-aided microcircuit modeling and analysis systems.
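
    The statistical analysis the abstract refers to is commonly done at the design stage by Monte Carlo sampling of component tolerances. A minimal sketch on a hypothetical resistive divider follows; the component values, the uniform tolerance model, and the spec window are my assumptions for illustration, not details from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000

    # Nominal 10 kOhm / 10 kOhm divider built from 5%-tolerance resistors;
    # tolerances modeled (a simplifying assumption) as uniform within limits.
    r1 = rng.uniform(9_500, 10_500, N)
    r2 = rng.uniform(9_500, 10_500, N)
    gain = r2 / (r1 + r2)          # divider transfer ratio Vout/Vin

    mean, std = gain.mean(), gain.std()
    # Fraction of samples meeting an assumed spec of gain = 0.5 +/- 0.02
    yield_frac = np.mean(np.abs(gain - 0.5) < 0.02)
    print(round(mean, 3), round(std, 4), round(yield_frac, 3))
    ```

    The same loop scales to full circuit models: replace the transfer-ratio expression with a call to a circuit simulator, and the histogram of outputs gives the parametric yield, which is exactly the quantity whose computation time the paper's irregular experimental designs aim to reduce.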

  3. Grid Analysis and Display System (GrADS): A practical tool for earth science visualization

    NASA Technical Reports Server (NTRS)

    Kinter, James L., III; Doty, Brian E.

    1991-01-01

    Viewgraphs on the Grid Analysis and Display System (GrADS), a practical tool for earth science visualization, are presented. Topics covered include GrADS design goals, data sets, and temperature profiles.

  4. Trends in vasectomy. Analysis of one teaching practice.

    PubMed Central

    Reynolds, J. L.

    1998-01-01

    PROBLEM BEING ADDRESSED: How can a teaching practice develop a referral service and incorporate educational opportunities for family medicine residents, clinical clerks, and community family physicians? OBJECTIVE OF PROGRAM: To develop a high-quality vasectomy service within a teaching practice; to change the surgical procedure to the no-scalpel vasectomy (NSV) technique; to educate family medicine residents, clinical clerks, and community family physicians about vasectomy and the NSV technique; and to monitor outcomes and compare them with published results. MAIN COMPONENTS OF PROGRAM: The program took place in an urban family medicine residency program. Data on the number of procedures, the types of patients choosing vasectomy, and outcomes are presented, along with the number of learners who viewed, assisted with, or became competent to perform NSV. CONCLUSIONS: A few family medicine residents and some interested community physicians could be trained to perform NSV competently. Involving learners in the procedure does not seem to change the rate of complications. Images Figure 1 PMID:9559195

  5. Some Practical Guidelines for Teaching Dramatic Analysis to Beginning Students

    ERIC Educational Resources Information Center

    Pelias, Ronald J.; Ralph, Stephen D.

    1985-01-01

    Outlines common abuses that occur when students first use dramatic analysis in oral interpretation. Offers guidelines to help make students' efforts more productive; uses William Carlos Williams's poem "The Red Wheelbarrow" as an example. (PD)

  6. Using task analysis to generate evidence for strengthening midwifery education, practice, and regulation in Ethiopia

    PubMed Central

    Yigzaw, Tegbar; Carr, Catherine; Stekelenburg, Jelle; van Roosmalen, Jos; Gibson, Hannah; Gelagay, Mintwab; Admassu, Azeb

    2016-01-01

    Purpose Realizing aspirations for meeting the global reproductive, maternal, newborn, and child health goals depends not only on increasing the numbers but also on improving the capability of the midwifery workforce. We conducted a task analysis study to identify the needs for strengthening the midwifery workforce in Ethiopia. Methods We conducted a cross-sectional study of recently qualified midwives in Ethiopia. Purposively selected participants from representative geographic and practice settings completed a self-administered questionnaire, making judgments about the frequency of performance, criticality, competence, and location of training for a list of validated midwifery tasks. Using the Statistical Package for the Social Sciences, Version 20, we computed percentages and averages to describe participant and practice characteristics. We identified priority preservice education gaps by considering the tasks least frequently learned in preservice education, most frequently mentioned as not trained, and with the highest "not capable" response. Identification of top priorities for in-service training considered tasks with the highest "not capable" and "never done" responses. We determined the licensing exam blueprint by weighing the composite mean scores for the frequency and criticality variables and expert rating across practice categories. Results One hundred and thirty-eight midwives participated in the study. The majority of respondents recognized the importance of midwifery tasks (89%), felt they were capable (91.8%), reported doing them frequently (63.9%), and had learned them during preservice education (56.3%). We identified competence gaps in tasks related to obstetric complications, gynecology, public health, professional duties, and prevention of mother-to-child transmission of HIV. Moreover, our study helped to determine the composition of the licensing exam for university graduates. Conclusion The task analysis indicates that midwives provide critical reproductive

  7. Method Development for Analysis of Aspirin Tablets.

    ERIC Educational Resources Information Center

    Street, Kenneth W., Jr.

    1988-01-01

    Develops a lab experiment for introductory instrumental analysis that requires interference studies and optimizing of conditions. Notes the analysis of the aspirin is by visible spectrophotometric assay. Gives experimental details and discussion. (MVL)

  8. [Development of rapid methods for quantitative analysis of proteolytic reactions].

    PubMed

    Beloivan, O A; Tsvetkova, M N; Bubriak, O A

    2002-01-01

    Approaches to the development of rapid methods for the quantitative control of proteolytic reactions are discussed. These reactions have recently taken on special significance for many important problems of theoretical and practical medicine and biology, as well as for technological, pharmacological, and ecological monitoring. Traditional methods can be improved both by the use of immobilized enzymes and substrates and by combining various classical biochemical and immunological approaches. The synthesis of substrates with specified properties allows new methods to be realized for studying proteinase activity and the kinetic characteristics of the corresponding reactions, both in vitro and in vivo. The application of biosensor technology is a promising trend, since it saves analysis time and cost, allows the direct interaction between enzymes and their inhibitors and activators to be studied in real time, and permits quantitative measurements both in liquids and in air. Besides, biosensor techniques are readily compatible with computer data processing. PMID:12924013

  9. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Marketing Service, Dairy Programs, or the Official Methods of Analysis of the Association of Official... 7 Agriculture 3 2011-01-01 2011-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable...

  10. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA,...

  11. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA,...

  12. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural...

  13. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural...

  14. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Methods of analysis. 133.5 Section 133.5 Food and... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture, milkfat, and phosphatase levels in cheeses will be determined by the following methods of analysis...

  15. Concurrent implementation of the Crank-Nicolson method for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Ransom, J. B.; Fulton, R. E.

    1985-01-01

    To exploit the significant gains in computing speed provided by Multiple Instruction Multiple Data (MIMD) computers, concurrent methods for practical problems need to be investigated and test problems implemented on actual hardware. One such problem class is heat transfer analysis, which is important in many aerospace applications. This paper compares the efficiency of two alternate implementations of heat transfer analysis on an experimental MIMD computer called the Finite Element Machine (FEM). The implicit Crank-Nicolson method is used to solve the heat transfer equations concurrently by both iterative and direct methods. Actual timing results achieved for the two methods are compared, and their significance for more complex problems is discussed.
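
    The Crank-Nicolson scheme named above can be illustrated on the one-dimensional heat equation. This serial numpy sketch uses a dense direct solve (not the paper's concurrent iterative/direct comparison on the FEM hardware) and checks the result against the exact decay of a sine mode.

    ```python
    import numpy as np

    def crank_nicolson_heat(u0, alpha, dx, dt, steps):
        """Advance u_t = alpha * u_xx with fixed u = 0 boundaries using the
        unconditionally stable Crank-Nicolson scheme:
        (I - r/2 A) u^{n+1} = (I + r/2 A) u^n,  with r = alpha*dt/dx^2."""
        u = np.asarray(u0, float).copy()
        n = len(u)
        r = alpha * dt / dx**2
        A = np.zeros((n, n))
        for i in range(1, n - 1):           # interior second-difference stencil
            A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0
        lhs = np.eye(n) - 0.5 * r * A
        rhs = np.eye(n) + 0.5 * r * A
        for _ in range(steps):
            u = np.linalg.solve(lhs, rhs @ u)
            u[0] = u[-1] = 0.0              # Dirichlet boundaries
        return u

    # Decay of a sine mode on [0, 1], which has the exact solution
    # u(x, t) = exp(-alpha * pi^2 * t) * sin(pi * x).
    n, alpha, t_end, steps = 51, 1.0, 0.1, 200
    x = np.linspace(0.0, 1.0, n)
    dx = x[1] - x[0]
    u = crank_nicolson_heat(np.sin(np.pi * x), alpha, dx, t_end / steps, steps)
    exact = np.exp(-alpha * np.pi**2 * t_end) * np.sin(np.pi * x)
    err = float(np.max(np.abs(u - exact)))
    print(round(err, 5))
    ```

    Because the scheme is implicit, each time step requires a linear solve; the paper's concurrent question is precisely whether that solve is best done iteratively or directly when the matrix is distributed across processors. In production code the tridiagonal structure would be exploited rather than using a dense solve.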

  16. Degradation of learned skills. Effectiveness of practice methods on simulated space flight skill retention

    NASA Technical Reports Server (NTRS)

    Sitterley, T. E.; Berge, W. A.

    1972-01-01

    Manual flight control and emergency procedure task skill degradation was evaluated after time intervals of from 1 to 6 months. The tasks were associated with a simulated launch through the orbit insertion flight phase of a space vehicle. The results showed that acceptable flight control performance was retained for 2 months, deteriorating rapidly thereafter by a factor of 1.7 to 3.1, depending on the performance measure used. Procedural task performance showed unacceptable degradation after only 1 month, exceeding an order of magnitude after 4 months. The effectiveness of static rehearsal (checklists and briefings) and dynamic warmup (simulator practice) retraining methods was compared for the two tasks. Static rehearsal effectively countered procedural skill degradation, while some form of dynamic warmup appeared necessary for flight control skill retention. These differences between methods were apparently not solely a function of task type or retraining method, but also a function of the performance measures used for each task.

  17. The uniform asymptotic swallowtail approximation - Practical methods for oscillating integrals with four coalescing saddle points

    NASA Technical Reports Server (NTRS)

    Connor, J. N. L.; Curtis, P. R.; Farrelly, D.

    1984-01-01

    Methods that can be used in the numerical implementation of the uniform swallowtail approximation are described. An explicit expression for the approximation is presented to lowest order, showing that three problems must be overcome in practice before the approximation can be applied to any given problem. It is shown that a recently developed quadrature method can be used for the accurate numerical evaluation of the swallowtail canonical integral and its partial derivatives. Isometric plots of these are presented to illustrate some of their properties. The problem of obtaining the arguments of the swallowtail integral from an analytic function of its argument is considered, and two methods of solving it are described. The asymptotic evaluation of the butterfly canonical integral is also addressed.

  18. A Comparison of Low and High Structure Practice for Learning Interactional Analysis Skills

    ERIC Educational Resources Information Center

    Davis, Matthew James

    2011-01-01

    Innovative training approaches in work domains such as professional athletics, aviation, and the military have shown that specific types of practice can reliably lead to higher levels of performance for the average professional. This study describes the development of an initial effort toward creating a similar practice method for psychotherapy…

  19. Spelling Practice Intervention: A Comparison of Tablet PC and Picture Cards as Spelling Practice Methods for Students with Developmental Disabilities

    ERIC Educational Resources Information Center

    Seok, Soonhwa; DaCosta, Boaventura; Yu, Byeong Min

    2015-01-01

    The present study compared a spelling practice intervention using a tablet personal computer (PC) and picture cards with three students diagnosed with developmental disabilities. An alternating-treatments design with a non-concurrent multiple-baseline across participants was used. The aims of the present study were: (a) to determine if…

  20. Meta-research: Evaluation and Improvement of Research Methods and Practices.

    PubMed

    Ioannidis, John P A; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N

    2015-10-01

    As the scientific enterprise has grown in size and diversity, we need empirical evidence on the research process to test and apply interventions that make it more efficient and its results more reliable. Meta-research is an evolving scientific discipline that aims to evaluate and improve research practices. It includes thematic areas of methods, reporting, reproducibility, evaluation, and incentives (how to do, report, verify, correct, and reward science). Much work is already done in this growing field, but efforts to date are fragmented. We provide a map of ongoing efforts and discuss plans for connecting the multiple meta-research efforts across science worldwide.

  1. Practical method to obtain a lower bound to the three-tangle

    NASA Astrophysics Data System (ADS)

    Eltschka, Christopher; Siewert, Jens

    2014-02-01

    The quantitative assessment of the entanglement in multipartite quantum states is, apart from its fundamental importance, a practical problem. Recently there has been significant progress in developing new methods to determine certain entanglement measures. In particular, there is a method—in principle, analytical—to compute a certified lower bound for the three-tangle. The purpose of this work is to provide a manual for the implementation of this approach and to explicitly discuss several analytically solvable cases in order to gauge the numerical tools. Moreover, we derive a simple analytical bound for the mixed-state three-tangle.
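
    The certified lower bound of the paper is not reproduced here; as a gauge for such methods, the exact pure-state three-tangle can be computed directly from Cayley's hyperdeterminant (the Coffman-Kundu-Wootters formula). Mixed states, the paper's real subject, require a convex-roof construction that this sketch does not attempt.

    ```python
    import numpy as np

    def three_tangle(psi):
        """Three-tangle of a pure 3-qubit state via Cayley's hyperdeterminant
        (Coffman-Kundu-Wootters): tau = 4 * |d1 - 2*d2 + 4*d3|."""
        a = np.asarray(psi, complex).reshape(2, 2, 2)
        d1 = (a[0,0,0]**2 * a[1,1,1]**2 + a[0,0,1]**2 * a[1,1,0]**2
            + a[0,1,0]**2 * a[1,0,1]**2 + a[1,0,0]**2 * a[0,1,1]**2)
        d2 = (a[0,0,0]*a[1,1,1]*a[0,1,1]*a[1,0,0]
            + a[0,0,0]*a[1,1,1]*a[1,0,1]*a[0,1,0]
            + a[0,0,0]*a[1,1,1]*a[1,1,0]*a[0,0,1]
            + a[0,1,1]*a[1,0,0]*a[1,0,1]*a[0,1,0]
            + a[0,1,1]*a[1,0,0]*a[1,1,0]*a[0,0,1]
            + a[1,0,1]*a[0,1,0]*a[1,1,0]*a[0,0,1])
        d3 = (a[0,0,0]*a[1,1,0]*a[1,0,1]*a[0,1,1]
            + a[1,1,1]*a[0,0,1]*a[0,1,0]*a[1,0,0])
        return float(4 * abs(d1 - 2*d2 + 4*d3))

    # Standard analytically solvable cases used to gauge numerical tools:
    ghz = np.zeros(8); ghz[0] = ghz[7] = 1/np.sqrt(2)   # (|000>+|111>)/sqrt(2)
    w = np.zeros(8); w[1] = w[2] = w[4] = 1/np.sqrt(3)  # (|001>+|010>+|100>)/sqrt(3)
    print(round(three_tangle(ghz), 6), round(three_tangle(w), 6))
    ```

    The GHZ state gives the maximal value 1, while the W state gives 0 despite being genuinely tripartite entangled, which is exactly why certified lower bounds for mixed-state three-tangle are a nontrivial problem.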

  2. Recommendations and best practices for reference standards and reagents used in bioanalytical method validation.

    PubMed

    Bower, Joseph F; McClung, Jennifer B; Watson, Carl; Osumi, Takahiko; Pastre, Kátia

    2014-03-01

    The continued globalization of pharmaceutics has increased the demand for companies to know and understand the regulations that exist across the globe. One hurdle facing pharmaceutical and biotechnology companies developing new drug candidates is interpreting the current regulatory guidance documents and industry publications associated with bioanalytical method validation (BMV) from each of the different agencies throughout the world. The objective of this commentary is to provide our opinions on the best practices for reference standards and key reagents, such as metabolites and internal standards used in the support of regulated bioanalysis based on a review of current regulatory guidance documents and industry white papers for BMV.

  3. Meta-research: Evaluation and Improvement of Research Methods and Practices

    PubMed Central

    Ioannidis, John P. A.; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N.

    2015-01-01

    As the scientific enterprise has grown in size and diversity, we need empirical evidence on the research process to test and apply interventions that make it more efficient and its results more reliable. Meta-research is an evolving scientific discipline that aims to evaluate and improve research practices. It includes thematic areas of methods, reporting, reproducibility, evaluation, and incentives (how to do, report, verify, correct, and reward science). Much work is already done in this growing field, but efforts to date are fragmented. We provide a map of ongoing efforts and discuss plans for connecting the multiple meta-research efforts across science worldwide. PMID:26431313

  4. Three-dimensional Stress Analysis Using the Boundary Element Method

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Banerjee, P. K.

    1984-01-01

    The boundary element method is to be extended (as part of the NASA Inelastic Analysis Methods program) to the three-dimensional stress analysis of gas turbine engine hot section components. The analytical basis of the method (as developed in elasticity) is outlined, its numerical implementation is summarized, and the approaches to be followed in extending the method to include inelastic material response are indicated.

  5. Death: a concept analysis and application to practice.

    PubMed

    Gelling, L H

    1996-01-01

    Death is a commonly used concept but is surrounded by much mystery. The concept of death is examined using the Walker and Avant (1995) framework for concept analysis. The use of the concept death is considered in the intensive care unit. In the intensive care unit a conflict often exists between the curing culture and the inevitability of death.

  6. Strategic planning for public health practice using macroenvironmental analysis.

    PubMed Central

    Ginter, P M; Duncan, W J; Capper, S A

    1991-01-01

    Macroenvironmental analysis is the initial stage in comprehensive strategic planning. The authors examine the benefits of this type of analysis when applied to public health organizations and present a series of questions that should be answered prior to committing resources to scanning, monitoring, forecasting, and assessing components of the macroenvironment. Using illustrations from the public and private sectors, each question is examined with reference to specific challenges facing public health. Benefits are derived both from the process and the outcome of macroenvironmental analysis. Not only are data acquired that assist public health professionals to make decisions, but the analytical process required assures a better understanding of potential external threats and opportunities as well as an organization's strengths and weaknesses. Although differences exist among private and public as well as profit and not-for-profit organizations, macroenvironmental analysis is seen as more essential to the public and not-for-profit sectors than the private and profit sectors. This conclusion results from the extreme dependency of those areas on external environmental forces that cannot be significantly influenced or controlled by public health decision makers. PMID:1902305

  7. Newborn Hearing Screening: An Analysis of Current Practices

    ERIC Educational Resources Information Center

    Houston, K. Todd; Bradham, Tamala S.; Munoz, Karen F.; Guignard, Gayla Hutsell

    2011-01-01

    State coordinators of early hearing detection and intervention (EHDI) programs completed a strengths, weaknesses, opportunities, and threats, or SWOT, analysis that consisted of 12 evaluative areas of EHDI programs. For the newborn hearing screening area, a total of 293 items were listed by 49 EHDI coordinators, and themes were identified within…

  8. An Analysis of Ethical Considerations in Programme Design Practice

    ERIC Educational Resources Information Center

    Govers, Elly

    2014-01-01

    Ethical considerations are inherent to programme design decision-making, but not normally explicit. Nonetheless, they influence whose interests are served in a programme and who benefits from it. This paper presents an analysis of ethical considerations made by programme design practitioners in the context of a polytechnic in Aotearoa/New Zealand.…

  9. The Analysis of Athletic Performance: Some Practical and Philosophical Considerations

    ERIC Educational Resources Information Center

    Nelson, Lee J.; Groom, Ryan

    2012-01-01

    This article presents a hypothetical dialogue between a notational analyst (NA) recently schooled in the positivistic assessment of athletic performance, an "old-school" traditional coach (TC) who favours subjective analysis, and a pragmatic educator (PE). The conversation opens with NA and TC debating the respective value of quantitative and…

  10. Suspension, Race, and Disability: Analysis of Statewide Practices and Reporting

    ERIC Educational Resources Information Center

    Krezmien, Michael P.; Leone, Peter E.; Achilles, Georgianna M.

    2006-01-01

    This analysis of statewide suspension data from 1995 to 2003 in Maryland investigated disproportionate suspensions of minority students and students with disabilities. We found substantial increases in over-all rates of suspensions from 1995 to 2003, as well as disproportionate rates of suspensions for African American students, American Indian…

  11. Methods for analysis of fluoroquinolones in biological fluids

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Methods for analysis of 10 selected fluoroquinolone antibiotics in biological fluids are reviewed. Approaches for sample preparation, detection methods, limits of detection and quantitation and recovery information are provided for both single analyte and multi-analyte fluoroquinolone methods....

  12. A survey of castration methods and associated livestock management practices performed by bovine veterinarians in the United States

    PubMed Central

    2010-01-01

Background Castration of male calves destined for beef production is a common management practice performed in the United States amounting to approximately 15 million procedures per year. Societal concern about the moral and ethical treatment of animals is increasing. Therefore, production agriculture is faced with the challenge of formulating animal welfare policies relating to routine management practices such as castration. To enable the livestock industry to effectively respond to these challenges, there is a need for more data on management practices that are commonly used in cattle production systems. The objective of this survey was to describe castration methods, adverse events and husbandry procedures performed by U.S. veterinarians at the time of castration. Invitations to participate in the survey were sent to email addresses of 1,669 members of the American Association of Bovine Practitioners and 303 members of the Academy of Veterinary Consultants. Results After partially completed surveys and missing data were omitted, 189 responses were included in the analysis. Surgical castration with a scalpel followed by testicular removal by twisting (calves <90 kg) or an emasculator (calves >90 kg) was the most common method of castration used. The potential risk of injury to the operator, size of the calf, handling facilities and experience with the technique were the most important considerations used to determine the method of castration used. Swelling, stiffness and increased lying time were the most prevalent adverse events observed following castration. One in five practitioners reports using an analgesic or local anesthetic at the time of castration. Approximately 90% of respondents indicated that they vaccinate and dehorn calves at the time of castration. Over half the respondents use disinfectants, prophylactic antimicrobials and tetanus toxoid to reduce complications following castration. 
Conclusions The results of this survey describe current methods

  13. Adaptation of Cost Analysis Studies in Practice Guidelines.

    PubMed

    Zervou, Fainareti N; Zacharioudakis, Ioannis M; Pliakos, Elina Eleftheria; Grigoras, Christos A; Ziakas, Panayiotis D; Mylonakis, Eleftherios

    2015-12-01

Clinical guidelines play a central role in day-to-day practice. We assessed the degree of incorporation of cost analyses into guidelines and identified modifiable characteristics that could affect the level of incorporation. We selected the 100 most cited guidelines listed on the National Guideline Clearinghouse (http://www.guideline.gov) and determined the number of guidelines that used cost analyses in their reasoning and the overall percentage of incorporation of relevant cost analyses available in PubMed. Differences between medical specialties were also studied. Then, we performed a case-control study using incorporated and not incorporated cost analyses after 1:1 matching by study subject and compared them by the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement requirements and other criteria. We found that 57% of guidelines do not use any cost justification. Guidelines incorporate a weighted average of 6.0% (95% confidence interval [CI] 4.3-7.9) among 3396 available cost analyses, with cardiology and infectious diseases guidelines incorporating 10.8% (95% CI 5.3-18.1) and 9.9% (95% CI 3.9-18.2), respectively, and hematology/oncology and urology guidelines incorporating 4.5% (95% CI 1.6-8.6) and 1.6% (95% CI 0.4-3.5), respectively. Based on the CHEERS requirements, the mean number of items reported by the 148 incorporated cost analyses was 18.6 (SD = 3.7), a small but significant difference over controls (17.8 items; P = 0.02). Included analyses were also more likely to directly relate cost reductions to healthcare outcomes (92.6% vs 81.1%, P = 0.004) and declare the funding source (72.3% vs 53.4%, P < 0.001), while a similar number of cases and controls reported a noncommercial funding source (71% vs 72.7%; P = 0.8). Guidelines remain an underused mechanism for the cost-effective allocation of available resources, and a minority of practice guidelines incorporates cost analyses, utilizing only 6% of the available

  14. Screening Workers: An Examination and Analysis of Practice and Public Policy.

    ERIC Educational Resources Information Center

    Greenfield, Patricia A.; And Others

    1989-01-01

Discusses methods of screening job applicants and issues raised by screening procedures. Includes legal ramifications, current practices in Britain and the United States, future directions, and the employment interview. (JOW)

  15. Visceral fat estimation method by bioelectrical impedance analysis and causal analysis

    NASA Astrophysics Data System (ADS)

    Nakajima, Hiroshi; Tasaki, Hiroshi; Tsuchiya, Naoki; Hamaguchi, Takehiro; Shiga, Toshikazu

    2011-06-01

It has been clarified that abdominal visceral fat accumulation is closely associated with lifestyle diseases and metabolic syndrome. The gold standard in medical fields is the visceral fat area measured by an X-ray computed tomography (CT) scan or magnetic resonance imaging. However, these measurements are highly invasive and costly; in particular, a CT scan causes X-ray exposure. For these reasons, medical fields need an instrument for visceral fat measurement that is minimally invasive, easy to use, and low cost. The article proposes a simple and practical method of visceral fat estimation employing bioelectrical impedance analysis and causal analysis. In the method, the abdominal shape and the dual impedances of the abdominal surface and the whole body are measured to estimate the visceral fat area based on a cause-effect structure. The structure is designed according to the nature of abdominal body composition and fine-tuned by statistical analysis. Experiments were conducted to investigate the proposed model: 180 subjects were recruited and measured by both a CT scan and the proposed method. The acquired model explained the measurement principle well, and the correlation coefficient with the CT scan measurements was 0.88.

  16. Preventing childhood obesity during infancy in UK primary care: a mixed-methods study of HCPs' knowledge, beliefs and practice

    PubMed Central

    2011-01-01

Background There is a strong rationale for intervening in early childhood to prevent obesity. Over a quarter of infants gain weight more rapidly than desirable during the first six months of life, putting them at greater risk of obesity in childhood. However, little is known about UK healthcare professionals' (HCPs) approach to primary prevention. This study explored obesity-related knowledge of UK HCPs and the beliefs and current practice of general practitioners (GPs) and practice nurses in relation to identifying infants at risk of developing childhood obesity. Method Survey of UK HCPs (GPs, practice nurses, health visitors, nursery, community and children's nurses). HCPs (n = 116) rated their confidence in providing infant feeding advice and completed the Obesity Risk Knowledge Scale (ORK-10). Semi-structured interviews with a sub-set of 12 GPs and 6 practice nurses were audio recorded and transcribed verbatim. Thematic analysis was applied using an interpretative, inductive approach. Results GPs were less confident about giving advice about infant feeding than health visitors (p = 0.001) and nursery nurses (p = 0.009) but more knowledgeable about the health risks of obesity (p < 0.001) than nurses (p = 0.009). HCPs who were consulted more often about feeding were less knowledgeable about the risks associated with obesity (r = -0.34, n = 114, p < 0.001). There was no relationship between HCPs' ratings of confidence in their advice and their knowledge of the obesity risk. Six main themes emerged from the interviews: 1) Attribution of childhood obesity to family environment, 2) Infant feeding advice as the health visitor's role, 3) Professional reliance on anecdotal or experiential knowledge about infant feeding, 4) Difficulties with recognition of, or lack of concern for, infants "at risk" of becoming obese, 5) Prioritising relationship with parent over best practice in infant feeding and 6) Lack of shared understanding for dealing with early years

  17. Analysis of post audits for Gulf of Mexico completions leads to continuous improvement in completion practices

    SciTech Connect

    Pashen, M.A.; McLeod, H.O. Jr.

    1996-12-31

Final production rate alone is not an adequate measure of the success of a well completion. Rather, we must estimate the "potential" of a reservoir and judge the ultimate success of a completion on how close we come to achieving this potential. Specific productivity indexes (SPIs, BFPD/(PSI*FT)), specific injectivity indexes (SIIs, BFPD/(PSI*FT)), and completion efficiencies (CEs, percent of Darcy radial flow) can be calculated at various times throughout a well completion. Analysis of these data quantifies the efficiency of the completion after each individual completion operation, allowing a determination of the effects of each completion practice to be made. In addition to completion efficiency data, a comparison of gravel placement volumes behind casing helps quantify optimum gravel packing procedures. Twenty-two Gulf of Mexico completions have been analyzed using this technique. This paper will detail the results of this analysis, in particular the productivity effects of various methods of underbalanced perforating, gravel packing, and well control. Items of discussion include: the effects of underbalanced perforating on well performance, the effects of flowback after perforating on perforation tunnel cleaning, productivity impacts of various types of well control methods following perforating and gravel packing, and comparisons of gravel pack design parameters and gravel placement behind casing.

  18. Why and How Do Nursing Homes Implement Culture Change Practices? Insights from Qualitative Interviews in a Mixed Methods Study

    PubMed Central

    Shield, Renée R.; Looze, Jessica; Tyler, Denise; Lepore, Michael; Miller, Susan C.

    2015-01-01

    Objective To understand the process of instituting culture change (CC) practices in nursing homes (NHs). Methods NH Directors of Nursing (DONs) and Administrators (NHAs) at 4,149 United States NHs were surveyed about CC practices. Follow-up interviews with 64 NHAs were conducted and analyzed by a multidisciplinary team which reconciled interpretations recorded in an audit trail. Results The themes include: 1) Reasons for implementing CC practices vary; 2) NH approaches to implementing CC practices are diverse; 3) NHs consider resident mix in deciding to implement practices; 4) NHAs note benefits and few implementation costs of implementing CC practices; 5) Implementation of changes is challenging and strategies for change are tailored to the challenges encountered; 6) Education and communication efforts are vital ways to institute change; and 7) NHA and other staff leadership is key to implementing changes. Discussion Diverse strategies and leadership skills appear to help NHs implement reform practices, including CC innovations. PMID:24652888

  19. Integration of Formal Job Hazard Analysis & ALARA Work Practice

    SciTech Connect

    NELSEN, D.P.

    2002-09-01

ALARA work practices have traditionally centered on reducing radiological exposure and controlling contamination. As such, ALARA policies and procedures are not well suited to a wide range of chemical and human health issues. Assessing relative risk, identifying appropriate engineering/administrative controls and selecting proper Personal Protective Equipment (PPE) for non-nuclear work activities extends beyond the limitations of traditional ALARA programs. Forging a comprehensive safety management program in today's (2002) work environment requires a disciplined dialog between health and safety professionals (e.g. safety, engineering, environmental, quality assurance, industrial hygiene, ALARA, etc.) and personnel working in the field. Integrating organizational priorities, maintaining effective pre-planning of work and supporting a team-based approach to safety management represents today's hallmark of safety excellence. Relying on the mandates of any single safety program does not provide industrial hygiene with the tools necessary to implement an integrated safety program. The establishment of tools and processes capable of sustaining a comprehensive safety program represents a key responsibility of industrial hygiene. Fluor Hanford has built integrated safety management around three programmatic attributes: (1) Integration of radiological, chemical and ergonomic issues under a single program. (2) Continuous improvement in routine communications among work planning/scheduling, job execution and management. (3) Rapid response to changing work conditions, formalized work planning and integrated worker involvement.

  20. Methods for Mediation Analysis with Missing Data

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Wang, Lijuan

    2013-01-01

Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis including listwise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…
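The approaches compared in that abstract differ mainly in how much of the observed data they retain. As a minimal sketch of the simplest one, listwise deletion, the following simulates a mediation model X → M → Y with missingness in the mediator and recovers the indirect effect a·b from complete cases (the model, sample size, and missingness rate are illustrative, not taken from the paper; under MCAR listwise deletion is unbiased but wastes data):

```python
import random

random.seed(1)

# Simulate a simple mediation model: X -> M -> Y
# true a = 0.5, true b = 0.7, so the indirect effect a*b = 0.35
n = 5000
X = [random.gauss(0, 1) for _ in range(n)]
M = [0.5 * x + random.gauss(0, 1) for x in X]
Y = [0.7 * m + random.gauss(0, 1) for m in M]

# Make roughly 30% of the mediator missing completely at random (MCAR)
M_obs = [m if random.random() > 0.3 else None for m in M]

def slope(xs, ys):
    """OLS slope of y on x."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Listwise deletion: keep only rows where the mediator is observed
rows = [(x, m, y) for x, m, y in zip(X, M_obs, Y) if m is not None]
a_hat = slope([r[0] for r in rows], [r[1] for r in rows])  # X -> M path
b_hat = slope([r[1] for r in rows], [r[2] for r in rows])  # M -> Y path
indirect = a_hat * b_hat
print(f"listwise indirect effect: {indirect:.2f}  (true value: 0.35)")
```

With MNAR missingness or a higher missing rate, the deletion estimators degrade, which is the motivation for the MI and two-stage alternatives the paper evaluates.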

  1. A situated practice of ethics for participatory visual and digital methods in public health research and practice: a focus on digital storytelling.

    PubMed

    Gubrium, Aline C; Hill, Amy L; Flicker, Sarah

    2014-09-01

    This article explores ethical considerations related to participatory visual and digital methods for public health research and practice, through the lens of an approach known as "digital storytelling." We begin by briefly describing the digital storytelling process and its applications to public health research and practice. Next, we explore 6 common challenges: fuzzy boundaries, recruitment and consent to participate, power of shaping, representation and harm, confidentiality, and release of materials. We discuss their complexities and offer some considerations for ethical practice. We hope this article serves as a catalyst for expanded dialogue about the need for high standards of integrity and a situated practice of ethics wherein researchers and practitioners reflexively consider ethical decision-making as part of the ongoing work of public health.

  2. A Situated Practice of Ethics for Participatory Visual and Digital Methods in Public Health Research and Practice: A Focus on Digital Storytelling

    PubMed Central

    Hill, Amy L.; Flicker, Sarah

    2014-01-01

    This article explores ethical considerations related to participatory visual and digital methods for public health research and practice, through the lens of an approach known as “digital storytelling.” We begin by briefly describing the digital storytelling process and its applications to public health research and practice. Next, we explore 6 common challenges: fuzzy boundaries, recruitment and consent to participate, power of shaping, representation and harm, confidentiality, and release of materials. We discuss their complexities and offer some considerations for ethical practice. We hope this article serves as a catalyst for expanded dialogue about the need for high standards of integrity and a situated practice of ethics wherein researchers and practitioners reflexively consider ethical decision-making as part of the ongoing work of public health. PMID:23948015

  3. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.
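The moment formulas for combinations of failure modes mentioned in that abstract follow from independence: for independent Poisson failure modes, both the rates and the estimator variances add. A hedged sketch with invented event counts (the data and the crude normal-approximation interval are illustrative only; the report itself recommends more careful interval methods):

```python
import math

# Hypothetical operational event data: (failure count, exposure time in years)
# for three independent failure modes of the same component.
modes = {"mode_A": (4, 120.0), "mode_B": (1, 120.0), "mode_C": (7, 120.0)}

# MLE of each mode's rate: lambda_i = n_i / T_i, with Var(lambda_hat_i) ~ n_i / T_i**2
rates = {m: n / t for m, (n, t) in modes.items()}
variances = {m: n / t**2 for m, (n, t) in modes.items()}

# For independent modes, the moments add: combined rate and variance are sums.
total_rate = sum(rates.values())
total_var = sum(variances.values())

# Rough 95% interval via the normal approximation (a crude sketch; a real
# analysis would use chi-square/gamma intervals from standard reliability texts).
half_width = 1.96 * math.sqrt(total_var)
print(f"combined rate: {total_rate:.3f} per year "
      f"(approx 95% CI {total_rate - half_width:.3f} to {total_rate + half_width:.3f})")
```

With these numbers the combined estimate is 12 failures over 120 years, i.e. 0.1 per year; the sketch shows only the additivity of the moments, not the model-selection guidance the report covers.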

  4. A method for obtaining practical flutter-suppression control laws using results of optimal control theory

    NASA Technical Reports Server (NTRS)

    Newson, J. R.

    1979-01-01

The results of optimal control theory are used to synthesize a feedback filter. The feedback filter is used to force the output of the filtered frequency response to match that of a desired optimal frequency response over a finite frequency range. This matching is accomplished by employing a nonlinear programming algorithm to search for the coefficients of the feedback filter that minimize the error between the optimal frequency response and the filtered frequency response. The method is applied to the synthesis of an active flutter-suppression control law for an aeroelastic wind-tunnel model. It is shown that the resulting control law suppresses flutter over a wide range of subsonic Mach numbers. This is a promising method for synthesizing practical control laws using the results of optimal control theory.

  5. Practical Integration-Free Episomal Methods for Generating Human Induced Pluripotent Stem Cells.

    PubMed

    Kime, Cody; Rand, Tim A; Ivey, Kathryn N; Srivastava, Deepak; Yamanaka, Shinya; Tomoda, Kiichiro

    2015-01-01

    The advent of induced pluripotent stem (iPS) cell technology has revolutionized biomedicine and basic research by yielding cells with embryonic stem (ES) cell-like properties. The use of iPS-derived cells for cell-based therapies and modeling of human disease holds great potential. While the initial description of iPS cells involved overexpression of four transcription factors via viral vectors that integrated within genomic DNA, advances in recent years by our group and others have led to safer and higher quality iPS cells with greater efficiency. Here, we describe commonly practiced methods for non-integrating induced pluripotent stem cell generation using nucleofection of episomal reprogramming plasmids. These methods are adapted from recent studies that demonstrate increased hiPS cell reprogramming efficacy with the application of three powerful episomal hiPS cell reprogramming factor vectors and the inclusion of an accessory vector expressing EBNA1.

  6. Exploration of Methods Used by Pharmacy Professional Programs to Contract with Experiential Practice Sites

    PubMed Central

    Garavalia, Linda; Gubbins, Paul O.; Ruehter, Valerie

    2016-01-01

    Objective. To explore methods used by pharmacy programs to attract and sustain relationships with preceptors and experiential practice sites. Methods. Interviews with eight focus groups of pharmacy experiential education experts (n=35) were conducted at two national pharmacy meetings. A semi-structured interview guide was used. Focus group interviews were recorded, transcribed verbatim, and categorically coded independently by two researchers. Codes were compared, consensus was reached through discussion, and two experiential education experts assisted with interpretation of the coded data. Results. Six themes emerged consistently across focus groups: a perceived increase in preceptor compensation, intended vs actual use of payments by sites, concern over renegotiation of established compensation, costs and benefits of experiential students, territorialism, and motives. Conclusion. Fostering a culture of collaboration may counteract potentially competitive strategies to gain sites. Participants shared a common interest in providing high-quality experiential learning where sites and preceptors participated for altruistic reasons, rather than compensation. PMID:27073279

  7. Multiscale Methods for Nuclear Reactor Analysis

    NASA Astrophysics Data System (ADS)

    Collins, Benjamin S.

The ability to accurately predict local pin powers in nuclear reactors is necessary to understand the mechanisms that cause fuel pin failure during steady state and transient operation. In the research presented here, methods are developed to improve the local solution using high order methods with boundary conditions from a low order global solution. Several different core configurations were tested to determine the improvement in the local pin powers compared to the standard techniques that use diffusion theory and pin power reconstruction (PPR). Two different multiscale methods were developed and analyzed: the post-refinement multiscale method and the embedded multiscale method. The post-refinement multiscale methods use the global solution to determine boundary conditions for the local solution. The local solution is solved using either a fixed boundary source or an albedo boundary condition; this solution is "post-refinement" and thus has no impact on the global solution. The embedded multiscale method allows the local solver to change the global solution to provide an improved global and local solution. The post-refinement multiscale method is assessed using three core designs. When the local solution has more energy groups, the fixed source method has some difficulties near the interface; however, the albedo method works well for all cases. In order to remedy the issue with boundary condition errors for the fixed source method, a buffer region is used to act as a filter, which decreases the sensitivity of the solution to the boundary condition. Both the albedo and fixed source methods benefit from the use of a buffer region. Unlike the post-refinement method, the embedded multiscale method alters the global solution. The ability to change the global solution allows for refinement in areas where the errors in the few group nodal diffusion are typically large. 
The embedded method is shown to improve the global solution when it is applied to a MOX/LEU assembly

  8. [Methods and applications of population viability analysis (PVA): a review].

    PubMed

    Tian, Yu; Wu, Jian-Guo; Kou, Xiao-Jun; Wang, Tian-Ming; Smith, Andrew T; Ge, Jian-Ping

    2011-01-01

With the accelerating human consumption of natural resources, the problems associated with endangered species caused by habitat loss and fragmentation have become greater and more urgent than ever. Conceptually associated with the theories of island biogeography, population viability analysis (PVA) has been one of the most important approaches in studying and protecting endangered species, and this methodology has occupied a central place in conservation biology and ecology in the past several decades. PVA has been widely used and proven effective in many cases, but its predictive ability and accuracy are still in question. Also, its application needs to expand. To overcome some of the problems, we believe that PVA needs to incorporate some principles and methods from other fields, particularly landscape ecology and sustainability science. Integrating landscape pattern and socioeconomic factors into PVA will make the approach theoretically more comprehensive and practically more useful. Here, we reviewed the history, basic concepts, research methods, and modeling applications (and their accuracies) of PVA, and propose perspectives for this field. PMID:21548317

  9. Practice Makes Perfect: Improving Students' Skills in Understanding and Avoiding Plagiarism with a Themed Methods Course

    ERIC Educational Resources Information Center

    Estow, Sarah; Lawrence, Eva K.; Adams, Kathrynn A.

    2011-01-01

    To address the issue of plagiarism, students in two undergraduate Research Methods and Analysis courses conducted, analyzed, and wrote up original research on the topic of plagiarism. Students in an otherwise identical course completed the same assignments but examined a different research topic. At the start and end of the semester, all students…

  10. [The Application of the Fault Tree Analysis Method in Medical Equipment Maintenance].

    PubMed

    Liu, Hongbin

    2015-11-01

In this paper, the traditional fault tree analysis method is presented, and its application to medical equipment maintenance is described in detail. Significant changes are made when the traditional fault tree analysis method is adapted to medical equipment maintenance: the logic symbols, logic analysis, and calculations are abandoned, along with the method's complicated procedures, keeping only its intuitive and practical fault tree diagram. The diagram itself also differs: the fault tree is no longer a logic tree but a thinking tree for troubleshooting, the definition of the tree's nodes is different, and the composition of its branches is different.
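The "thinking tree" that replaces the logic tree in that abstract can be sketched as a plain branching diagnostic structure walked during troubleshooting. The following toy example is purely illustrative (the device, symptoms, and repair actions are invented, not from the paper):

```python
# A diagnostic "thinking tree": each node is a symptom or test, and leaves
# carry the repair action. No Boolean gates or probability calculations,
# only the branching structure used to guide troubleshooting.
class Node:
    def __init__(self, check, children=None, action=None):
        self.check = check            # symptom or test to perform
        self.children = children or []
        self.action = action          # repair action if this is a leaf

def walk(node, observed):
    """Follow the first branch whose check matches an observed symptom."""
    if node.action is not None:
        return node.action
    for child in node.children:
        if child.check in observed:
            return walk(child, observed)
    return "escalate to manufacturer service"

# Hypothetical tree for a medical device that will not power on.
tree = Node("device will not power on", children=[
    Node("no mains voltage at outlet", action="restore facility power"),
    Node("fuse blown", action="replace fuse, then test for shorts"),
    Node("power supply LED off", action="replace power supply module"),
])

print(walk(tree, {"fuse blown"}))  # -> replace fuse, then test for shorts
```

When no branch matches the observed symptoms, the walk falls through to an escalation step, mirroring how a maintenance technician exhausts the diagram before seeking outside service.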

  11. Reconciling Data from Different Sources: Practical Realities of Using Mixed Methods to Identify Effective High School Practices

    ERIC Educational Resources Information Center

    Smith, Thomas M.; Cannata, Marisa; Haynes, Katherine Taylor

    2016-01-01

    Background/Context: Mixed methods research conveys multiple advantages to the study of complex phenomena and large organizations or systems. The benefits are derived from drawing on the strengths of qualitative methods to answer questions about how and why a phenomenon occurs and those of quantitative methods to examine how often a phenomenon…

  12. Reforming High School Science for Low-Performing Students Using Inquiry Methods and Communities of Practice

    NASA Astrophysics Data System (ADS)

    Bolden, Marsha Gail

Some schools fall short of the high demand to increase science scores on state exams because low-performing students enter high school unprepared for high school science. Low-performing students are not successful in high school for many reasons. However, using inquiry methods has improved students' understanding of science concepts. The purpose of this qualitative research study was to investigate teachers' lived experiences with using inquiry methods to motivate low-performing high school science students in an inquiry-based program called Xtreem Science. Fifteen teachers were selected from the Xtreem Science program, a program designed to assist teachers in motivating struggling science students. The research questions involved understanding (a) teachers' experiences in using inquiry methods, (b) challenges teachers face in using inquiry methods, and (c) how teachers describe students' responses to inquiry methods. The data collection and analysis strategy focused on capturing and understanding the teachers' feelings, perceptions, and attitudes in their lived experience of teaching with inquiry methods and motivating struggling students. Analysis of interview responses revealed that teachers had good experiences with inquiry, reported that inquiry shaped their teaching style and approach to topics, and felt that using inquiry methods improved student learning. Inquiry gave low-performing students opportunities to catch up and learn information that moved them to the next level of science courses. Implications for positive social change include providing teachers and school district leaders with information to help improve the performance of low-performing science students.

  13. Practical Estimates of Field-Saturated Hydraulic Conductivity of Bedrock Outcrops using a Modified Bottomless Bucket Method

    NASA Astrophysics Data System (ADS)

    Mirus, B. B.; Perkins, K. S.

    2012-12-01

The bottomless bucket (BB) approach (Nimmo et al., VZJ, 2009) is a cost-effective method for rapidly characterizing field-saturated hydraulic conductivity Kfs of soils and alluvial deposits. This practical approach is of particular value for quantifying infiltration rates in remote areas with limited accessibility. A similar approach for bedrock outcrops is also of great value for improving quantitative understanding of infiltration and recharge in rugged terrain. We develop a simple modification to the BB method for application to bedrock outcrops, which uses a non-toxic, quick-drying silicone gel to seal the BB to the bedrock. These modifications to the field method require only minor changes to the analytical solution for calculating Kfs on soils. We investigate the reproducibility of the method with laboratory experiments on a previously studied calcarenite rock and conduct a sensitivity analysis to quantify uncertainty in our predictions. We apply the BB method on both bedrock and soil for sites on Pahute Mesa, which is located in a remote area of the Nevada National Security Site. The bedrock BB tests may require monitoring over several hours to days, depending on infiltration rates, which necessitates a cover to prevent evaporative losses. Our field and laboratory results compare well to Kfs values inferred from independent reports, which suggests the modified BB method can provide useful estimates and facilitate simple hypothesis testing. The ease with which the bedrock BB method can be deployed should facilitate more rapid in-situ data collection than is possible with alternative methods for quantitative characterization of infiltration into bedrock. (Figure caption: typical deployment of bedrock bottomless buckets (BBBs) on an outcrop of volcanic tuff before the application of water.)

  14. Practical estimates of field-saturated hydraulic conductivity of bedrock outcrops using a modified bottomless bucket method

    USGS Publications Warehouse

    Mirus, Benjamin B.; Perkins, Kim S.

    2012-01-01

    The bottomless bucket (BB) approach (Nimmo et al., 2009a) is a cost-effective method for rapidly characterizing field-saturated hydraulic conductivity Kfs of soils and alluvial deposits. This practical approach is of particular value for quantifying infiltration rates in remote areas with limited accessibility. A similar approach for bedrock outcrops is also of great value for improving quantitative understanding of infiltration and recharge in rugged terrain. We develop a simple modification to the BB method for application to bedrock outcrops, which uses a non-toxic, quick-drying silicone gel to seal the BB to the bedrock. These modifications to the field method require only minor changes to the analytical solution for calculating Kfs on soils. We investigate the reproducibility of the method with laboratory experiments on a previously studied calcarenite rock and conduct a sensitivity analysis to quantify uncertainty in our predictions. We apply the BB method on both bedrock and soil for sites on Pahute Mesa, which is located in a remote area of the Nevada National Security Site. The bedrock BB tests may require monitoring over several hours to days, depending on infiltration rates, which necessitates a cover to prevent evaporative losses. Our field and laboratory results compare well to Kfs values inferred from independent reports, which suggests the modified BB method can provide useful estimates and facilitate simple hypothesis testing. The ease with which the bedrock BB method can be deployed should facilitate more rapid in-situ data collection than is possible with alternative methods for quantitative characterization of infiltration into bedrock.
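The BB analysis itself reduces to a closed-form falling-head calculation. The sketch below assumes the simplified solution of the form Kfs = (L_G/t) · ln[(D₀ + L_G)/L_G] from Nimmo et al. (2009a), where D₀ is the initial ponded depth and L_G the macroscopic capillary length; the L_G value used here is illustrative, not the authors' field parameterization.

```python
import math

def kfs_falling_head(d0_cm, t_hr, lg_cm=8.0):
    """Estimate field-saturated hydraulic conductivity Kfs (cm/hr)
    from a falling-head (bottomless bucket) infiltration test.

    d0_cm : initial ponded water depth (cm)
    t_hr  : time for the ponded water to fully infiltrate (hr)
    lg_cm : assumed macroscopic capillary length (cm); illustrative value
    """
    return (lg_cm / t_hr) * math.log((d0_cm + lg_cm) / lg_cm)

# Example: 10 cm of ponded water infiltrating over 24 hr, the slow,
# bedrock-like regime that motivates covering the bucket against evaporation.
k = kfs_falling_head(10.0, 24.0)
```

As expected from the formula, the same ponded depth draining in less time yields a proportionally larger Kfs estimate.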

  15. A catalog of automated analysis methods for enterprise models.

    PubMed

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of the business and information-technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models, this process can be complicated; omissions or miscalculations are very likely. This situation has fostered research into automated analysis methods for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels; thus, some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.
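An automated analysis method of the kind catalogued above can be as simple as a reusable metric over a model's dependency graph. The sketch below is a hypothetical illustration (the element names and the "fan-in" metric are invented, not taken from the paper's catalog):

```python
# One automated analysis method over an enterprise model represented as a
# dependency graph: fan-in, i.e. how many elements depend on each element.
from collections import defaultdict

def fan_in(dependencies):
    """Count dependents per element.

    dependencies : iterable of (source, target) pairs, meaning
                   `source` depends on `target`.
    """
    counts = defaultdict(int)
    for _, target in dependencies:
        counts[target] += 1
    return dict(counts)

# A toy enterprise model: applications and the services they depend on.
model = [
    ("BillingApp", "CustomerDB"),
    ("CRMApp", "CustomerDB"),
    ("BillingApp", "PaymentGateway"),
]
counts = fan_in(model)
critical = max(counts, key=counts.get)  # element with the most dependents
```

Running such a metric mechanically, rather than by visual inspection, is precisely what guards against the omissions and miscalculations the abstract mentions.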

  16. Practical notes on local data-worth analysis

    NASA Astrophysics Data System (ADS)

    Finsterle, Stefan

    2015-12-01

    These notes discuss the usefulness, limitations, and potential pitfalls of using sensitivity indices as a means to evaluate data worth and to guide the formulation and solution of inverse problems. A sensitivity analysis examines changes in model output variables with respect to changes in model input parameters. It appears straightforward to use this information to select influential parameters that should be subjected to estimation by inverse modeling and to identify the observations that contain information about these parameters and thus may be useful as calibration points. However, the results of such a sensitivity analysis do not account for parameter correlations and redundancies in observations and may not properly distinguish between calibration and prediction targets if used as criteria that guide inverse modeling; they may thus yield misleading recommendations about parameter identifiability and data worth. These issues (and some remedies) are discussed using an illustrative example, in which we examine the value of data sets potentially used for the calibration of a geothermal reservoir model. These notes highlight the importance of carefully formulating the objectives of a simulation study, which controls the setup of the inverse problem and related data needs.
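The pitfall described above can be reproduced numerically with scaled sensitivity coefficients. The sketch below uses a toy two-parameter model, not Finsterle's geothermal reservoir; the finite-difference scheme and scaling are standard choices, assumed here for illustration.

```python
import numpy as np

def scaled_sensitivities(model, params, deltas):
    """Forward-difference scaled sensitivity matrix
    S[i, j] ~ (d y_i / d p_j) * p_j."""
    y0 = np.asarray(model(params))
    S = np.empty((y0.size, len(params)))
    for j, (p, dp) in enumerate(zip(params, deltas)):
        perturbed = list(params)
        perturbed[j] = p + dp
        S[:, j] = (np.asarray(model(perturbed)) - y0) / dp * p
    return S

# Toy model with two outputs: y = [p0 * p1, p0 + p1]
model = lambda p: [p[0] * p[1], p[0] + p[1]]
S = scaled_sensitivities(model, [2.0, 3.0], [1e-6, 1e-6])
```

Both parameters look highly influential on the first output, yet its two sensitivity columns are identical: that observation alone constrains only the product p0·p1 and cannot separate the parameters, which is exactly the correlation effect a raw sensitivity ranking hides.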

  17. Practical Application of Parallel Coordinates for Climate Model Analysis

    SciTech Connect

    Steed, Chad A; Shipman, Galen M; Thornton, Peter E; Ricciuto, Daniel M; Erickson III, David J; Branstetter, Marcia L

    2012-01-01

    The determination of relationships between climate variables and the identification of the most significant associations between them in various geographic regions is an important aspect of climate model evaluation. The EDEN visual analytics toolkit has been developed to aid such analysis by facilitating the assessment of multiple variables with respect to the amount of variability that can be attributed to specific other variables. EDEN harnesses the parallel coordinates visualization technique and is augmented with graphical indicators of key descriptive statistics. A case study is presented that focuses on the Harvard Forest site (42.5378° N, 72.1715° W) as simulated by the Community Land Model Version 4 (CLM4). It is shown that model variables such as land water runoff are more sensitive to a particular set of environmental variables than a suite of other inputs in the 88-variable analysis conducted. The approach presented here allows climate-domain scientists to focus on the most important variables in the model evaluations.
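A parallel-coordinates display places every variable on its own vertical axis, which requires rescaling each variable to a common range before drawing. The sketch below shows only that per-axis normalization step; the variable values are invented and this is not EDEN's implementation.

```python
import numpy as np

def normalize_axes(data):
    """Rescale each column (variable) of `data` to [0, 1] so that all
    variables share a common vertical axis range, as a parallel-coordinates
    display requires. `data` has shape (n_samples, n_variables)."""
    data = np.asarray(data, dtype=float)
    lo = data.min(axis=0)
    span = data.max(axis=0) - lo
    span = np.where(span == 0, 1.0, span)  # constant columns map to 0
    return (data - lo) / span

# Three samples of two climate-like variables on very different scales
# (e.g. temperature in K next to a flux in small units).
obs = [[280.0, 0.001], [290.0, 0.004], [300.0, 0.002]]
scaled = normalize_axes(obs)
```

After normalization each polyline can be drawn across the axes directly, with the min and max of every variable anchored to the bottom and top of its axis.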

  18. Musical Practices and Methods in Music Lessons: A Comparative Study of Estonian and Finnish General Music Education

    ERIC Educational Resources Information Center

    Sepp, Anu; Ruokonen, Inkeri; Ruismäki, Heikki

    2015-01-01

    This article reveals the results of a comparative study of Estonian and Finnish general music education. The aim was to find out what music teaching practices and approaches/methods were mostly used, what music education perspectives supported those practices. The data were collected using questionnaires and the results of 107 Estonian and 50…

  19. Image, measure, figure: a critical discourse analysis of nursing practices that develop children.

    PubMed

    Einboden, Rochelle; Rudge, Trudy; Varcoe, Colleen

    2013-07-01

    Motivated by discourses that link early child development and health, nurses engage in seemingly benign surveillance of children. These practices are based on knowledge claims and technologies of developmental science, which remain anchored in assumptions of the child body as an incomplete form with a universal developmental trajectory and inherent potentiality. This paper engages in a critical discursive analysis, drawing on Donna Haraway's conceptualizations of technoscience and figuration. Using a contemporary developmental screening tool from nursing practice, this analysis traces the effects of this tool through production, transformation, distribution, and consumption. It reveals how the techniques of imaging, abstraction, and measurement collide to fix the open, transformative child body in a figuration of the developing child. This analysis also demonstrates how technobiopower infuses nurses' understandings of children and structures developmentally appropriate expectations for children, parents, and nurses. Furthermore, it describes how practices that claim to facilitate healthy child development may inversely deprive children of agency and foster the production of normal or ideal children. An alternative ontological perspective is offered as a challenge to the individualism of developmental models and other dominant ideologies of development, as well as practices associated with these ideologies. In summary, this analysis argues that nurses must pay closer attention to how technobiopower infuses practices that monitor and promote child development. Fostering a critical understanding of the harmful implications of these practices is warranted and offers the space to conceive of human development in alternate and exciting ways. PMID:23745662
