Science.gov

Sample records for practical analysis method

  1. A Practical Method of Policy Analysis by Simulating Policy Options

    ERIC Educational Resources Information Center

    Phelps, James L.

    2011-01-01

    This article focuses on a method of policy analysis that has evolved from the previous articles in this issue. The first section, "Toward a Theory of Educational Production," identifies concepts from science and achievement production to be incorporated into this policy analysis method. Building on Kuhn's (1970) discussion regarding paradigms, the…

  2. Practical Use of Computationally Frugal Model Analysis Methods

    DOE PAGES Beta

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.
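The "frugal" local methods this abstract refers to typically cost about one model run per parameter. Below is a minimal sketch of one such diagnostic, a one-sided finite-difference scaled local sensitivity, applied to a toy exponential-decay model; both the toy model and the scaling choice are illustrative assumptions, not the authors' code.

```python
import numpy as np

# Toy "process model": y = sum over observation times of a * exp(-b * t)
t = np.linspace(0.0, 10.0, 20)

def model(theta):
    a, b = theta
    return float(np.sum(a * np.exp(-b * t)))

def scaled_local_sensitivities(model, params, rel_step=1e-6):
    """One-sided finite-difference local sensitivities, scaled by the
    parameter value -- a 'frugal' analysis needing len(params)+1 runs."""
    base = model(params)
    sens = np.empty(len(params))
    for i, p in enumerate(params):
        step = rel_step * abs(p) if p != 0 else rel_step
        perturbed = params.copy()
        perturbed[i] = p + step
        sens[i] = (model(perturbed) - base) / step * p  # p * dy/dp
    return sens

s = scaled_local_sensitivities(model, np.array([2.0, 0.3]))
```

The scaled form makes sensitivities comparable across parameters with different units, which is one of the inexpensive diagnostics this kind of analysis relies on.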

  3. Practical Use of Computationally Frugal Model Analysis Methods

    SciTech Connect

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.

  4. Practical Use of Computationally Frugal Model Analysis Methods.

    PubMed

    Hill, Mary C; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2016-03-01

    Three challenges compromise the utility of mathematical models of groundwater and other environmental systems: (1) a dizzying array of model analysis methods and metrics make it difficult to compare evaluations of model adequacy, sensitivity, and uncertainty; (2) the high computational demands of many popular model analysis methods (requiring 1000s, 10,000s, or more model runs) make them difficult to apply to complex models; and (3) many models are plagued by unrealistic nonlinearities arising from the numerical model formulation and implementation. This study proposes a strategy to address these challenges through a careful combination of model analysis and implementation methods. In this strategy, computationally frugal model analysis methods (often requiring a few dozen parallelizable model runs) play a major role, and computationally demanding methods are used for problems where (relatively) inexpensive diagnostics suggest the frugal methods are unreliable. We also argue in favor of detecting and, where possible, eliminating unrealistic model nonlinearities; this increases the realism of the model itself and facilitates the application of frugal methods. Literature examples are used to demonstrate the use of frugal methods and associated diagnostics. We suggest that the strategy proposed in this paper would allow the environmental sciences community to achieve greater transparency and falsifiability of environmental models, and obtain greater scientific insight from ongoing and future modeling efforts. PMID:25810333

  5. A Practical Method of Policy Analysis by Estimating Effect Size

    ERIC Educational Resources Information Center

    Phelps, James L.

    2011-01-01

    The previous articles on class size and other productivity research paint a complex and confusing picture of the relationship between policy variables and student achievement. Missing is a conceptual scheme capable of combining the seemingly unrelated research and dissimilar estimates of effect size into a unified structure for policy analysis and…

  6. A Topography Analysis Incorporated Optimization Method for the Selection and Placement of Best Management Practices

    PubMed Central

    Shen, Zhenyao; Chen, Lei; Xu, Liang

    2013-01-01

    Best Management Practices (BMPs) are one of the most effective methods to control nonpoint source (NPS) pollution at a watershed scale. In this paper, the use of a topography analysis incorporated optimization method (TAIOM) was proposed, which integrates topography analysis with cost-effective optimization. The surface status, slope and the type of land use were evaluated as inputs for the optimization engine. A genetic algorithm program was coded to obtain the final optimization. The TAIOM was validated in conjunction with the Soil and Water Assessment Tool (SWAT) in the Yulin watershed in Southwestern China. The results showed that the TAIOM was more cost-effective than traditional optimization methods. The distribution of selected BMPs throughout landscapes comprising relatively flat plains and gentle slopes suggests the need for a more operationally effective scheme, such as the TAIOM, to determine the practicability of BMPs before widespread adoption. The TAIOM developed in this study can easily be extended to other watersheds to help decision makers control NPS pollution. PMID:23349917
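The genetic-algorithm step this abstract mentions can be sketched minimally as below. The BMP options, costs, load reductions and the pollutant-reduction target are all invented for illustration; the actual TAIOM couples the search to SWAT simulations and topographic inputs rather than a fixed lookup table.

```python
import random

random.seed(42)

# Hypothetical data: candidate BMPs available for each of 8 fields,
# given as (cost, pollutant-load reduction); option 0 means "no BMP".
BMPS = [(0, 0.0), (10, 0.35), (25, 0.6), (40, 0.8)]
N_FIELDS, TARGET = 8, 3.0  # required total load reduction (invented)

def fitness(chrom):
    """Cost to minimize, with a large penalty if the reduction target
    is not met (a standard way to handle constraints in a GA)."""
    cost = sum(BMPS[g][0] for g in chrom)
    reduction = sum(BMPS[g][1] for g in chrom)
    return cost + 1000 * max(0.0, TARGET - reduction)

def evolve(pop_size=40, gens=200):
    pop = [[random.randrange(len(BMPS)) for _ in range(N_FIELDS)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]          # keep the cheaper half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, N_FIELDS)     # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:               # mutation
                child[random.randrange(N_FIELDS)] = random.randrange(len(BMPS))
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = evolve()  # one BMP choice per field
```

Each chromosome assigns one BMP per field; the GA searches for the cheapest assignment that still meets the watershed-level reduction target.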

  7. Practical method for radioactivity distribution analysis in small-animal PET cancer studies

    PubMed Central

    Slavine, Nikolai V.; Antich, Peter P.

    2008-01-01

    We present a practical method for radioactivity distribution analysis in small-animal tumors and organs using positron emission tomography imaging with a calibrated source of known activity and size in the field of view. We reconstruct the imaged mouse together with the source under the same conditions, using an iterative method, Maximum Likelihood Expectation-Maximization with System Modeling, capable of delivering high-resolution images. Corrections for the ratios of geometrical efficiencies, radioisotope decay in time and photon attenuation are included in the algorithm. We demonstrate reconstruction results for the amount of radioactivity within the scanned mouse in a sample study of osteolytic and osteoblastic bone metastasis from prostate cancer xenografts. Data acquisition was performed on a small-animal PET system which was tested with different radioactive sources, phantoms and animals to achieve high sensitivity and spatial resolution. Our method uses high-resolution images to determine the volume of an organ or tumor and the amount of its radioactivity; it can save time and effort and reduce the need to sacrifice animals. The method has utility for prognosis and quantitative analysis in small-animal cancer studies, and will enhance the assessment of characteristics of tumor growth, identifying metastases, and potentially determining the effectiveness of cancer treatment. This technique could also be useful for organ radioactivity dosimetry studies. PMID:18667322
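The Maximum Likelihood Expectation-Maximization reconstruction named in this abstract follows a standard multiplicative update. Below is a minimal sketch on a tiny synthetic system; the system matrix and counts are invented, and the authors' implementation additionally includes system modeling and the decay and attenuation corrections described above.

```python
import numpy as np

def mlem(A, y, n_iter=200):
    """Basic MLEM for emission tomography.
    A: (n_detector_bins, n_voxels) system matrix; y: measured counts."""
    x = np.ones(A.shape[1])              # flat initial activity estimate
    sens = A.sum(axis=0)                 # per-voxel sensitivity
    for _ in range(n_iter):
        proj = A @ x                     # forward-project current estimate
        ratio = y / np.maximum(proj, 1e-12)
        # Multiplicative update: preserves non-negativity of activity
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x

# Tiny synthetic check: 2 voxels observed through 3 detector bins
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])
x_true = np.array([4.0, 2.0])
y = A @ x_true                           # noiseless synthetic data
x_est = mlem(A, y)
```

On noiseless, well-posed data like this the iterates converge to the true activities; the calibrated in-field source in the paper's method is what turns such relative reconstructions into absolute activity values.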

  8. Methods and practices used in incident analysis in the Finnish nuclear power industry.

    PubMed

    Suksi, Seija

    2004-07-26

    A study of the incident analysis methods and practices of the Finnish nuclear power plant operators Teollisuuden Voima Oy (TVO) and Fortum Power and Heat Oy (Fortum) was carried out by the Technical Research Centre of Finland (VTT) at the request of STUK at the end of the 1990s. The study aimed at providing a broad overview and suggestions for improvement of the whole organisational framework supporting event investigation practices at the regulatory body and at the utilities. The main objective of the research was to evaluate the adequacy and reliability of event investigation analysis methods and practices in the Finnish nuclear power industry and, based on the results, to further develop them. The results and suggestions of the research are reviewed in the paper, and the corrective actions implemented in event investigation and operating experience procedures both at STUK and at the utilities are discussed as well. STUK has developed its own procedure for the risk-informed analysis of nuclear power plant events. The PSA-based event analysis method is used to assess the safety significance and importance measures associated with the unavailability of components and systems subject to Technical Specifications. The insights from recently performed PSA-based analyses are also briefly discussed in the paper. PMID:15231350

  9. Practical Methods for the Analysis of Voltage Collapse in Electric Power Systems: a Stationary Bifurcations Viewpoint.

    NASA Astrophysics Data System (ADS)

    Jean-Jumeau, Rene

    1993-03-01

    Voltage collapse (VC) is generally caused by either of two types of system disturbances: load variations and contingencies. In this thesis, we study VC resulting from load variations; this is termed static voltage collapse. The thesis approaches this type of voltage collapse in electrical power systems from a stationary bifurcations viewpoint, associating it with the occurrence of saddle-node bifurcations (SNB) in the system. Approximate models are generically used in most VC analyses. We consider the validity of these models for the study of SNB and, thus, of voltage collapse. We justify the use of the saddle-node bifurcation as a model for VC in power systems. In particular, we prove that this leads to the definition of a model and--since load demand is used as a parameter for that model--of a mode of parameterization of that model in order to represent actual power demand variations within the power system network. Ill-conditioning of the set of nonlinear equations defining a dynamical system is a generic occurrence near the SNB point. We suggest a reparameterization of the set of nonlinear equations which allows us to avoid this problem. A new indicator of the proximity of voltage collapse, the voltage collapse index (VCI), is developed. A new (n + 1)-dimensional set of characteristic equations for the computation of the exact SNB point, replacing the standard (2n + 1)-dimensional one, is presented for general parameter-dependent nonlinear dynamical systems. These results are then applied to electric power systems for the analysis and prediction of voltage collapse. The new methods offer the potential of faster computation and greater flexibility. For reasons of theoretical development and clarity, the preceding methodologies are developed under the assumption of the absence of constraints on the system parameters and states, and the full differentiability of the functions defining the power system model. In the latter part of this thesis, we relax these
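The standard (2n + 1)-dimensional characteristic system that the abstract says the thesis's (n + 1)-dimensional formulation replaces can be written out explicitly. This is the textbook saddle-node characterization, not the thesis's own notation:

```latex
\begin{aligned}
f(x, \lambda) &= 0, \\
D_x f(x, \lambda)\, v &= 0, \\
v^{\top} v &= 1,
\end{aligned}
```

where $f \colon \mathbb{R}^n \times \mathbb{R} \to \mathbb{R}^n$ defines the power-flow equilibrium, $\lambda$ is the load-demand parameter, and $v$ spans the null space of the state Jacobian $D_x f$, which becomes singular at the SNB point. The unknowns $(x, v, \lambda)$ give $2n + 1$ equations in $2n + 1$ variables, which is why reducing the system to dimension $n + 1$ promises faster computation.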

  10. Analysis of the upper massif of the craniofacial with the radial method - practical use

    PubMed Central

    Lepich, Tomasz; Dąbek, Józefa; Stompel, Daniel; Gielecki, Jerzy S.

    2011-01-01

    Introduction The analysis of the upper massif of the craniofacial (UMC) is widely used in many fields of science. The aim of the study was to create a high-resolution computer system, based on a digital information record and on vector graphics, that could enable dimension measuring and evaluation of craniofacial shape using the radial method. Material and methods The study was carried out on 184 skulls, in a good state of preservation, from the early Middle Ages. The examined skulls were fixed into Molisson's craniostat in the author's own modification. They were directed in space towards the Frankfurt plane and photographed in frontal norm with a digital camera. The parameters describing the planar and dimensional structure of the UMC and orbits were obtained through computer analysis of the function recordings picturing the craniofacial structures, using software combining raster graphics with vector graphics. Results Mean values of both orbits were compared separately for the male and female groups. In female skulls, the comparison of the left and right sides did not show statistically significant differences. In the male group, higher values were observed for the right side; only the circularity index presented higher values for the left side. Conclusions Computer graphics, with the software used for analysing digital pictures of the UMC and orbits, increase the precision of measurements as well as the calculation possibilities. Recognition of the face in the post-mortem examination is crucial for those working on identification in anthropology and criminology laboratories. PMID:22291834

  11. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory--Determination of Trihalomethane Formation Potential, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Fram, Miranda S.; Bush, Noel

    2004-01-01

    An analytical method for the determination of the trihalomethane formation potential of water samples has been developed. The trihalomethane formation potential is measured by dosing samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine, and then analyzing the resulting trihalomethanes by purge and trap/gas chromatography equipped with an electron capture detector. Detailed explanations of the method and quality-control practices are provided. Method validation experiments showed that the trihalomethane formation potential varies as a function of time between sample collection and analysis, residual-free chlorine concentration, method of sample dilution, and the concentration of bromide in the sample.

  12. Animal Disease Import Risk Analysis--a Review of Current Methods and Practice.

    PubMed

    Peeler, E J; Reese, R A; Thrush, M A

    2015-10-01

    The application of risk analysis to the spread of disease with international trade in animals and their products, that is, import risk analysis (IRA), has been largely driven by the Sanitary and Phytosanitary (SPS) agreement of the World Trade Organization (WTO). The degree to which the IRA standard established by the World Organization for Animal Health (OIE), and associated guidance, meets the needs of the SPS agreement is discussed. The use of scenario trees is the core modelling approach used to represent the steps necessary for the hazard to occur. There is scope to elaborate scenario trees for commodity IRA so that the quantity of hazard at each step is assessed, which is crucial to the likelihood of establishment. The dependence between exposure and establishment suggests that they should fall within the same subcomponent. IRA undertaken for trade reasons must include an assessment of consequences to meet SPS criteria, but guidance is sparse. The integration of epidemiological and economic modelling may open a path for better methods. Matrices have been used in qualitative IRA to combine estimates of entry and exposure, and consequences with likelihood, but this approach has flaws and better methods are needed. OIE IRA standards and guidance indicate that the volume of trade should be taken into account, but offer no detail. Some published qualitative IRAs have assumed current levels and patterns of trade without specifying the volume of trade, which constrains the use of IRA to determine mitigation measures (to reduce risk to an acceptable level) and whether the principle of equivalence, fundamental to the SPS agreement, has been observed. It is questionable whether qualitative IRA can meet all the criteria set out in the SPS agreement. Nevertheless, scope exists to elaborate the current standards and guidance, so they better serve the principle of science-based decision-making. PMID:24237667

  13. Evaluating the clinical appropriateness of nurses' prescribing practice: method development and findings from an expert panel analysis

    PubMed Central

    Latter, Sue; Maben, Jill; Myall, Michelle; Young, Amanda

    2007-01-01

    Background The number of nurses independently prescribing medicines in England is rising steadily. There had been no attempt systematically to evaluate the clinical appropriateness of nurses' prescribing decisions. Aims (i) To establish a method of assessing the clinical appropriateness of nurses' prescribing decisions; (ii) to evaluate the prescribing decisions of a sample of nurses, using this method. Method A modified version of the Medication Appropriateness Index (MAI) was developed, piloted and subsequently used by seven medical prescribing experts to rate transcripts of 12 nurse prescriber consultations selected from a larger database of 118 audio‐recorded consultations collected as part of a national evaluation. Experts were also able to give written qualitative comments on each of the MAI dimensions applied to each of the consultations. Analysis Experts' ratings were analysed using descriptive statistics. Qualitative comments were subjected to a process of content analysis to identify themes within and across both MAI items and consultations. Results Experts' application of the modified MAI to transcripts of nurse prescriber consultations demonstrated validity and feasibility as a method of assessing the clinical appropriateness of nurses' prescribing decisions. In the majority of assessments made by the expert panel, nurses' prescribing decisions were rated as clinically appropriate on all nine items in the MAI. Conclusion A valid and feasible method of assessing the clinical appropriateness of nurses' prescribing practice has been developed using a modified MAI and transcripts of audio‐recorded consultations sent to a panel of prescribing experts. Prescribing nurses in this study were generally considered to be making clinically appropriate prescribing decisions. This approach to measuring prescribing appropriateness could be used as part of quality assurance in routine practice, as a method of identifying continuing professional development needs

  14. APA's Learning Objectives for Research Methods and Statistics in Practice: A Multimethod Analysis

    ERIC Educational Resources Information Center

    Tomcho, Thomas J.; Rice, Diana; Foels, Rob; Folmsbee, Leah; Vladescu, Jason; Lissman, Rachel; Matulewicz, Ryan; Bopp, Kara

    2009-01-01

    Research methods and statistics courses constitute a core undergraduate psychology requirement. We analyzed course syllabi and faculty self-reported coverage of both research methods and statistics course learning objectives to assess the concordance with APA's learning objectives (American Psychological Association, 2007). We obtained a sample of…

  15. Practical Thermal Evaluation Methods for HAC Fire Analysis in Type B Radioactive Material (RAM) Packages

    SciTech Connect

    Abramczyk, Glenn; Hensel, Stephen J; Gupta, Narendra K.

    2013-03-28

    Title 10 of the United States Code of Federal Regulations Part 71 for the Nuclear Regulatory Commission (10 CFR Part 71.73) requires that Type B radioactive material (RAM) packages satisfy certain Hypothetical Accident Conditions (HAC) thermal design requirements to ensure package safety during accidental fire conditions. Compliance with thermal design requirements can be met by prototype tests, analyses only, or a combination of tests and analyses. Normally, it is impractical to meet all the HAC using tests only, and analytical methods alone are too complex due to the multi-physics, nonlinear nature of the fire event. Therefore, a combination of tests and thermal analysis methods using commercial heat transfer software is used to meet the necessary design requirements. The authors, along with their colleagues at Savannah River National Laboratory in Aiken, SC, USA, have successfully used this 'tests and analyses' approach in the design and certification of several United States DOE/NNSA certified packages, e.g. 9975, 9977, 9978, 9979, H1700, and the Bulk Tritium Shipping Package (BTSP). This paper describes these methods; it is hoped that RAM Type B package designers and analysts can use them for their applications.

  16. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: practical guidelines for the crystalline sponge method

    SciTech Connect

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng

    2015-01-01

    This report describes complete practical guidelines and insights for the crystalline sponge method, which have been derived through the first use of synchrotron radiation on these systems, and includes a procedure for faster synthesis of the sponges. These guidelines will be applicable to crystal sponge data collected at synchrotrons or in-house facilities, and will allow researchers to obtain reliable high-quality data and construct chemically and physically sensible models for guest structural determination. A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. 
In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine

  17. A Critical Analysis of SocINDEX and Sociological Abstracts Using an Evaluation Method for the Practicing Bibliographer

    ERIC Educational Resources Information Center

    Mellone, James T.

    2010-01-01

    This study provides a database evaluation method for the practicing bibliographer that is more than a brief review yet less than a controlled experiment. The author establishes evaluation criteria in the context of the bibliographic instruction provided to meet the research requirements of undergraduate sociology majors at Queens College, City…

  18. Insight into Evaluation Practice: A Content Analysis of Designs and Methods Used in Evaluation Studies Published in North American Evaluation-Focused Journals

    ERIC Educational Resources Information Center

    Christie, Christina A.; Fleischer, Dreolin Nesbitt

    2010-01-01

    To describe the recent practice of evaluation, specifically method and design choices, the authors performed a content analysis on 117 evaluation studies published in eight North American evaluation-focused journals for a 3-year period (2004-2006). The authors chose this time span because it follows the scientifically based research (SBR)…

  19. Qualitative Approaches to Mixed Methods Practice

    ERIC Educational Resources Information Center

    Hesse-Biber, Sharlene

    2010-01-01

    This article discusses how methodological practices can shape and limit how mixed methods is practiced and makes visible the current methodological assumptions embedded in mixed methods practice that can shut down a range of social inquiry. The article argues that there is a "methodological orthodoxy" in how mixed methods is practiced that…

  20. Future methods in pharmacy practice research.

    PubMed

    Almarsdottir, A B; Babar, Z U D

    2016-06-01

    This article describes the current and anticipated future scenario of pharmacy practice underpinning and guiding this research, and then suggests future directions and strategies for such research. First, it sets the scene by discussing the key drivers which could influence change in pharmacy practice research: demographics, technology and professional standards. Second, deriving from this, it seeks to predict and forecast future shifts in the use of methodologies. Third, new research areas and the availability of data impacting on future methods are discussed. These include the impact of aging information technology users on healthcare, understanding and responding to cultural and social disparities, implementing multidisciplinary initiatives to improve health care, medicines optimization and predictive risk analysis, and pharmacy as a business and health care institution. Finally, implications of these trends for pharmacy practice research methods are discussed. PMID:27209486

  1. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory-- Determination of Dissolved Organic Carbon in Water by High Temperature Catalytic Oxidation, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Bird, Susan M.; Fram, Miranda S.; Crepeau, Kathryn L.

    2003-01-01

    An analytical method has been developed for the determination of dissolved organic carbon concentration in water samples. This method includes the results of the tests used to validate the method and the quality-control practices used for dissolved organic carbon analysis. Prior to analysis, water samples are filtered to remove suspended particulate matter. A Shimadzu TOC-5000A Total Organic Carbon Analyzer in the nonpurgeable organic carbon mode is used to analyze the samples by high temperature catalytic oxidation. The analysis usually is completed within 48 hours of sample collection. The laboratory reporting level is 0.22 milligrams per liter.

  2. Systematic Review and Meta-Analysis of Studies Evaluating Diagnostic Test Accuracy: A Practical Review for Clinical Researchers-Part II. Statistical Methods of Meta-Analysis.

    PubMed

    Lee, Juneyoung; Kim, Kyung Won; Choi, Sang Hyun; Huh, Jimi; Park, Seong Ho

    2015-01-01

    Meta-analysis of diagnostic test accuracy studies differs from the usual meta-analysis of therapeutic/interventional studies in that it requires simultaneously analyzing a pair of outcome measures, such as sensitivity and specificity, instead of a single outcome. Since sensitivity and specificity are generally inversely correlated and can be affected by a threshold effect, more sophisticated statistical methods are required for the meta-analysis of diagnostic test accuracy. Hierarchical models, including the bivariate model and the hierarchical summary receiver operating characteristic model, are increasingly being accepted as standard methods for meta-analysis of diagnostic test accuracy studies. We provide a conceptual review of statistical methods currently used and recommended for meta-analysis of diagnostic test accuracy studies. This article could serve as a methodological reference for those who perform systematic review and meta-analysis of diagnostic test accuracy studies. PMID:26576107
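The hierarchical models this abstract recommends require specialized software, but the reason pooling is done on the logit scale can be shown with a much simpler stand-in: inverse-variance fixed-effect pooling of logit-transformed proportions. The 2x2 counts below are invented, and this sketch deliberately ignores the between-study correlation that the bivariate model exists to capture.

```python
import math

# Hypothetical per-study 2x2 counts: (TP, FN, FP, TN)
studies = [(45, 5, 10, 90), (38, 12, 8, 92), (50, 10, 15, 85)]

def pooled_logit(pairs):
    """Inverse-variance fixed-effect pooling on the logit scale --
    a simplified stand-in for the full bivariate random-effects model.
    pairs: (events, non-events) per study."""
    num = den = 0.0
    for e, ne in pairs:
        e, ne = e + 0.5, ne + 0.5          # continuity correction
        logit = math.log(e / ne)
        var = 1.0 / e + 1.0 / ne           # variance of the logit
        num += logit / var                 # weight = 1 / variance
        den += 1.0 / var
    pooled = num / den
    return 1.0 / (1.0 + math.exp(-pooled))  # back-transform to a proportion

sens = pooled_logit([(tp, fn) for tp, fn, fp, tn in studies])
spec = pooled_logit([(tn, fp) for tp, fn, fp, tn in studies])
```

Working on the logit scale keeps the pooled estimate inside (0, 1) and makes the normal approximation more defensible; the bivariate model adds random effects and the sensitivity-specificity correlation on top of this same transformation.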

  3. Practical method for analysis and design of slender reinforced concrete columns subjected to biaxial bending and axial loads

    NASA Astrophysics Data System (ADS)

    Bouzid, T.; Demagh, K.

    2011-03-01

    Reinforced and concrete-encased composite columns of arbitrarily shaped cross sections subjected to biaxial bending and axial loads are commonly used in many structures. For this purpose, an iterative numerical procedure for the strength analysis and design of short and slender reinforced concrete columns with a square cross section under biaxial bending and an axial load, using an EC2 stress-strain model, is presented in this paper. The computational procedure takes into account the nonlinear behavior of the materials (i.e., concrete and reinforcing bars) and includes the second-order effects due to the additional eccentricity of the applied axial load through the Moment Magnification Method. The ability of the proposed method and its formulation has been tested by comparing its results with experimental ones reported by several authors. This comparison shows that a good degree of agreement and accuracy between the experimental and theoretical results was obtained; an average ratio (proposed to test) of 1.06 with a deviation of 9% is achieved.
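The Moment Magnification Method referenced above approximates second-order effects by amplifying the first-order moment. As a generic illustration (shown here in ACI-style notation, which may differ in detail from the paper's EC2-based formulation):

```latex
M_c = \delta\, M_2,
\qquad
\delta = \frac{C_m}{1 - \dfrac{P_u}{0.75\,P_c}} \;\ge\; 1,
\qquad
P_c = \frac{\pi^2 E I}{(k\, l_u)^2},
```

where $M_2$ is the larger first-order end moment, $P_u$ the factored axial load, $P_c$ the Euler critical buckling load for effective length $k\,l_u$, and $C_m$ an equivalent-moment factor. The magnifier $\delta$ grows rapidly as $P_u$ approaches the buckling load, which is how slenderness enters an otherwise first-order section analysis.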

  4. Practical state of health estimation of power batteries based on Delphi method and grey relational grade analysis

    NASA Astrophysics Data System (ADS)

    Sun, Bingxiang; Jiang, Jiuchun; Zheng, Fangdan; Zhao, Wei; Liaw, Bor Yann; Ruan, Haijun; Han, Zhiqiang; Zhang, Weige

    2015-05-01

    The state of health (SOH) estimation is very critical to the battery management system to ensure the safety and reliability of EV battery operation. Here, we used a unique hybrid approach to enable complex SOH estimations. The approach hybridizes the Delphi method, known for its simplicity and effectiveness in applying weighting factors for complicated decision-making, and the grey relational grade analysis (GRGA) for multi-factor optimization. Six critical factors were used in the consideration for SOH estimation: peak power at 30% state-of-charge (SOC); capacity; the voltage drop at 30% SOC with a C/3 pulse; the temperature rises at the end of discharge and charge at 1C, respectively; and the open-circuit voltage at the end of charge after a 1-h rest. The weighting of these factors for SOH estimation was scored by the 'experts' in the Delphi method, indicating the influencing power of each factor on SOH. The parameters for these factors expressing the battery state variations are optimized by GRGA. Eight battery cells were used to illustrate the principle and methodology of estimating the SOH by this hybrid approach, and the results were compared with those based on capacity and power capability. The contrast among different SOH estimations is discussed.
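The grey relational grade computation at the core of GRGA can be sketched as follows. The factor values, reference series and Delphi-style weights below are invented for illustration; the paper uses six factors and weights elicited from its expert panel.

```python
import numpy as np

def grey_relational_grade(X, ref, weights, rho=0.5):
    """Grey relational grade of each row of X against a reference series.
    X: (n_cells, n_factors) factor values normalized to [0, 1];
    ref: reference (e.g. fresh-cell) series; weights: factor weights
    summing to 1; rho: distinguishing coefficient (commonly 0.5)."""
    delta = np.abs(X - ref)                          # deviation sequences
    dmin, dmax = delta.min(), delta.max()
    xi = (dmin + rho * dmax) / (delta + rho * dmax)  # relational coefficients
    return xi @ weights                              # weighted grade per cell

# Hypothetical: 3 cells x 4 factors (peak power, capacity, voltage drop, OCV),
# normalized so 1.0 corresponds to the fresh-cell value.
X = np.array([[0.95, 0.93, 0.90, 0.97],
              [0.80, 0.78, 0.75, 0.85],
              [0.60, 0.55, 0.58, 0.70]])
ref = np.ones(4)                      # fresh cell as the reference series
w = np.array([0.3, 0.3, 0.2, 0.2])    # Delphi-style weights (assumed)
grades = grey_relational_grade(X, ref, w)
```

Cells whose factor series stay close to the fresh-cell reference receive grades near 1; a declining grade across the three cells then serves as the multi-factor SOH indicator.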

  5. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: Practical guidelines for the crystalline sponge method

    DOE PAGESBeta

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation is reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process was developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high

  6. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: Practical guidelines for the crystalline sponge method

    SciTech Connect

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation is reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process was developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of

  7. Methods of Genomic Competency Integration in Practice

    PubMed Central

    Jenkins, Jean; Calzone, Kathleen A.; Caskey, Sarah; Culp, Stacey; Weiner, Marsha; Badzek, Laurie

    2015-01-01

    Purpose: Genomics is increasingly relevant to health care, necessitating support for nurses to incorporate genomic competencies into practice. The primary aim of this project was to develop, implement, and evaluate a year-long genomic education intervention that trained, supported, and supervised institutional administrator and educator champion dyads to increase nursing capacity to integrate genomics, through assessments of program satisfaction and achieved institutional outcomes. Design: Longitudinal study of 23 Magnet Recognition Program® Hospitals (21 intervention, 2 controls) participating in a 1-year new competency integration effort aimed at increasing genomic nursing competency and overcoming barriers to genomics integration in practice. Methods: Champion dyads underwent genomic training consisting of one in-person kick-off training meeting followed by monthly education webinars. Champion dyads designed institution-specific action plans detailing objectives, methods or strategies used to engage and educate nursing staff, a timeline for implementation, and outcomes achieved. Action plans focused on a minimum of seven genomic priority areas: champion dyad personal development; practice assessment; policy content assessment; staff knowledge needs assessment; staff development; plans for integration; and anticipated obstacles and challenges. Action plans were updated quarterly, outlining progress made as well as inclusion of new methods or strategies. Progress was validated through virtual site visits with the champion dyads and chief nursing officers. Descriptive data were collected on all strategies or methods utilized, and the timeline for achievement. Descriptive data were analyzed using content analysis. Findings: The complexity of the competency content and the uniqueness of social systems and infrastructure resulted in significant variation of champion dyad interventions. Conclusions: Nursing champions can facilitate change in genomic nursing capacity through

  8. Doing Conversation Analysis: A Practical Guide.

    ERIC Educational Resources Information Center

    ten Have, Paul

    Noting that conversation analysis (CA) has developed into one of the major methods of analyzing speech in the disciplines of communications, linguistics, anthropology and sociology, this book demonstrates in a practical way how to become a conversation analyst. As well as providing an overall introduction to the approach, it focuses on the…

  9. Intermittent hypoxia training as non-pharmacologic therapy for cardiovascular diseases: Practical analysis on methods and equipment.

    PubMed

    Serebrovskaya, Tatiana V; Xi, Lei

    2016-09-01

    The global industrialization has brought profound lifestyle changes and environmental pollution, leading to higher risks of cardiovascular diseases. Such tremendous challenges outweigh the benefits of major advances in pharmacotherapies (such as statins, antihypertensive, and antithrombotic drugs) and exacerbate the public healthcare burden. One promising complementary non-pharmacologic therapy is so-called intermittent hypoxia training (IHT), which activates the human body's own natural defenses through adaptation to intermittent hypoxia. This review article primarily focuses on practical questions concerning the utilization of IHT as a non-pharmacologic therapy against cardiovascular diseases in humans. Evidence accumulated over the past five decades of research in healthy men and patients suggests that short-term daily sessions consisting of 3-4 bouts of 5-7 min exposures to 12-10% O2, alternating with normoxic intervals, for 2-3 weeks can produce remarkable beneficial effects in the treatment of cardiovascular diseases such as hypertension, coronary heart disease, and heart failure. Special attention is paid to the therapeutic effects of different IHT models, along with an introduction to the variety of specialized facilities and equipment available for IHT, including hypobaric chambers, hypoxic gas mixture delivery equipment (rooms, tents, face masks), and portable rebreathing devices. Further clinical trials and thorough evaluations of the risks versus benefits of IHT are much needed to develop a series of standardized and practical guidelines for IHT. Taken together, we can envisage a bright future for IHT playing a more significant role in preventive and complementary medicine against cardiovascular diseases. PMID:27407098

  10. Development and application to clinical practice of a validated HPLC method for the analysis of β-glucocerebrosidase in Gaucher disease.

    PubMed

    Colomer, E Gras; Gómez, M A Martínez; Alvarez, A González; Martí, M Climente; Moreno, P León; Zarzoso, M Fernández; Jiménez-Torres, N V

    2014-03-01

    The main objective of our study is to develop a simple, fast and reliable method for measuring β-glucocerebrosidase activity in the leukocytes of Gaucher patients in clinical practice. This measurement may be a useful marker to drive dose selection and early clinical decision-making in enzyme replacement therapy. We measured the enzyme activity by high-performance liquid chromatography with ultraviolet detection, using 4-nitrophenyl-β-d-glucopyranoside as substrate. A cohort of eight Gaucher patients treated with enzyme replacement therapy and ten healthy controls were tested; the median enzyme activity was 20.57 mU/ml (interquartile range 19.92-21.53 mU/ml) in patients and the mean was 24.73 mU/ml (24.12-25.34 mU/ml) in the reference group, which allowed the establishment of a normal range of β-glucocerebrosidase activity. The proposed method for measuring leukocyte glucocerebrosidase activity is fast, easy to use, inexpensive and reliable. Furthermore, significant differences between the two populations were observed (p=0.008). This suggests that discerning between patients and healthy individuals and providing an approach to enzyme dosage optimization is feasible. This method could be considered a decision-support tool for clinical monitoring. Our study is a first approach to an in-depth analysis of enzyme replacement therapy and optimization of dosing therapies. PMID:24447963

  11. Assessment of management in general practice: validation of a practice visit method.

    PubMed Central

    van den Hombergh, P; Grol, R; van den Hoogen, H J; van den Bosch, W J

    1998-01-01

    BACKGROUND: Practice management (PM) in general practice is as yet ill-defined; a systematic description of its domain, as well as a valid method to assess it, are necessary for research and assessment. AIM: To develop and validate a method to assess PM of general practitioners (GPs) and practices. METHOD: Relevant and potentially discriminating indicators were selected from a systematic framework of 2410 elements of PM to be used in an assessment method (VIP = visit instrument PM). The method was first tested in a pilot study and, after revision, was evaluated in order to select discriminating indicators and to determine validity of dimensions (factor and reliability analysis, linear regression). RESULTS: One hundred and ten GPs were assessed with the practice visit method using 249 indicators; 208 of these discriminated sufficiently at practice level or at GP level. Factor analysis resulted in 34 dimensions and in a taxonomy of PM. Dimensions and indicators showed marked variation between GPs and practices. Training practices scored higher on five dimensions; single-handed and dispensing practices scored lower on delegated tasks, but higher on accessibility and availability. CONCLUSION: A visit method to assess PM has been developed and its validity studied systematically. The taxonomy and dimensions of PM were in line with other classifications. Selection of a balanced number of useful and relevant indicators was nevertheless difficult. The dimensions could discriminate between groups of GPs and practices, establishing the value of the method for assessment. The VIP method could be an important contribution to the introduction of continuous quality improvement in the profession. PMID:10198481

  12. Statistical log analysis made practical

    SciTech Connect

    Mitchell, W.K.; Nelson, R.J.

    1991-06-01

    This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.

  13. Exergy analysis: Principles and practice

    NASA Astrophysics Data System (ADS)

    Moran, M. J.; Sciubba, E.

    1994-04-01

    The importance of the goal of developing systems that effectively use nonrenewable energy resources such as oil, natural gas, and coal is apparent. The method of exergy analysis is well suited for furthering this goal, for it enables the location, type and true magnitude of waste and loss to be determined. Such information can be used to design new systems and to reduce the inefficiency of existing systems. This paper provides a brief survey of both exergy principles and the current literature of exergy analysis with emphasis on areas of application.
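
    The quantities that exergy analysis locates can be stated compactly. These are the standard textbook definitions of flow exergy and exergy destruction relative to a dead state (T_0, p_0), not formulas quoted from this survey:

```latex
% Specific flow exergy relative to the dead state (T_0, p_0):
e_f = (h - h_0) - T_0\,(s - s_0) + \tfrac{V^2}{2} + g z
% Exergy destroyed in a process (Gouy-Stodola theorem),
% where \sigma_{\mathrm{gen}} is the entropy generated:
E_d = T_0\,\sigma_{\mathrm{gen}}
```

    The second relation is what makes the method diagnostic: wherever entropy is generated, E_d pinpoints the location and true magnitude of the loss.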

  14. A Method for Optimizing Waste Management and Disposal Practices Using a Group-Based Uncertainty Model for the Analysis of Characterization Data - 13191

    SciTech Connect

    Simpson, A.; Clapham, M.; Lucero, R.; West, J.

    2013-07-01

    It is a universal requirement for the characterization of radioactive waste that the consignor calculate and report a Total Measurement Uncertainty (TMU) value associated with each measured quantity, such as nuclide activity. For non-destructive assay systems, the TMU analysis is typically performed on an individual-container basis. However, in many cases the waste consignor treats, transports, stores and disposes of containers in groups, for example by over-packing smaller containers into a larger container or emplacing containers into groups for final disposal. The current standard practice for container-group data analysis is usually to treat each container as independent and uncorrelated and to use a simple summation/averaging method (or in some cases summation of TMU in quadrature) to define the overall characteristics and associated uncertainty of the container group. In reality, many groups of containers are assayed on the same system, so there will be a large degree of co-dependence among the individual uncertainty elements. Many uncertainty terms may be significantly reduced when addressing issues such as source position and variability in matrix contents over large populations. The systematic terms encompass both inherently 'two-directional' random effects (e.g. variation of source position) and other terms that are 'one-directional', i.e. designed to account for potential sources of bias. An analysis has been performed with population groups on a variety of non-destructive assay platforms in order to define a quantitative mechanism for waste consignors to determine the overall TMU for batches of containers that have been assayed on the same system. (authors)
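
    The distinction the abstract draws between uncorrelated and correlated terms can be sketched with standard uncertainty-propagation rules: random (uncorrelated) terms add in quadrature, while systematic terms shared across containers assayed on the same system add linearly. The function name and the split into two term lists are illustrative assumptions, not the authors' algorithm:

```python
import math

def group_tmu(random_terms, systematic_terms):
    """Combined uncertainty for a container group.

    random_terms     : per-container uncorrelated uncertainty components
                       (combine in quadrature)
    systematic_terms : components correlated across the group, e.g. a
                       shared calibration bias (combine linearly)
    """
    random_part = math.sqrt(sum(u ** 2 for u in random_terms))
    systematic_part = sum(abs(u) for u in systematic_terms)
    return math.sqrt(random_part ** 2 + systematic_part ** 2)
```

    Treating a genuinely systematic term as random understates the group TMU, which is the motivation for the group-based model described above.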

  15. Evaluation of agricultural best-management practices in the Conestoga River headwaters, Pennsylvania; methods of data collection and analysis and description of study areas

    USGS Publications Warehouse

    Chichester, Douglas C.

    1988-01-01

    The U.S. Geological Survey is conducting a water-quality study as part of the nationally implemented Rural Clean Water Program in the headwaters of the Conestoga River, Pennsylvania. The study, which began in 1982, was designed to determine the effect of agricultural best-management practices on surface- and groundwater quality. The study was concentrated in four areas within the intensively farmed, carbonate-rock terrane located predominately in Lancaster County, Pennsylvania. These areas were divided into three monitoring components: (1) a Regional study area (188 sq mi); (2) a Small Watershed study area (5.82 sq mi); and (3) two field-site study areas, Field-Site 1 (22.1 acres) and Field-Site 2 (47.5 acres). The types of water-quality data and the methods of data collection and analysis are presented. The monitoring strategy and a description of the study areas are discussed. The locations and descriptions of all data-collection sites in the four study areas are provided. (USGS)

  16. The Sherlock Holmes method in clinical practice.

    PubMed

    Sopeña, B

    2014-04-01

    This article lists the integral elements of the Sherlock Holmes method, which is based on the intelligent collection of information through detailed observation, careful listening and thorough examination. The information thus obtained is analyzed to develop the main and alternative hypotheses, which are shaped during the deductive process until the key leading to the solution is revealed. The Holmes investigative method applied to clinical practice highlights the advisability of having physicians reason through and seek out the causes of the disease with the data obtained from acute observation, a detailed review of the medical history and careful physical examination. PMID:24457141

  17. Selecting Needs Analysis Methods.

    ERIC Educational Resources Information Center

    Newstrom, John W.; Lilyquist, John M.

    1979-01-01

    Presents a contingency model for decision making with regard to needs analysis methods. Focus is on 12 methods with brief discussion of their defining characteristics and some operational guidelines for their use. (JOW)

  18. Practical Considerations for Using Exploratory Factor Analysis in Educational Research

    ERIC Educational Resources Information Center

    Beavers, Amy S.; Lounsbury, John W.; Richards, Jennifer K.; Huck, Schuyler W.; Skolits, Gary J.; Esquivel, Shelley L.

    2013-01-01

    The uses and methodology of factor analysis are widely debated and discussed, especially the issues of rotational use, methods of confirmatory factor analysis, and adequate sample size. The variety of perspectives and often conflicting opinions can lead to confusion among researchers about best practices for using factor analysis. The focus of the…

  19. Guidance for using mixed methods design in nursing practice research.

    PubMed

    Chiang-Hanisko, Lenny; Newman, David; Dyess, Susan; Piyakong, Duangporn; Liehr, Patricia

    2016-08-01

    The mixed methods approach purposefully combines both quantitative and qualitative techniques, enabling a multi-faceted understanding of nursing phenomena. The purpose of this article is to introduce three mixed methods designs (parallel; sequential; conversion) and highlight interpretive processes that occur with the synthesis of qualitative and quantitative findings. Real-world examples of research studies conducted by the authors demonstrate the processes leading to the merger of data. The examples include research questions, data collection procedures, and analysis with a focus on synthesizing findings. Based on experience with mixed methods studies, the authors introduce two synthesis patterns (complementary; contrasting), considering application for practice and implications for research. PMID:27397810

  20. Practical approaches for design and analysis of clinical trials of infertility treatments: crossover designs and the Mantel-Haenszel method are recommended.

    PubMed

    Takada, Michihiro; Sozu, Takashi; Sato, Tosiya

    2015-01-01

    Crossover designs have some advantages over standard clinical trial designs and they are often used in trials evaluating the efficacy of treatments for infertility. However, clinical trials of infertility treatments violate a fundamental condition of crossover designs, because women who become pregnant in the first treatment period are not treated in the second period. In previous research, to deal with this problem, some new designs, such as re-randomization designs, and analysis methods including the logistic mixture model and the beta-binomial mixture model were proposed. Although the performance of these designs and methods has previously been evaluated in large-scale clinical trials with sample sizes of more than 1000 per group, the actual sample sizes of infertility treatment trials are usually around 100 per group. The most appropriate design and analysis for these moderate-scale clinical trials are currently unclear. In this study, we conducted simulation studies to determine the appropriate design and analysis method of moderate-scale clinical trials for irreversible endpoints by evaluating the statistical power and bias in the treatment effect estimates. The Mantel-Haenszel method had similar power and bias to the logistic mixture model. The crossover designs had the highest power and the smallest bias. We recommend using a combination of the crossover design and the Mantel-Haenszel method for two-period, two-treatment clinical trials with irreversible endpoints. PMID:25776032
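
    The recommended Mantel-Haenszel method pools 2x2 tables across strata (here, the two treatment periods) into a common odds ratio. A minimal sketch follows; the table layout and example counts are illustrative assumptions, not data from the simulation study:

```python
def mantel_haenszel_or(strata):
    """Mantel-Haenszel common odds ratio.

    strata : list of 2x2 tables (a, b, c, d) where
             a, b = events / non-events under treatment,
             c, d = events / non-events under control.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den
```

    For a single stratum the estimator reduces to the ordinary odds ratio a*d / (b*c); its appeal in the crossover setting is that it pools period-specific tables without fitting a mixture model.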

  1. An Online Forum As a Qualitative Research Method: Practical Issues

    PubMed Central

    Im, Eun-Ok; Chee, Wonshik

    2008-01-01

    Background: Despite positive aspects of online forums as a qualitative research method, very little is known about the practical issues involved in using online forums for data collection, especially for a qualitative research project. Objectives: The purpose of this paper is to describe the practical issues that the researchers encountered in implementing an online forum as a qualitative component of a larger study on the cancer pain experience. Method: Throughout the study process, the research staff recorded issues, ranging from minor technical problems to serious ethical dilemmas, as they arose and wrote memos about them. The memos and written records of discussions were reviewed and analyzed using the content analysis suggested by Weber. Results: Two practical issues related to credibility were identified: a high response and retention rate, and automatic transcripts. An issue related to dependability was the participants' easy forgetfulness. The issues related to confirmability were difficulties in theoretical saturation and unstandardized computer and Internet jargon. A security issue related to hacking attempts was noted as well. Discussion: The analysis of these issues suggests several implications for future researchers who want to use online forums as a qualitative data collection method. PMID:16849979

  2. Comprehensive rotorcraft analysis methods

    NASA Technical Reports Server (NTRS)

    Stephens, Wendell B.; Austin, Edward E.

    1988-01-01

    The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustic, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).

  3. System based practice: a concept analysis

    PubMed Central

    YAZDANI, SHAHRAM; HOSSEINI, FAKHROLSADAT; AHMADY, SOLEIMAN

    2016-01-01

    Introduction: Systems-Based Practice (SBP) is one of the six competencies introduced by the ACGME for physicians to provide a high quality of care, and also the most challenging of them in performance, training, and evaluation of medical students. This concept analysis clarifies the concept of SBP by identifying its components, making it possible to differentiate it from similar concepts. For proper training in SBP and to ensure these competencies in physicians, it is necessary to have an operational definition, and SBP's components must be precisely defined in order to provide valid and reliable assessment tools. Methods: Walker & Avant's approach to concept analysis was performed in eight stages: choosing a concept, determining the purpose of analysis, identifying all uses of the concept, defining attributes, identifying a model case, identifying borderline, related, and contrary cases, identifying antecedents and consequences, and defining empirical referents. Results: Based on the analysis undertaken, the attributes of SBP include knowledge of the system, balanced decisions between patients' needs and system goals, effective role-playing in the interprofessional health care team, system-level health advocacy, and acting for system improvement. System thinking and a functional system are antecedents, and system goals are consequences. A model case, as well as borderline and contrary cases of SBP, has been introduced. Conclusion: The identification of SBP attributes in this study contributes to the body of knowledge on SBP and reduces the ambiguity of this concept, making it possible to apply it in the training of different medical specialties. Also, it would be possible to develop and use more precise tools to evaluate SBP competency by using the empirical referents of the analysis. PMID:27104198

  4. Achieving Integration in Mixed Methods Designs—Principles and Practices

    PubMed Central

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-01-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

  5. A collection of research reporting, theoretical analysis, and practical applications in science education: Examining qualitative research methods, action research, educator-researcher partnerships, and constructivist learning theory

    NASA Astrophysics Data System (ADS)

    Hartle, R. Todd

    2007-12-01

    Educator-researcher partnerships are increasingly being used to improve the teaching of science. Chapter 1 provides a summary of the literature concerning partnerships, and examines the justification of qualitative methods in studying these relationships. It also justifies the use of Participatory Action Research (PAR). Empirically-based studies of educator-researcher partnership relationships are rare despite investments in their implementation by the National Science Foundation (NSF) and others. Chapter 2 describes a qualitative research project in which participants in an NSF GK-12 fellowship program were studied using informal observations, focus groups, personal interviews, and journals to identify and characterize the cultural factors that influenced the relationships between the educators and researchers. These factors were organized into ten critical axes encompassing a range of attitudes, behaviors, or values defined by two stereotypical extremes. These axes were: (1) Task Dictates Context vs. Context Dictates Task; (2) Introspection vs. Extroversion; (3) Internal vs. External Source of Success; (4) Prior Planning vs. Implementation Flexibility; (5) Flexible vs. Rigid Time Sense; (6) Focused Time vs. Multi-tasking; (7) Specific Details vs. General Ideas; (8) Critical Feedback vs. Encouragement; (9) Short Procedural vs. Long Content Repetition; and (10) Methods vs. Outcomes are Well Defined. Another ten important stereotypical characteristics, which did not fit the structure of an axis, were identified and characterized. The educator stereotypes were: (1) Rapport/Empathy; (2) Like Kids; (3) People Management; (4) Communication Skills; and (5) Entertaining. The researcher stereotypes were: (1) Community Collaboration; (2) Focus Intensity; (3) Persistent; (4) Pattern Seekers; and (5) Curiosity/Skeptical. Chapter 3 summarizes the research presented in chapter 2 into a practical guide for participants and administrators of educator-researcher partnerships

  6. Systemic accident analysis: examining the gap between research and practice.

    PubMed

    Underwood, Peter; Waterson, Patrick

    2013-06-01

    The systems approach is arguably the dominant concept within accident analysis research. Viewing accidents as a result of uncontrolled system interactions, it forms the theoretical basis of various systemic accident analysis (SAA) models and methods. Despite the proposed benefits of SAA, such as an improved description of accident causation, evidence within the scientific literature suggests that these techniques are not being used in practice and that a research-practice gap exists. The aim of this study was to explore the issues stemming from research and practice which could hinder the awareness, adoption and usage of SAA. To achieve this, semi-structured interviews were conducted with 42 safety experts from ten countries and a variety of industries, including rail, aviation and maritime. This study suggests that the research-practice gap should be closed and efforts to bridge the gap should focus on ensuring that systemic methods meet the needs of practitioners and improving the communication of SAA research. PMID:23542136

  7. Discourse analysis in general practice: a sociolinguistic approach.

    PubMed

    Nessa, J; Malterud, K

    1990-06-01

    It is a simple but important fact that as general practitioners we talk to our patients. The quality of the conversation is of vital importance for the outcome of the consultation. The purpose of this article is to discuss a methodological tool borrowed from sociolinguistics--discourse analysis. To assess the suitability of this method for analysis of general practice consultations, the authors have performed a discourse analysis of one single consultation. Our experiences are presented here. PMID:2369986

  8. Description of practice as an ambulatory care nurse: psychometric properties of a practice-analysis survey.

    PubMed

    Baghi, Heibatollah; Panniers, Teresa L; Smolenski, Mary C

    2007-01-01

    Changes within nursing demand that a specialty conduct periodic, appropriate practice analyses to continually validate itself against preset standards. This study explicates practice analysis methods using ambulatory care nursing as an exemplar. Data derived from a focus group technique were used to develop a survey that was completed by 499 ambulatory care nurses. The validity of the instrument was assessed using principal components analysis; reliability was estimated using Cronbach's alpha coefficient. The focus group with ambulatory care experts produced 34 knowledge and activity statements delineating ambulatory care nursing practice. The survey data produced five factors accounting for 71% of variance in the data. The factors were identified as initial patient assessment, professional nursing issues and standards, client care management skills, technical/clinical skills, and system administrative operations. It was concluded that practice analyses delineate a specialty and provide input for certification examinations aimed at measuring excellence in a field of nursing. PMID:17665821
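
    The reliability estimate used in the study above, Cronbach's alpha, is straightforward to compute from raw item responses. A minimal sketch with made-up data (not the study's survey items):

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for internal consistency.

    scores: one row per respondent, one column per item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(scores[0])           # number of items
    def var(xs):                 # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)
```

    Perfectly consistent items yield alpha = 1.0; values above roughly 0.7 are conventionally taken as acceptable reliability for a survey scale.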

  9. Science Teaching Methods: A Rationale for Practices

    ERIC Educational Resources Information Center

    Osborne, Jonathan

    2011-01-01

    This article is a version of the talk given by Jonathan Osborne as the Association for Science Education (ASE) invited lecturer at the National Science Teachers' Association Annual Convention in San Francisco, USA, in April 2011. The article provides an explanatory justification for teaching about the practices of science in school science that…

  10. Method of analysis at the U.S. Geological Survey California Water Science Center, Sacramento Laboratory - determination of haloacetic acid formation potential, method validation, and quality-control practices

    USGS Publications Warehouse

    Zazzi, Barbara C.; Crepeau, Kathryn L.; Fram, Miranda S.; Bergamaschi, Brian A.

    2005-01-01

    An analytical method for the determination of haloacetic acid formation potential of water samples has been developed by the U.S. Geological Survey California Water Science Center Sacramento Laboratory. The haloacetic acid formation potential is measured by dosing water samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine. The haloacetic acids formed are bromochloroacetic acid, bromodichloroacetic acid, dibromochloroacetic acid, dibromoacetic acid, dichloroacetic acid, monobromoacetic acid, monochloroacetic acid, tribromoacetic acid, and trichloroacetic acid. They are extracted, methylated, and then analyzed using a gas chromatograph equipped with an electron capture detector. Method validation experiments were performed to determine the method accuracy, precision, and detection limit for each of the compounds. Method detection limits for these nine haloacetic acids ranged from 0.11 to 0.45 microgram per liter. Quality-control practices include the use of blanks, quality-control samples, calibration verification standards, surrogate recovery, internal standard, matrix spikes, and duplicates.

  11. Method of analysis and quality-assurance practices by the U. S. Geological Survey Organic Geochemistry Research Group; determination of four selected mosquito insecticides and a synergist in water using liquid-liquid extraction and gas chrom

    USGS Publications Warehouse

    Zimmerman, L.R.; Strahan, A.P.; Thurman, E.M.

    2001-01-01

    A method of analysis and quality-assurance practices were developed for the determination of four mosquito insecticides (malathion, methoprene, phenothrin, and resmethrin) and one synergist (piperonyl butoxide) in water. The analytical method uses liquid-liquid extraction (LLE) and gas chromatography/mass spectrometry (GC/MS). Good precision and accuracy were demonstrated in reagent water, urban surface water, and ground water. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 50 nanograms per liter ranged from 68 to 171 percent, with standard deviations in concentrations of 27 nanograms per liter or less. The method detection limit for all compounds was 5.9 nanograms per liter or less for 247-milliliter samples. This method is valuable for acquiring information about the fate and transport of these mosquito insecticides and the synergist in water.

  12. Generalized Multicoincidence Analysis Methods

    SciTech Connect

    Warren, Glen A.; Smith, Leon E.; Aalseth, Craig E.; Ellis, J. E.; Valsan, Andrei B.; Mengesha, Wondwosen

    2005-10-01

    The ability to conduct automated trace radionuclide analysis at or near the sample collection point would provide a valuable tool for emergency response, nuclear forensics and environmental monitoring. Pacific Northwest National Laboratory is developing systems for this purpose based on dual gamma-ray spectrometers, e.g. NaI(Tl) or HPGe, combined with thin organic scintillator sensors to detect light charged particles. Translating the coincident signatures recorded by these systems, which include beta-gamma, gamma-gamma, and beta-gamma-gamma, into the concentration of detectable radionuclides in the sample requires generalized multicoincidence analysis tools. The development and validation of the Coincidence Lookup Library, which currently contains the probabilities of single and coincidence signatures from more than 420 isotopes, is described. Also discussed is a method to calculate the probability of observing a coincidence signature which incorporates true coincidence summing effects. These effects are particularly important for high-geometric-efficiency detection systems. Finally, a process for validating the integrated analysis software package is demonstrated using GEANT 4 simulations of the prototype detector systems.

  13. Generalized Multicoincidence Analysis Methods

    SciTech Connect

    Warren, Glen A.; Smith, Leon E.; Aalseth, Craig E.; Ellis, J. E.; Valsan, Andrei B.; Mengesha, Wondwosen

    2006-02-01

    The ability to conduct automated trace radionuclide analysis at or near the sample collection point would provide a valuable tool for emergency response, environmental monitoring, and verification of treaties and agreements. Pacific Northwest National Laboratory is developing systems for this purpose based on dual gamma-ray spectrometers, e.g. NaI(Tl) or HPGe, combined with thin organic scintillator sensors to detect light charged particles. Translating the coincident signatures recorded by these systems, which include beta-gamma, gamma-gamma and beta-gamma-gamma, into the concentration of detectable radionuclides in the sample requires generalized multicoincidence analysis tools. The development and validation of the Coincidence Lookup Library, which currently contains the probabilities of single and coincidence signatures from more than 420 isotopes, is described. Also discussed is a method to calculate the probability of observing a coincidence signature which incorporates true coincidence summing effects. These effects are particularly important for high-geometric-efficiency detection systems. Finally, a process for verifying the integrated analysis software package is demonstrated using GEANT 4 simulations of the prototype detector systems.

  14. [The analysis of the medication error, in practice].

    PubMed

    Didelot, Nicolas; Cistio, Céline

    2016-01-01

    By performing a systemic analysis of medication errors which occur in practice, the multidisciplinary teams can avoid a reoccurrence with the aid of an improvement action plan. The methods must take into account all the factors which might have contributed to or favoured the occurrence of a medication incident or accident. PMID:27177485

  15. Exploratory and Confirmatory Analysis of the Trauma Practices Questionnaire

    ERIC Educational Resources Information Center

    Craig, Carlton D.; Sprang, Ginny

    2009-01-01

    Objective: The present study provides psychometric data for the Trauma Practices Questionnaire (TPQ). Method: A nationally randomized sample of 2,400 surveys was sent to self-identified trauma treatment specialists, and 711 (29.6%) were returned. Results: An exploratory factor analysis (N = 319) conducted on a randomly split sample (RSS) revealed…

  16. Council on Certification Professional Practice Analysis.

    PubMed

    Zaglaniczny, K L

    1993-06-01

    The CCNA has completed a PPA and will begin implementing its recommendations with the December 1993 certification examination. The results of the PPA provide content validation for the CCNA certification examination. The certification examination is reflective of the knowledge and skill required for entry-level practice. Assessment of this knowledge is accomplished through the use of questions that are based on the areas represented in the content outline. Analysis of the PPA has resulted in changes in the examination content outline and percentages of questions in each area to reflect current entry-level nurse anesthesia practice. The new outline is based on the major domains of knowledge required for nurse anesthesia practice. These changes are justified by the consistency in the responses of the practitioners surveyed. There was overall agreement as to the knowledge and skills related to patient conditions, procedures, agents, techniques, and equipment that an entry-level CRNA must have to practice. Members of the CCNA and Examination Committee will use the revised outline to develop questions for the certification examination. The questions will be focused on the areas identified as requiring high levels of expertise and those that appeared higher in frequency. The PPA survey will be used as a basis for subsequent content validation studies. It will be revised to reflect new knowledge, technology, and techniques related to nurse anesthesia practice. The CCNA has demonstrated its commitment to the certification process through completion of the PPA and implementation of changes in the structure of the examination. PMID:8291387

  17. Scenistic Methods for Training: Applications and Practice

    ERIC Educational Resources Information Center

    Lyons, Paul R.

    2011-01-01

    Purpose: This paper aims to complement an earlier article (2010) in "Journal of European Industrial Training" in which the description and theory bases of scenistic methods were presented. This paper also offers a description of scenistic methods and information on theory bases. However, the main thrust of this paper is to describe, give suggested…

  18. A practical method for sensor absolute calibration.

    PubMed

    Meisenholder, G W

    1966-04-01

    This paper describes a method of performing sensor calibrations using an NBS standard of spectral irradiance. The method shown, among others, was used for calibration of the Mariner IV Canopus sensor. Agreement of inflight response to preflight calibrations performed by this technique has been found to be well within 10%. PMID:20048890

  19. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  20. Using Developmental Evaluation Methods with Communities of Practice

    ERIC Educational Resources Information Center

    van Winkelen, Christine

    2016-01-01

    Purpose: This paper aims to explore the use of developmental evaluation methods with community of practice programmes experiencing change or transition to better understand how to target support resources. Design/methodology/approach: The practical use of a number of developmental evaluation methods was explored in three organizations over a…

  1. Teachers' Beliefs and Technology Practices: A Mixed-Methods Approach

    ERIC Educational Resources Information Center

    Palak, Deniz; Walls, Richard T.

    2009-01-01

    In a sequential mixed methods design, we sought to examine the relationship between teachers' beliefs and their instructional technology practices among technology-using teachers who worked at technology-rich schools to ultimately describe if change in practice toward a student-centered paradigm occurred. The integrated mixed-methods results…

  2. Airphoto analysis of erosion control practices

    NASA Technical Reports Server (NTRS)

    Morgan, K. M.; Morris-Jones, D. R.; Lee, G. B.; Kiefer, R. W.

    1980-01-01

    The Universal Soil Loss Equation (USLE) is a widely accepted tool for erosion prediction and conservation planning. In this study, airphoto analysis of color and color infrared 70 mm photography at a scale of 1:60,000 was used to determine the erosion control practice factor in the USLE. Information about contour tillage, contour strip cropping, and grass waterways was obtained from aerial photography for Pheasant Branch Creek watershed in Dane County, Wisconsin.

  3. Practice-Focused Ethnographies of Higher Education: Method/ological Corollaries of a Social Practice Perspective

    ERIC Educational Resources Information Center

    Trowler, Paul Richard

    2014-01-01

    Social practice theory addresses both theoretical and method/ological agendas. To date priority has been given to the former, with writing on the latter tending often to be an afterthought to theoretical expositions or fieldwork accounts. This article gives sustained attention to the method/ological corollaries of a social practice perspective. It…

  4. A Practical Guide to Immunoassay Method Validation

    PubMed Central

    Andreasson, Ulf; Perret-Liaudet, Armand; van Waalwijk van Doorn, Linda J. C.; Blennow, Kaj; Chiasserini, Davide; Engelborghs, Sebastiaan; Fladby, Tormod; Genc, Sermin; Kruse, Niels; Kuiperij, H. Bea; Kulic, Luka; Lewczuk, Piotr; Mollenhauer, Brit; Mroczko, Barbara; Parnetti, Lucilla; Vanmechelen, Eugeen; Verbeek, Marcel M.; Winblad, Bengt; Zetterberg, Henrik; Koel-Simmelink, Marleen; Teunissen, Charlotte E.

    2015-01-01

    Biochemical markers have a central position in the diagnosis and management of patients in clinical medicine, and also in clinical research and drug development, including for brain disorders such as Alzheimer’s disease. The enzyme-linked immunosorbent assay (ELISA) is frequently used for measurement of low-abundance biomarkers. However, the quality of ELISA methods varies, which may introduce both systematic and random errors. This underscores the need for more rigorous control of assay performance, regardless of its use in a research setting, in clinical routine, or in drug development. The aim of a method validation is to present objective evidence that a method fulfills the requirements for its intended use. Although much has been published on which parameters to investigate in a method validation, less is available at a detailed level on how to perform the corresponding experiments. To remedy this, standard operating procedures (SOPs) with step-by-step instructions for a number of different validation parameters are included in the present work together with a validation report template, which allows for a well-ordered presentation of the results. Even though the SOPs were developed with the intended use for immunochemical methods and to be used for multicenter evaluations, most of them are generic and can be used for other technologies as well. PMID:26347708

  5. ALFRED: A Practical Method for Alignment-Free Distance Computation.

    PubMed

    Thankachan, Sharma V; Chockalingam, Sriram P; Liu, Yongchao; Apostolico, Alberto; Aluru, Srinivas

    2016-06-01

    Alignment-free approaches are gaining sustained interest in many sequence analysis applications such as phylogenetic inference and metagenomic classification/clustering, especially for large-scale sequence datasets. Besides the widely used k-mer methods, the average common substring (ACS) approach has emerged as one of the well-known alignment-free approaches. Two recent works further generalize this ACS approach by allowing a bounded number k of mismatches in the common substrings, relying on approximation (linear time) and exact computation, respectively. Despite its good worst-case time complexity [Formula: see text], the exact approach is complex and unlikely to be efficient in practice. Herein, we present ALFRED, an alignment-free distance computation method, which solves the generalized common substring search problem via exact computation. Compared to the theoretical approach, our algorithm is easier to implement and more practical to use, while still providing highly competitive theoretical performance with an expected run-time of [Formula: see text]. Applying our program to phylogenetic inference as a case study, we find that it exactly reconstructs the topology of the reference phylogenetic tree for a set of 27 primate mitochondrial genomes at acceptable speed. ALFRED is implemented in the C++ programming language and the source code is freely available online. PMID:27138275
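
    For readers unfamiliar with the ACS measure that ALFRED generalizes, a naive quadratic-time sketch of the exact-match base case (no mismatches, none of ALFRED's suffix-structure speedups) looks like this:

```python
import math

def acs(x, y):
    """Average common substring length: for each position i of x, the
    length of the longest prefix of x[i:] occurring somewhere in y,
    averaged over i. Naive scan; real tools use suffix structures."""
    total = 0
    for i in range(len(x)):
        length = 0
        while i + length < len(x) and x[i:i + length + 1] in y:
            length += 1
        total += length
    return total / len(x)

def acs_distance(x, y):
    """Symmetrized ACS distance (Ulitsky-style normalization).
    Assumes the sequences share at least one character, so acs() > 0."""
    def d(a, b):
        return math.log(len(b)) / acs(a, b) - math.log(len(a)) / acs(a, a)
    return (d(x, y) + d(y, x)) / 2
```

    Identical sequences get distance zero (acs(x, x) is simply the mean suffix length), while unrelated sequences score high; the k-mismatch generalization the paper addresses relaxes the exact-match condition inside the inner scan.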

  6. Pragmatism in practice: mixed methods research for physiotherapy.

    PubMed

    Shaw, James A; Connelly, Denise M; Zecevic, Aleksandra A

    2010-11-01

    The purpose of this paper is to provide an argument for the place of mixed methods research across practice settings as an effective means of supporting evidence-based practice in physiotherapy. Physiotherapy practitioners use both qualitative and quantitative methods throughout the process of patient care-from history taking, assessment, and intervention to evaluation of outcomes. Research on practice paradigms demonstrates the importance of mixing qualitative and quantitative methods to achieve 'expert practice' that is concerned with optimizing outcomes and incorporating patient beliefs and values. Research paradigms that relate to this model of practice would integrate qualitative and quantitative types of knowledge and inquiry, while maintaining a prioritized focus on patient outcomes. Pragmatism is an emerging research paradigm where practical consequences and the effects of concepts and behaviors are vital components of meaning and truth. This research paradigm supports the simultaneous use of qualitative and quantitative methods of inquiry to generate evidence to support best practice. This paper demonstrates that mixed methods research with a pragmatist view provides evidence that embraces and addresses the multiple practice concerns of practitioners better than either qualitative or quantitative research approaches in isolation. PMID:20649500

  7. Practical aspects of spatially high accurate methods

    NASA Technical Reports Server (NTRS)

    Godfrey, Andrew G.; Mitchell, Curtis R.; Walters, Robert W.

    1992-01-01

    The computational qualities of high order spatially accurate methods for the finite volume solution of the Euler equations are presented. Two dimensional essentially non-oscillatory (ENO), k-exact, and 'dimension by dimension' ENO reconstruction operators are discussed and compared in terms of reconstruction and solution accuracy, computational cost and oscillatory behavior in supersonic flows with shocks. Inherent steady state convergence difficulties are demonstrated for adaptive stencil algorithms. An exact solution to the heat equation is used to determine reconstruction error, and the computational intensity is reflected in operation counts. Standard MUSCL differencing is included for comparison. Numerical experiments presented include the Ringleb flow for numerical accuracy and a shock reflection problem. A vortex-shock interaction demonstrates the ability of the ENO scheme to excel in simulating unsteady high-frequency flow physics.

  8. Practice and Evaluation of Blended Learning with Cross-Cultural Distance Learning in a Foreign Language Class: Using Mix Methods Data Analysis

    ERIC Educational Resources Information Center

    Sugie, Satoko; Mitsugi, Makoto

    2014-01-01

    The Information and Communication Technology (ICT) utilization in Chinese as a "second" foreign language has mainly been focused on Learning Management System (LMS), digital material development, and quantitative analysis of learners' grammatical knowledge. There has been little research that has analyzed the effectiveness of…

  9. Presence in nursing practice: a concept analysis.

    PubMed

    Hessel, Joy A

    2009-01-01

    Presence is an elusive concept in nursing practice that has been recognized as advantageous in the patient experience. Dictionary sources define presence as being with and attending to another; involvement, companionship. Nursing scholars and theorists have elaborated on the dictionary definition of presence to include a holistic definition inclusive of the patient experience and the connection experienced between both patient and provider. However, despite attempts to define presence as it relates to nursing practice, a definition that completely encompasses the substantial benefits on the patient experience is yet to be developed. As guided by Walker and Avant, this concept analysis was performed by selection of a concept, determination of the purpose of the analysis, evaluation of existing definitions, identification of defining attributes of the concept, formulation of patient cases that epitomize and contrast the concept, and identification of antecedents and empirical referents of the concept. Thus, in this concept analysis article, existing definitions of presence will be recognized and evaluated, cases demonstrating nursing presence explored, and a definition of presence in nursing developed. PMID:19713785

  10. Practice-Near and Practice-Distant Methods in Human Services Research

    ERIC Educational Resources Information Center

    Froggett, Lynn; Briggs, Stephen

    2012-01-01

    This article discusses practice-near research in human services, a cluster of methodologies that may include thick description, intensive reflexivity, and the study of emotional and relational processes. Such methods aim to get as near as possible to experiences at the relational interface between institutions and the practice field.…

  11. Reflections on Experiential Teaching Methods: Linking the Classroom to Practice

    ERIC Educational Resources Information Center

    Wehbi, Samantha

    2011-01-01

    This article explores the use of experiential teaching methods in social work education. The literature demonstrates that relying on experiential teaching methods in the classroom can have overwhelmingly positive learning outcomes; however, not much is known about the possible effect of these classroom methods on practice. On the basis of…

  12. Communication Network Analysis Methods.

    ERIC Educational Resources Information Center

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  13. Optimizing Distributed Practice: Theoretical Analysis and Practical Implications

    ERIC Educational Resources Information Center

    Cepeda, Nicholas J.; Coburn, Noriko; Rohrer, Doug; Wixted, John T.; Mozer, Michael C.; Pashler, Harold

    2009-01-01

    More than a century of research shows that increasing the gap between study episodes using the same material can enhance retention, yet little is known about how this so-called distributed practice effect unfolds over nontrivial periods. In two three-session laboratory studies, we examined the effects of gap on retention of foreign vocabulary,…

  14. Development of a method to analyze orthopaedic practice expenses.

    PubMed

    Brinker, M R; Pierce, P; Siegel, G

    2000-03-01

    The purpose of the current investigation was to present a standard method by which an orthopaedic practice can analyze its practice expenses. To accomplish this, a five-step process was developed to analyze practice expenses using a modified version of activity-based costing. In this method, general ledger expenses were assigned to 17 activities that encompass all the tasks and processes typically performed in an orthopaedic practice. These 17 activities were identified in a practice expense study conducted for the American Academy of Orthopaedic Surgeons. To calculate the cost of each activity, financial data were used from a group of 19 orthopaedic surgeons in Houston, Texas. The activities that consumed the largest portion of the employee work force (person hours) were service patients in office (25.0% of all person hours), maintain medical records (13.6% of all person hours), and resolve collection disputes and rebill charges (12.3% of all person hours). The activities that comprised the largest portion of the total expenses were maintain facility (21.4%), service patients in office (16.0%), and sustain business by managing and coordinating practice (13.8%). The five-step process of analyzing practice expenses was relatively easy to perform and it may be used reliably by most orthopaedic practices. PMID:10738440
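
    The core allocation step of such an activity-based costing method can be sketched as driver-based apportionment of a ledger expense across activities (illustrative figures, not the study's data):

```python
def allocate(ledger_total, driver_hours):
    """Activity-based costing allocation: spread a general-ledger
    expense across activities in proportion to the person-hours
    (the cost driver) each activity consumes."""
    total_hours = sum(driver_hours.values())
    return {activity: ledger_total * hours / total_hours
            for activity, hours in driver_hours.items()}

# Hypothetical person-hour shares for three of the 17 activities.
hours = {"service patients in office": 250,
         "maintain medical records": 136,
         "resolve collection disputes": 123}
costs = allocate(100_000.0, hours)
```

    The allocated amounts always sum back to the ledger total, which is what makes the 17-activity breakdown reconcilable with the practice's books.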

  15. Complementing Gender Analysis Methods.

    PubMed

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start with the premise that men and women are equal and should be treated equally. These frameworks emphasize equal distribution of resources between men and women and assume that this will bring equality, which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives, such as the power to control resources within social relationships and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks hold that patriarchy as an institution plays an important role in women's oppression and exploitation, and is a barrier to their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on a proposed equality principle that puts men and women in competing roles, so real equality will never be achieved. Contrary to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach to gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but also suggests incorporating the concept and role of social capital, equity, and doing gender into gender analysis. It is based on a perceived equity principle that puts men and women in complementing roles, which may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued on the basis of existing literature and an anecdote of observations made by the author. While criticizing equality theory, the author offers equity theory to resolve the gender conflict, using the concepts of social and psychological capital. PMID:25941756

  16. Methods of analysis and quality-assurance practices of the U.S. Geological Survey organic laboratory, Sacramento, California; determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Domagalski, Joseph L.; Kuivila, Kathryn M.

    1994-01-01

    An analytical method and quality-assurance practices were developed for a study of the fate and transport of pesticides in the Sacramento-San Joaquin Delta and the Sacramento and San Joaquin Rivers. Water samples were filtered to remove suspended particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide, and the pesticides were eluted with three 2-milliliter aliquots of hexane:diethyl ether (1:1). The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for analytes determined per 1,500-milliliter samples ranged from 0.006 to 0.047 microgram per liter. Recoveries ranged from 47 to 89 percent for 12 pesticides in organic-free, Sacramento River, and San Joaquin River water samples fortified at 0.05 and 0.26 microgram per liter. The method was modified to improve pesticide recovery by reducing the sample volume to 1,000 milliliters. Internal standards were added to improve quantitative precision and accuracy. The analysis also was expanded to include a total of 21 pesticides. The method detection limits for 1,000-milliliter samples ranged from 0.022 to 0.129 microgram per liter. Recoveries ranged from 38 to 128 percent for 21 pesticides in organic-free, Sacramento River, and San Joaquin River water samples fortified at 0.10 and 0.75 microgram per liter.

  17. Practical Issues in Component Aging Analysis

    SciTech Connect

    Dana L. Kelly; Andrei Rodionov; Jens Uwe-Klugel

    2008-09-01

    This paper examines practical issues in the statistical analysis of component aging data. These issues center on the stochastic process chosen to model component failures. The two stochastic processes examined are repair same as new, leading to a renewal process, and repair same as old, leading to a nonhomogeneous Poisson process. Under the first assumption, times between failures can be treated as statistically independent observations from a stationary process. The common distribution of the times between failures is called the renewal distribution. Under the second process, the times between failures will not be independently and identically distributed, and one cannot simply fit a renewal distribution to the cumulative failure times or the times between failures. The paper illustrates how the assumption made regarding the repair process is crucial to the analysis. Besides the choice of stochastic process, other issues that are discussed include qualitative graphical analysis and simple nonparametric hypothesis tests to help judge which process appears more appropriate. Numerical examples are presented to illustrate the issues discussed in the paper.
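
    One simple diagnostic of the kind discussed, for judging renewal versus nonhomogeneous Poisson behavior, is the Laplace trend test. A sketch, assuming failures are observed over the interval (0, T]:

```python
import math

def laplace_trend_statistic(failure_times, T):
    """Laplace test for trend in failure times observed on (0, T].
    U is approximately standard normal under a homogeneous Poisson
    process: U near 0 favors a renewal (no-trend) model, while a
    large positive U indicates deteriorating reliability, pointing
    toward a nonhomogeneous Poisson (repair same as old) model."""
    n = len(failure_times)
    return (sum(failure_times) / n - T / 2) / (T * math.sqrt(1.0 / (12 * n)))
```

    Failures spread evenly over the observation window give U near zero, whereas failures bunched near the end of the window give a large positive U.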

  18. Learning to Teach within Practice-Based Methods Courses

    ERIC Educational Resources Information Center

    Kazemi, Elham; Waege, Kjersti

    2015-01-01

    Supporting prospective teachers to enact high quality instruction requires transforming their methods preparation. This study follows three teachers through a practice-based elementary methods course. Weekly class sessions took place in an elementary school. The setting afforded opportunities for prospective teachers to engage in cycles of…

  19. Multi-criteria decision analysis: Limitations, pitfalls, and practical difficulties

    SciTech Connect

    Kujawski, Edouard

    2003-02-01

    The 2002 Winter Olympics women's figure skating competition is used as a case study to illustrate some of the limitations, pitfalls, and practical difficulties of Multi-Criteria Decision Analysis (MCDA). The paper compares several widely used models for synthesizing the multiple attributes into a single aggregate value. The various MCDA models can provide conflicting rankings of the alternatives for a common set of information even under states of certainty. Analysts involved in MCDA need to deal with the following challenging tasks: (1) selecting an appropriate analysis method, and (2) properly interpreting the results. An additional trap is the availability of software tools that implement specific MCDA models that can beguile the user with quantitative scores. These conclusions are independent of the decision domain and they should help foster better MCDA practices in many fields including systems engineering trade studies.
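The ranking-reversal point can be demonstrated in a few lines: with identical inputs, an additive (weighted-sum) model and a multiplicative (weighted-product) model rank two alternatives differently. The scores and weights below are invented for illustration:

```python
# Two alternatives scored on two equally weighted criteria (0-1 scale).
weights = [0.5, 0.5]
a = [0.9, 0.3]    # strong on one criterion, weak on the other
b = [0.55, 0.55]  # balanced on both

def additive(scores, w):
    """Weighted-sum model (simple additive weighting)."""
    return sum(wi * si for wi, si in zip(w, scores))

def multiplicative(scores, w):
    """Weighted-product model; penalizes poor performance on any criterion."""
    prod = 1.0
    for wi, si in zip(w, scores):
        prod *= si ** wi
    return prod

# additive:       a = 0.60  > b = 0.55   -> a ranked first
# multiplicative: a ~ 0.52  < b = 0.55   -> b ranked first
```

Both aggregations are defensible, yet they disagree under complete certainty, which is precisely the pitfall the paper illustrates with the figure-skating scores.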

  20. Breath analysis: translation into clinical practice.

    PubMed

    Brodrick, Emma; Davies, Antony; Neill, Paul; Hanna, Louise; Williams, E Mark

    2015-06-01

    Breath analysis in respiratory disease is a non-invasive technique which has the potential to complement or replace current screening and diagnostic techniques without inconvenience or harm to the patient. Recent advances in ion mobility spectrometry (IMS) have allowed exhaled breath to be analysed rapidly, reliably and robustly thereby facilitating larger studies of exhaled breath profiles in clinical environments. Preliminary studies have demonstrated that volatile organic compound (VOC) breath profiles of people with respiratory disease can be distinguished from healthy control groups but there is a need to validate, standardise and ensure comparability between laboratories before real-time breath analysis becomes a clinical reality. It is also important that breath sampling procedures and methodologies are developed in conjunction with clinicians and the practicalities of working within the clinical setting are considered to allow the full diagnostic potential of these techniques to be realised. A protocol is presented, which has been developed over three years and successfully deployed for quickly and accurately collecting breath samples from 323 respiratory patients recruited from 10 different secondary health care clinics. PMID:25971863

  1. Farmers' Preferences for Methods of Receiving Information on New or Innovative Farming Practices.

    ERIC Educational Resources Information Center

    Riesenberg, Lou E.; Gor, Christopher Obel

    1989-01-01

    Survey of 386 Idaho farmers (response rate 58 percent) identified preferred methods of receiving information on new or innovative farming practices. Analysis revealed preference for interpersonal methods (demonstrations, tours, and field trips) over mass media such as computer-assisted instruction (CAI) and home study, although younger farmers,…

  2. Traditional Methods for Mineral Analysis

    NASA Astrophysics Data System (ADS)

    Ward, Robert E.; Carpenter, Charles E.

    This chapter describes traditional methods for analysis of minerals involving titrimetric and colorimetric procedures, and the use of ion selective electrodes. Other traditional methods of mineral analysis include gravimetric analysis (i.e., insoluble forms of minerals are precipitated, rinsed, dried, and weighed) and redox reactions (i.e., the mineral is part of an oxidation-reduction reaction, and the product is quantitated). However, these latter two methods will not be covered because they are currently little used in the food industry. The traditional methods that will be described have maintained widespread usage in the food industry despite the development of more modern instrumentation such as atomic absorption spectroscopy and inductively coupled plasma-atomic emission spectroscopy (Chap. 24). Traditional methods generally require chemicals and equipment that are routinely available in an analytical laboratory and are within the experience of most laboratory technicians. Additionally, traditional methods often form the basis for rapid analysis kits (e.g., Quantab® for salt determination) that are increasingly in demand. Procedures for analysis of minerals of major nutritional or food processing concern are used for illustrative purposes. For additional examples of traditional methods refer to references (1-6). Slight modifications of these traditional methods are often needed for specific foodstuffs to minimize interferences or to keep analytes within the method's analytical range. For analytical requirements for specific foods see the Official Methods of Analysis of AOAC International (5) and related official methods (6).
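As an example of the stoichiometric arithmetic behind a titrimetric mineral determination, the sketch below converts a hypothetical Mohr (AgNO3) chloride titration into percent salt by weight. All figures and the function name are ours, invented for illustration:

```python
def salt_from_mohr_titration(titrant_ml, molarity_agno3, sample_g):
    """Mohr titration stoichiometry: moles of AgNO3 delivered equal moles of
    Cl- titrated (1:1), reported here as percent NaCl by weight using the
    molar mass of NaCl, 58.44 g/mol."""
    moles_cl = titrant_ml / 1000.0 * molarity_agno3
    return moles_cl * 58.44 / sample_g * 100.0

# Hypothetical run: 12.5 mL of 0.1 M AgNO3 against a 5.0 g food sample.
pct = salt_from_mohr_titration(12.5, 0.1, 5.0)  # ~1.46 % NaCl
```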

  3. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
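A minimal Monte Carlo sketch of the probability-of-instability idea (plain sampling, not the paper's fast probability integration or adaptive importance sampling): sample uncertain parameters of a two-degree-of-freedom rotor model with cross-coupled stiffness and count eigenvalue-detected instability. All numerical values are illustrative assumptions:

```python
import numpy as np

def is_unstable(M, C, K):
    """Companion (first-order) form of M x'' + C x' + K x = 0; the system is
    unstable if any eigenvalue of the state matrix has a positive real part."""
    n = M.shape[0]
    Minv = np.linalg.inv(M)
    A = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-Minv @ K, -Minv @ C]])
    return np.max(np.linalg.eigvals(A).real) > 0.0

rng = np.random.default_rng(0)
n_runs = 2000
n_unstable = 0
for _ in range(n_runs):
    c = rng.normal(2.0, 0.5)    # uncertain damping coefficient
    q = rng.normal(19.0, 3.0)   # uncertain cross-coupled stiffness
    K = np.array([[100.0, q], [-q, 100.0]])
    n_unstable += is_unstable(np.eye(2), c * np.eye(2), K)
p_instability = n_unstable / n_runs
```

For this toy model the analytic instability threshold is q > c * sqrt(k) = 10 c, so with the distributions above roughly 40 percent of the samples land unstable; the methods in the paper are designed to reach such probabilities with far fewer model runs than plain sampling needs.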

  4. Toward a practical approach for ergodicity analysis

    NASA Astrophysics Data System (ADS)

    Wang, H.; Wang, C.; Zhao, Y.; Lin, X.; Yu, C.

    2015-09-01

    It is important to perform hydrological forecasts using a finite hydrological time series. Most time series analysis approaches presume a data series to be ergodic without justifying this assumption. This paper presents a practical approach to analyzing the mean ergodic property of hydrological processes by means of autocorrelation function evaluation, the Augmented Dickey-Fuller test, a radial basis function neural network, and the definition of mean ergodicity. The mean ergodicity of precipitation processes at the Lanzhou Rain Gauge Station in the Yellow River basin and the Ankang Rain Gauge Station in the Han River basin, both in China, and at Newberry, MI, USA, is analyzed using the proposed approach. The results indicate that the precipitation of March, July, and August in Lanzhou, and of May, June, and August in Ankang, has mean ergodicity, whereas the precipitation of the other calendar months at these two rain gauge stations does not. The precipitation of February, May, July, and December in Newberry shows the ergodic property, although the precipitation of each month shows a clear increasing or decreasing trend.
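The core of a mean-ergodicity check can be sketched by comparing time averages across independent realizations: for an ergodic process they concentrate around the ensemble mean, while for a non-ergodic one they do not, no matter how long the record. The AR(1) and random-level processes below are textbook stand-ins, not the paper's precipitation data:

```python
import random
import statistics

def time_average(series):
    return sum(series) / len(series)

def ar1(phi, n, rng):
    """Ergodic AR(1): autocorrelation decays geometrically, so time averages
    converge to the ensemble mean (zero here) as the record lengthens."""
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, 1.0)
        out.append(x)
    return out

def random_level(n, rng):
    """Non-ergodic process: each realization sits at a random level A, so its
    time average equals A rather than the ensemble mean of zero."""
    a = rng.gauss(0.0, 1.0)
    return [a] * n

rng = random.Random(7)
means_ar1 = [time_average(ar1(0.5, 5000, rng)) for _ in range(50)]
means_level = [time_average(random_level(5000, rng)) for _ in range(50)]
spread_ar1 = statistics.pstdev(means_ar1)      # shrinks toward 0 with record length
spread_level = statistics.pstdev(means_level)  # stays near 1 regardless of length
```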

  5. Practical Teaching Methods K-6: Sparking the Flame of Learning.

    ERIC Educational Resources Information Center

    Wilkinson, Pamela Fannin.; McNutt, Margaret A.; Friedman, Esther S.

    This book provides state-of-the-art teaching practices and methods, discussing the elements of good teaching in the content areas and including examples from real classrooms and library media centers. Chapters offer reflection exercises, assessment tips specific to each curriculum, and resource lists. Nine chapters examine: (1) "The Premise"…

  6. Retrieval practice can eliminate list method directed forgetting.

    PubMed

    Abel, Magdalena; Bäuml, Karl-Heinz T

    2016-01-01

    It has recently been shown that retrieval practice can reduce memories' susceptibility to interference, like retroactive and proactive interference. In this study, we therefore examined whether retrieval practice can also reduce list method directed forgetting, a form of intentional forgetting that presupposes interference. In each of two experiments, subjects successively studied two lists of items. After studying each single list, subjects restudied the list items to enhance learning, or they were asked to recall the items. Following restudy or retrieval practice of list 1 items, subjects were cued to either forget the list or remember it for an upcoming final test. Experiment 1 employed a free-recall and Experiment 2 a cued-recall procedure on the final memory test. In both experiments, directed forgetting was present in the restudy condition but was absent in the retrieval-practice condition, indicating that retrieval practice can reduce or even eliminate this form of forgetting. The results are consistent with the view that retrieval practice enhances list segregation processes. Such processes may reduce interference between lists and thus reduce directed forgetting. PMID:26286882

  7. Practicing the practice: Learning to guide elementary science discussions in a practice-oriented science methods course

    NASA Astrophysics Data System (ADS)

    Shah, Ashima Mathur

    University methods courses are often criticized for telling pre-service teachers, or interns, about the theories behind teaching instead of preparing them to actually enact teaching. Shifting teacher education to be more "practice-oriented," or to focus more explicitly on the work of teaching, is a current trend for re-designing the way we prepare teachers. This dissertation addresses the current need for research that unpacks the shift to more practice-oriented approaches by studying the content and pedagogical approaches in a practice-oriented, masters-level elementary science methods course (n=42 interns). The course focused on preparing interns to guide science classroom discussions. Qualitative data, such as video records of course activities and interns' written reflections, were collected across eight course sessions. Codes were applied at the sentence and paragraph level and then grouped into themes. Five content themes were identified: foregrounding student ideas and questions, steering discussion toward intended learning goals, supporting students to do the cognitive work, enacting the teacher role of facilitator, and creating a classroom culture for science discussions. Three pedagogical approach themes were identified. First, the teacher educators created images of science discussions by modeling and showing videos of this practice. They also provided focused teaching experiences by helping interns practice the interactive aspects of teaching both in the methods classroom and with smaller groups of elementary students in schools. Finally, they structured the planning and debriefing phases of teaching so interns could learn from their teaching experiences and prepare well for future experiences. The findings were analyzed through the lens of Grossman and colleagues' framework for teaching practice (2009) to reveal how the pedagogical approaches decomposed, represented, and approximated practice throughout course activities. Also, the teacher educators…

  8. Genre Analysis, ESP and Professional Practice

    ERIC Educational Resources Information Center

    Bhatia, Vijay K.

    2008-01-01

    Studies of professional genres and professional practices are invariably seen as complementing each other, in that they not only influence each other but are often co-constructed in specific professional contexts. However, professional genres have often been analyzed in isolation, leaving the study of professional practice almost completely out,…

  9. Compassion fatigue within nursing practice: a concept analysis.

    PubMed

    Coetzee, Siedine Knobloch; Klopper, Hester C

    2010-06-01

    "Compassion fatigue" was first introduced in relation to the study of burnout among nurses, but it was never defined within this context; it has since been adopted as a synonym for secondary traumatic stress disorder, which is far removed from the original meaning of the term. The aim of the study was to define compassion fatigue within nursing practice. The method that was used in this article was concept analysis. The findings revealed several categories of compassion fatigue: risk factors, causes, process, and manifestations. The characteristics of each of these categories are specified and a connotative (theoretical) definition, model case, additional cases, empirical indicators, and a denotative (operational) definition are provided. Compassion fatigue progresses from a state of compassion discomfort to compassion stress and, finally, to compassion fatigue, which if not effaced in its early stages of compassion discomfort or compassion stress, can permanently alter the compassionate ability of the nurse. Recommendations for nursing practice, education, and research are discussed. PMID:20602697

  10. Model-Based Practice Analysis and Test Specifications.

    ERIC Educational Resources Information Center

    Kane, Michael

    1997-01-01

    Licensure and certification decisions are usually based on a chain of inference from results of a practice analysis to test specifications, the test, examinee performance, and a pass-fail decision. This article focuses on the design of practice analyses and translation of practice analyses results into test specifications. (SLD)

  11. Practical postcalibration uncertainty analysis: Yucca Mountain, Nevada.

    PubMed

    James, Scott C; Doherty, John E; Eddebbarh, Al-Aziz

    2009-01-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is "calibrated." Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the United States' proposed site for disposal of high-level radioactive waste. Linear and nonlinear uncertainty analyses are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction's uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This article applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. PMID:19744249
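The linear analysis described here rests on first-order propagation of parameter covariance into prediction variance. A minimal sketch of that step (the sensitivity vector and covariance matrix are made-up numbers; per-parameter uncertainty contributions fall out of the summand terms when the covariance is diagonal):

```python
import numpy as np

def fosm_prediction_variance(j_pred, param_cov):
    """First-order second-moment propagation: var(prediction) = J @ Sigma @ J
    for the prediction's parameter-sensitivity vector J and the posterior
    parameter covariance Sigma."""
    return float(j_pred @ param_cov @ j_pred)

# Hypothetical two-parameter model: prediction sensitivities and a diagonal
# posterior covariance (illustrative values only).
j_pred = np.array([2.0, 1.0])
param_cov = np.diag([0.25, 1.0])
var_pred = fosm_prediction_variance(j_pred, param_cov)

# With a diagonal covariance, each parameter's contribution to the
# prediction variance is J_i**2 * Sigma_ii, which is how the "contribution
# made by each parameter" described above can be tabulated.
contributions = j_pred**2 * np.diag(param_cov)
```

Repeating the calculation with a candidate observation folded into the covariance (which shrinks Sigma) gives the data-worth comparison described in the abstract.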

  12. Methods for genetic linkage analysis using trisomies

    SciTech Connect

    Feingold, E.; Lamb, N.E.; Sherman, S.L.

    1995-02-01

    Certain genetic disorders are rare in the general population, but more common in individuals with specific trisomies. Examples of this include leukemia and duodenal atresia in trisomy 21. This paper presents a linkage analysis method for using trisomic individuals to map genes for such traits. It is based on a very general gene-specific dosage model that posits that the trait is caused by specific effects of different alleles at one or a few loci and that duplicate copies of "susceptibility" alleles inherited from the nondisjoining parent give increased likelihood of having the trait. Our mapping method is similar to identity-by-descent-based mapping methods using affected relative pairs and also to methods for mapping recessive traits using inbred individuals by looking for markers with greater than expected homozygosity by descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected homozygosity in the chromosomes inherited from the nondisjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the trait gene, a confidence interval for that distance, and methods for computing power and sample sizes. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers and how to test candidate genes. 20 refs., 5 figs., 1 tab.
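The "greater than expected homozygosity" idea reduces, in its simplest form, to a one-sample proportion test at each marker. A sketch with a normal approximation; the counts and the null homozygosity probability are hypothetical figures, and this is only the flavor of the test, not the paper's full likelihood machinery:

```python
import math

def excess_homozygosity_z(n_homozygous, n_trisomic, p_expected):
    """Normal-approximation z statistic for excess homozygosity-by-descent at
    a marker among trisomic individuals; a large positive z suggests linkage
    to the trait locus. p_expected is the null probability that the two
    chromosomes inherited from the nondisjoining parent carry the same
    marker allele (a hypothetical figure here)."""
    mean = n_trisomic * p_expected
    sd = math.sqrt(n_trisomic * p_expected * (1.0 - p_expected))
    return (n_homozygous - mean) / sd

# 70 of 100 trisomic individuals homozygous at the marker vs 50 expected.
z = excess_homozygosity_z(70, 100, 0.5)  # z = 4.0
```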

  13. Practical methods to improve the development of computational software

    SciTech Connect

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-07-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960's, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best-practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)

  14. Practical Nursing. Ohio's Competency Analysis Profile.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Vocational Instructional Materials Lab.

    Developed through a modified DACUM (Developing a Curriculum) process involving business, industry, labor, and community agency representatives in Ohio, this document is a comprehensive and verified employer competency profile for practical nursing. The list contains units (with and without subunits), competencies, and competency builders that…

  15. [Embryo vitrification: French clinical practice analysis for BLEFCO].

    PubMed

    Hesters, L; Achour-Frydman, N; Mandelbaum, J; Levy, R

    2013-09-01

    Frozen-thawed embryo transfer is currently an important part of present-day assisted reproductive technology (ART), aiming at increasing the clinical pregnancy rate per oocyte retrieval. Although the slow-freezing method was the reference for two decades, recent years have witnessed an expansion of an ultrarapid cryopreservation method known as vitrification. Recently in France, vitrification has been authorized for cryopreserving human embryos. The BLEFCO consortium therefore performed a descriptive study using questionnaires to evaluate the state of vitrification in French clinical practice. Questionnaires were addressed to the 105 French centres of reproductive biology and 60 were fully completed. Data analysis revealed that the embryo survival rate, as well as the clinical pregnancy rate, increased with vitrification technology compared with the slow-freezing procedure. Overall, these preliminary data suggest that vitrification may improve ART outcomes by increasing the cumulative pregnancy rate per oocyte retrieval. PMID:23962680

  16. Method of multivariate spectral analysis

    DOEpatents

    Keenan, Michael R.; Kotula, Paul G.

    2004-01-06

    A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = CS^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
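A stripped-down sketch of the alternating least-squares factorization D ≈ CS^T on synthetic data. The patent's weighting step is omitted, and nonnegativity is imposed here by simple clipping after each unconstrained solve; that is our assumption for illustration, not necessarily the patented constraint handling:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic data: 3 nonnegative spectral components mixed at 200 pixels.
S_true = rng.random((50, 3))    # spectral shapes (channels x components)
C_true = rng.random((200, 3))   # concentrations (pixels x components)
D = C_true @ S_true.T           # data matrix, pixels x channels

def als(D, k, iters=200):
    """Alternating least squares for D ~ C @ S.T: solve for C holding S
    fixed, then for S holding C fixed, clipping negatives each pass."""
    init_rng = np.random.default_rng(0)
    S = init_rng.random((D.shape[1], k))
    for _ in range(iters):
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0.0, None)
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0.0, None)
    return C, S

C, S = als(D, 3)
rel_err = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
```

The recovered C and S reproduce D closely (up to the usual permutation and scaling ambiguity of such factorizations); inspecting their columns is the "determining the properties" step the abstract describes.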

  17. Practical analysis of welding processes using finite element analysis.

    SciTech Connect

    Cowles, J. H.; Dave, V. R.; Hartman, D. A.

    2001-01-01

    With advances in commercially available finite element software and computational capability, engineers can now model large-scale problems in mechanics, heat transfer, fluid flow, and electromagnetics as never before. With these enhancements in capability, it is increasingly tempting to include the fundamental process physics to help achieve greater accuracy (Refs. 1-7). While this goal is laudable, it adds complication and drives up cost and computational requirements. Practical analysis of welding relies on simplified user inputs to derive important relative trends in desired outputs such as residual stress or distortion due to changes in inputs like voltage, current, and travel speed. Welding is a complex three-dimensional phenomenon. The question becomes how much modeling detail is needed to accurately predict relative trends in distortion, residual stress, or weld cracking? In this work, a HAZ (Heat Affected Zone) weld-cracking problem was analyzed to rank two different welding cycles (weld speed varied) in terms of crack susceptibility. Figure 1 shows an aerospace casting GTA welded to a wrought skirt. The essentials of part geometry, welding process, and tooling were suitably captured to model the strain excursion in the HAZ over a crack-susceptible temperature range, and the weld cycles were suitably ranked. The main contribution of this work is the demonstration of a practical methodology by which engineering solutions to engineering problems may be obtained through weld modeling when time and resources are extremely limited. Typically, welding analysis suffers from the following unknowns: material properties over the entire temperature range, the heat-input source term, and environmental effects. Material properties of interest are conductivity, specific heat, latent heat, modulus, Poisson's ratio, yield strength, ultimate strength, and possible rate dependencies. Boundary conditions are conduction into fixturing, radiation and convection to the…

  18. Semen analysis: its place in modern reproductive medical practice.

    PubMed

    McLachlan, Robert I; Baker, H W Gordon; Clarke, Gary N; Harrison, Keith L; Matson, Phillip L; Holden, Carol A; de Kretser, David M

    2003-02-01

    Semen analysis is the most important laboratory investigation for men when assessing the infertile couple. Advances in in vitro fertilisation (IVF) techniques, particularly intracytoplasmic sperm injection (ICSI) involving the direct injection of a single spermatozoon into an egg, have not diminished the role of semen analysis in modern reproductive practice. Semen analysis is the most basic laboratory investigation undertaken and is descriptive in terms of semen volume, appearance, viscosity, sperm concentration, sperm motility and morphology. Since the results are used by clinicians to choose appropriate treatment options, a reliable service is imperative. It is crucial that the laboratory is experienced in the performance of semen analyses to ensure an accurate result. To ensure a quality semen analysis service, laboratories must participate in internal and external quality assurance activities, incorporate rigorous training protocols for technical staff and use reliable procedures. The World Health Organization laboratory manual for the examination of human semen and sperm-cervical mucus interaction clearly describes the variables that need to be assessed and the methods of analysis and quality assurance to be used. PMID:12701680

  19. Method of photon spectral analysis

    DOEpatents

    Gehrke, Robert J.; Putnam, Marie H.; Killian, E. Wayne; Helmer, Richard G.; Kynaston, Ronnie L.; Goodwin, Scott G.; Johnson, Larry O.

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and gamma-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV) as well as high-energy gamma rays (>1 MeV). An 8192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The gamma-ray portion of each spectrum is analyzed by a standard Ge gamma-ray analysis program. This method can be applied to any analysis involving x- and gamma-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the gamma-ray analysis and accommodated during the x-ray analysis.

  20. Method of photon spectral analysis

    DOEpatents

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and gamma-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) as well as high-energy gamma rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The gamma-ray portion of each spectrum is analyzed by a standard Ge gamma-ray analysis program. This method can be applied to any analysis involving x- and gamma-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the gamma-ray analysis and accommodated during the x-ray analysis.

  1. A practical method to evaluate radiofrequency exposure of mast workers.

    PubMed

    Alanko, Tommi; Hietanen, Maila

    2008-01-01

    Assessment of occupational exposure to radiofrequency (RF) fields in telecommunication transmitter masts is a challenging task. For conventional field strength measurements using manually operated instruments, it is difficult to document the locations of measurements while climbing up a mast. Logging RF dosemeters worn by the workers, on the other hand, do not give any information about the location of the exposure. In this study, a practical method was developed and applied to assess mast workers' exposure to RF fields and the corresponding location. This method uses a logging dosemeter for personal RF exposure evaluation and two logging barometers to determine the corresponding height of the worker's position on the mast. The procedure is not intended to be used for compliance assessments, but to indicate locations where stricter assessments are needed. The applicability of the method is demonstrated by making measurements in a TV and radio transmitting mast. PMID:19054796
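The pressure-to-height step of the logging-barometer idea can be sketched with the isothermal barometric formula. Constants and readings below are typical illustrative values, not the instrument calibration used in the study, and a few metres of accuracy is enough to locate a worker on a mast:

```python
import math

def height_from_pressure(p_hpa, p0_hpa, temp_c=15.0):
    """Isothermal barometric formula: height above the reference level where
    the pressure is p0_hpa. Uses the gas constant R, gravity g, and the
    molar mass of dry air M (standard textbook values)."""
    R, g, M = 8.314, 9.81, 0.0289644
    T = temp_c + 273.15
    return (R * T) / (g * M) * math.log(p0_hpa / p_hpa)

# A drop of roughly 12 hPa from a sea-level reference reading corresponds
# to roughly 100 m of climb.
h = height_from_pressure(1001.0, 1013.25)
```

In the study's setup a second, stationary barometer at the mast base supplies the reference pressure, cancelling weather-driven drift; time-aligning the dosemeter and barometer logs then pairs each exposure record with a height.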

  2. Landscape analysis: Theoretical considerations and practical needs

    USGS Publications Warehouse

    Godfrey, A.E.; Cleaves, E.T.

    1991-01-01

    Numerous systems of land classification have been proposed. Most have led directly to or have been driven by an author's philosophy of earth-forming processes. However, the practical need of classifying land for planning and management purposes requires that a system lead to predictions of the results of management activities. We propose a landscape classification system composed of 11 units, from realm (a continental mass) to feature (a splash impression). The classification concerns physical aspects rather than economic or social factors; and aims to merge land inventory with dynamic processes. Landscape units are organized using a hierarchical system so that information may be assembled and communicated at different levels of scale and abstraction. Our classification uses a geomorphic systems approach that emphasizes the geologic-geomorphic attributes of the units. Realm, major division, province, and section are formulated by subdividing large units into smaller ones. For the larger units we have followed Fenneman's delineations, which are well established in the North American literature. Areas and districts are aggregated into regions and regions into sections. Units smaller than areas have, in practice, been subdivided into zones and smaller units if required. We developed the theoretical framework embodied in this classification from practical applications aimed at land use planning and land management in Maryland (eastern Piedmont Province near Baltimore) and Utah (eastern Uinta Mountains). ?? 1991 Springer-Verlag New York Inc.

  3. Landscape analysis: Theoretical considerations and practical needs

    NASA Astrophysics Data System (ADS)

    Godfrey, Andrew E.; Cleaves, Emery T.

    1991-03-01

    Numerous systems of land classification have been proposed. Most have led directly to or have been driven by an author's philosophy of earth-forming processes. However, the practical need of classifying land for planning and management purposes requires that a system lead to predictions of the results of management activities. We propose a landscape classification system composed of 11 units, from realm (a continental mass) to feature (a splash impression). The classification concerns physical aspects rather than economic or social factors; and aims to merge land inventory with dynamic processes. Landscape units are organized using a hierarchical system so that information may be assembled and communicated at different levels of scale and abstraction. Our classification uses a geomorphic systems approach that emphasizes the geologic-geomorphic attributes of the units. Realm, major division, province, and section are formulated by subdividing large units into smaller ones. For the larger units we have followed Fenneman's delineations, which are well established in the North American literature. Areas and districts are aggregated into regions and regions into sections. Units smaller than areas have, in practice, been subdivided into zones and smaller units if required. We developed the theoretical framework embodied in this classification from practical applications aimed at land use planning and land management in Maryland (eastern Piedmont Province near Baltimore) and Utah (eastern Uinta Mountains).

  4. Monte Carlo methods in genetic analysis

    SciTech Connect

    Lin, Shili

    1996-12-31

    Many genetic analyses require computation of probabilities and likelihoods of pedigree data. With more and more genetic marker data deriving from new DNA technologies becoming available to researchers, exact computations are often formidable with standard statistical methods and computational algorithms. The desire to utilize as much available data as possible, coupled with complexities of realistic genetic models, push traditional approaches to their limits. These methods encounter severe methodological and computational challenges, even with the aid of advanced computing technology. Monte Carlo methods are therefore increasingly being explored as practical techniques for estimating these probabilities and likelihoods. This paper reviews the basic elements of the Markov chain Monte Carlo method and the method of sequential imputation, with an emphasis upon their applicability to genetic analysis. Three areas of applications are presented to demonstrate the versatility of Markov chain Monte Carlo for different types of genetic problems. A multilocus linkage analysis example is also presented to illustrate the sequential imputation method. Finally, important statistical issues of Markov chain Monte Carlo and sequential imputation, some of which are unique to genetic data, are discussed, and current solutions are outlined. 72 refs.
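The Markov chain Monte Carlo idea reviewed above can be illustrated with a toy example. The sketch below is not pedigree-specific and all numbers are invented: it uses a random-walk Metropolis sampler to estimate the posterior mean of a binomial success probability under a flat prior, a quantity whose analytic value is known and can be checked.

```python
import math
import random

def log_posterior(theta, successes, trials):
    # Binomial log-likelihood with a flat Beta(1, 1) prior; -inf outside (0, 1)
    if not 0.0 < theta < 1.0:
        return float("-inf")
    return successes * math.log(theta) + (trials - successes) * math.log(1.0 - theta)

def metropolis(successes, trials, n_samples=20000, step=0.1, seed=7):
    rng = random.Random(seed)
    theta, samples = 0.5, []
    for _ in range(n_samples):
        proposal = theta + rng.gauss(0.0, step)
        log_ratio = (log_posterior(proposal, successes, trials)
                     - log_posterior(theta, successes, trials))
        # Accept with probability min(1, posterior ratio)
        if log_ratio >= 0.0 or rng.random() < math.exp(log_ratio):
            theta = proposal
        samples.append(theta)
    return samples

samples = metropolis(successes=7, trials=10)
estimate = sum(samples[5000:]) / len(samples[5000:])  # discard burn-in
# Analytic posterior mean for Beta(8, 4) is 8/12, about 0.667
```

Real pedigree likelihoods replace the one-parameter posterior with a high-dimensional distribution over genotypes, but the accept/reject mechanics are the same.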

  5. Gaussian Weighted Trajectory Method. IV. No Rainbow Effect in Practice

    NASA Astrophysics Data System (ADS)

    Bonnet, L.

    2009-04-01

The Gaussian weighted trajectory method (GWTM) is a practical implementation of classical S-matrix theory (CSMT) in the random-phase approximation, CSMT being the first and simplest semiclassical approach to molecular collisions, developed in the early 1970s. Though very close in spirit to the purely classical description, GWTM accounts to some extent for the quantization of the different degrees of freedom involved in the processes. While CSMT may give diverging final-state distributions, related to the rainbow effect of elastic scattering theory, GWTM has never led to such a mathematical catastrophe. The goal of the present note is to explain this finding.

  6. Development of a practical costing method for hospitals.

    PubMed

    Cao, Pengyu; Toyabe, Shin-Ichi; Akazawa, Kouhei

    2006-03-01

To realize effective cost control, a practical and accurate cost accounting system is indispensable in hospitals. Among traditional cost accounting systems, volume-based costing (VBC) is the most popular method. In this method, indirect costs are allocated to each cost object (services or units of a hospital) using a single indicator called a cost driver (e.g., labor hours, revenues, or the number of patients). However, this method often yields rough and inaccurate results. The activity-based costing (ABC) method, introduced in the mid-1990s, can provide more accurate results. With the ABC method, all events or transactions that cause costs are recognized as "activities", and a specific cost driver is prepared for each activity. Finally, the costs of activities are allocated to cost objects by the corresponding cost drivers. However, ABC is much more complex and costly than other traditional cost accounting methods because the data collection for cost drivers is not always easy. In this study, we developed a simplified ABC (S-ABC) costing method to reduce the workload of ABC costing by reducing the number of cost drivers used in the ABC method. Using the S-ABC method, we estimated the cost of laboratory tests and obtained results similar in accuracy to those of the ABC method (the largest difference was 2.64%), while reducing the seven cost drivers used in the ABC method to four. Moreover, we performed an evaluation using other sample data from the physiological laboratory department to confirm the effectiveness of the new method. In conclusion, the S-ABC method provides two advantages over the VBC and ABC methods: (1) it obtains accurate results, and (2) it is simpler to perform. Once the number of cost drivers has been reduced by applying the proposed S-ABC method to the data for the ABC method, the cost accounting can easily be performed with few cost drivers from the second round of costing onward. PMID
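The driver-based allocation at the heart of ABC-style costing can be sketched as follows. The activity names, costs, and driver volumes below are entirely hypothetical; the point is only the mechanism of spreading each activity's indirect cost over cost objects in proportion to driver volume.

```python
# Hypothetical activities of a laboratory, their indirect costs, and
# cost-driver volumes for two cost objects (all names and numbers invented)
activities = {
    "specimen handling": {"cost": 6000.0, "drivers": {"test A": 300, "test B": 100}},
    "machine operation": {"cost": 9000.0, "drivers": {"test A": 150, "test B": 450}},
}

def abc_allocate(activities):
    """Allocate each activity's cost to cost objects in proportion to driver volume."""
    allocated = {}
    for activity in activities.values():
        rate = activity["cost"] / sum(activity["drivers"].values())  # cost per driver unit
        for obj, volume in activity["drivers"].items():
            allocated[obj] = allocated.get(obj, 0.0) + rate * volume
    return allocated

costs = abc_allocate(activities)
# test A: 6000*300/400 + 9000*150/600 = 4500 + 2250 = 6750
```

A simplified scheme in the spirit of S-ABC would merge activities that share a driver, shrinking the `activities` table while leaving the allocation loop unchanged.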

  7. Methods for genetic linkage analysis using trisomies

    SciTech Connect

    Feingold, E.; Lamb, N.E.; Sherman, S.L.

    1994-09-01

Certain genetic disorders (e.g., congenital cataracts, duodenal atresia) are rare in the general population but more common in people with Down's syndrome. We present a method for using individuals with trisomy 21 to map genes for such traits. Our methods are analogous to methods for mapping autosomal dominant traits using affected relative pairs by looking for markers with greater-than-expected identity by descent. In the trisomy case, one takes trisomic individuals and looks for markers with greater-than-expected reduction to homozygosity in the chromosomes inherited from the non-disjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the gene, a confidence interval for that distance, and methods for computing power and sample sizes. The methods are described in the context of a gene-dosage model for the etiology of the disorder, but can be extended to other models. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers, how to test candidate genes, and how to handle the effect of reduced recombination associated with maternal meiosis I non-disjunction.

  8. Effective methods for disseminating research findings to nurses in practice.

    PubMed

    Cronenwett, L R

    1995-09-01

    Professionals in all disciplines are challenged by the proliferation of new knowledge. Nurses, too, must find cost-effective ways of ensuring that their patients are benefiting from the most current knowledge about health and illness. The methods of research dissemination to clinicians described in this article are presumed to be effective because of anecdotal reports, conference evaluations, or clinician surveys. The profession needs more sophisticated evaluations of the effectiveness of various dissemination methods. In the meantime, whether you are a researcher, an administrator, an educator, or a clinician, you have a role to play in improving research dissemination. Implement just one strategy from this article and evaluate the results. Each contribution moves nursing toward research-based practice. PMID:7567569

  9. The 1999 IDEA Regulations: A Practical Analysis.

    ERIC Educational Resources Information Center

    Borreca, Christopher P.; Goldman, Teri B.; Horton, Janet L.; Mehfoud, Kathleen; Rodick, Bennett; Weatherly, Julie J.; Wenkart, Ronald D.; Wynn, Deryl W.

    This publication explains how some of the more significant 1999 regulations of the Individuals with Disabilities Education Act will affect schools providing services to children in need of special education. The analysis tracks the regulatory format used to organize the rules under Title 34 of the Code of Federal Regulations. The selected…

  10. Comparative Lifecycle Energy Analysis: Theory and Practice.

    ERIC Educational Resources Information Center

    Morris, Jeffrey; Canzoneri, Diana

    1992-01-01

    Explores the position that more energy is conserved through recycling secondary materials than is generated from municipal solid waste incineration. Discusses one component of a lifecycle analysis--a comparison of energy requirements for manufacturing competing products. Includes methodological issues, energy cost estimates, and difficulties…

  11. Translational Behavior Analysis and Practical Benefits

    ERIC Educational Resources Information Center

    Pilgrim, Carol

    2011-01-01

    In his article, Critchfield ("Translational Contributions of the Experimental Analysis of Behavior," "The Behavior Analyst," v34, p3-17, 2011) summarizes a previous call (Mace & Critchfield, 2010) for basic scientists to reexamine the inspiration for their research and turn increasingly to translational approaches. Interestingly, rather than…

  12. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, Amy C.

    1993-01-01

    An apparatus and method for electrochemical analysis of elements in solution. An auxiliary electrode 14, a reference electrode 18, and five working electrodes 20, 22, 26, 28, and 30 are positioned in a container 12 containing a sample solution 34. The working electrodes are spaced apart evenly from each other and auxiliary electrode 14 to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between auxiliary electrode 14 and each of the working electrodes 20, 22, 26, 28, and 30. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in sample solution 34 and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  13. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, A.C.

    1993-06-08

    An apparatus and method is described for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced apart evenly from each other and the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  14. Correlation method of electrocardiogram analysis

    NASA Astrophysics Data System (ADS)

    Strinadko, Marina M.; Timochko, Katerina B.

    2002-02-01

    The electrocardiograph method is the informational source for functional heart state characteristics. The electrocardiogram parameters are the integrated map of many component characteristics of the heart system and depend on disturbance requirements of each device. In the research work the attempt of making the skeleton diagram of perturbation of the heart system is made by the characteristic description of its basic components and connections between them through transition functions, which are written down by the differential equations of the first and second order with the purpose to build-up and analyze electrocardiogram. Noting the vector character of perturbation and the various position of heart in each organism, we offer own coordinate system connected with heart. The comparative analysis of electrocardiogram was conducted with the usage of correlation method.
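The correlation step of such a comparative analysis reduces, in its simplest form, to a Pearson correlation coefficient between two equally sampled traces. The sketch below uses invented sample values, not real ECG data; note that the coefficient is insensitive to a constant baseline offset between the two signals.

```python
def pearson(x, y):
    # Pearson correlation coefficient between two equally sampled signals
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Two hypothetical, crudely sampled beat shapes
beat = [0.0, 0.2, 1.1, 0.3, -0.4, 0.1, 0.0]
shifted = [v + 0.5 for v in beat]   # same shape, different baseline
r = pearson(beat, shifted)          # close to 1: offset does not affect correlation
```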

  15. Progress testing: critical analysis and suggested practices.

    PubMed

    Albanese, Mark; Case, Susan M

    2016-03-01

    Educators have long lamented the tendency of students to engage in rote memorization in preparation for tests rather than engaging in deep learning where they attempt to gain meaning from their studies. Rote memorization driven by objective exams has been termed a steering effect. Progress testing (PT), in which a comprehensive examination sampling all of medicine is administered repeatedly throughout the entire curriculum, was developed with the stated aim of breaking the steering effect of examinations and of promoting deep learning. PT is an approach historically linked to problem-based learning (PBL) although there is a growing recognition of its applicability more broadly. The purpose of this article is to summarize the salient features of PT drawn from the literature, provide a critical review of these features based upon the same literature and psychometric considerations drawn from the Standards for Educational and Psychological Testing and provide considerations of what should be part of best practices in applying PT from an evidence-based and a psychometric perspective. PMID:25662873

  16. Comparison of four teaching methods on Evidence-based Practice skills of postgraduate nursing students.

    PubMed

    Fernandez, Ritin S; Tran, Duong Thuy; Ramjan, Lucie; Ho, Carey; Gill, Betty

    2014-01-01

    The aim of this study was to compare four teaching methods on the evidence-based practice knowledge and skills of postgraduate nursing students. Students enrolled in the Evidence-based Nursing (EBN) unit in Australia and Hong Kong in 2010 and 2011 received education via either the standard distance teaching method, computer laboratory teaching method, Evidence-based Practice-Digital Video Disc (EBP-DVD) teaching method or the didactic classroom teaching method. Evidence-based Practice (EBP) knowledge and skills were evaluated using student assignments that comprised validated instruments. One-way analysis of covariance was implemented to assess group differences on outcomes after controlling for the effects of age and grade point average (GPA). Data were obtained from 187 students. The crude mean score among students receiving the standard+DVD method of instruction was higher for developing a precise clinical question (8.1±0.8) and identifying the level of evidence (4.6±0.7) compared to those receiving other teaching methods. These differences were statistically significant after controlling for age and grade point average. Significant improvement in cognitive and technical EBP skills can be achieved for postgraduate nursing students by integrating a DVD as part of the EBP teaching resources. The EBP-DVD is an easy teaching method to improve student learning outcomes and ensure that external students receive equivalent and quality learning experiences. PMID:23107585

  17. SAR/QSAR methods in public health practice

    SciTech Connect

Demchuk, Eugene; Ruiz, Patricia; Chou, Selene; Fowler, Bruce A.

    2011-07-15

    Methods of (Quantitative) Structure-Activity Relationship ((Q)SAR) modeling play an important and active role in ATSDR programs in support of the Agency mission to protect human populations from exposure to environmental contaminants. They are used for cross-chemical extrapolation to complement the traditional toxicological approach when chemical-specific information is unavailable. SAR and QSAR methods are used to investigate adverse health effects and exposure levels, bioavailability, and pharmacokinetic properties of hazardous chemical compounds. They are applied as a part of an integrated systematic approach in the development of Health Guidance Values (HGVs), such as ATSDR Minimal Risk Levels, which are used to protect populations exposed to toxic chemicals at hazardous waste sites. (Q)SAR analyses are incorporated into ATSDR documents (such as the toxicological profiles and chemical-specific health consultations) to support environmental health assessments, prioritization of environmental chemical hazards, and to improve study design, when filling the priority data needs (PDNs) as mandated by Congress, in instances when experimental information is insufficient. These cases are illustrated by several examples, which explain how ATSDR applies (Q)SAR methods in public health practice.

  18. A practical approach for linearity assessment of calibration curves under the International Union of Pure and Applied Chemistry (IUPAC) guidelines for an in-house validation of method of analysis.

    PubMed

    Sanagi, M Marsin; Nasir, Zalilah; Ling, Susie Lu; Hermawan, Dadan; Ibrahim, Wan Aini Wan; Naim, Ahmedy Abu

    2010-01-01

    Linearity assessment as required in method validation has always been subject to different interpretations and definitions by various guidelines and protocols. However, there are very limited applicable implementation procedures that can be followed by a laboratory chemist in assessing linearity. Thus, this work proposes a simple method for linearity assessment in method validation by a regression analysis that covers experimental design, estimation of the parameters, outlier treatment, and evaluation of the assumptions according to the International Union of Pure and Applied Chemistry guidelines. The suitability of this procedure was demonstrated by its application to an in-house validation for the determination of plasticizers in plastic food packaging by GC. PMID:20922968
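The core computation in a regression-based linearity assessment is an ordinary least-squares fit of the calibration line plus a goodness-of-fit summary. The sketch below uses invented calibration data, and the coefficient of determination shown is only a first screen; the IUPAC-style procedure in the paper also covers residual diagnostics and outlier treatment, which are omitted here.

```python
def linear_fit(x, y):
    # Ordinary least-squares fit of a calibration line y = intercept + slope * x
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    r2 = 1.0 - ss_res / ss_tot  # necessary but not sufficient evidence of linearity
    return slope, intercept, r2

# Hypothetical calibration: concentrations (e.g. mg/L) vs. instrument response
conc = [1.0, 2.0, 3.0, 4.0, 5.0]
resp = [3.0, 5.1, 6.9, 9.0, 11.0]
slope, intercept, r2 = linear_fit(conc, resp)
```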

  19. Assessing methods for measurement of clinical outcomes and quality of care in primary care practices

    PubMed Central

    2012-01-01

Purpose To evaluate the appropriateness of potential data sources for the population of performance indicators for primary care (PC) practices. Methods This project was a cross-sectional study of 7 multidisciplinary primary care teams in Ontario, Canada. Practices were recruited, and 5-7 physicians per practice agreed to participate in the study. Patients of participating physicians (20-30 per physician) were recruited sequentially as they presented for a visit. Data collection included patient, provider and practice surveys, chart abstraction and linkage to administrative data sets. Matched-pairs analysis was used to examine the differences in the observed results for each indicator obtained using multiple data sources. Results Seven teams, 41 physicians, 94 associated staff and 998 patients were recruited. The survey response rate was 81% for patients, 93% for physicians and 83% for associated staff. Chart audits were successfully completed on all but 1 patient, and linkage to administrative data was successful for all subjects. There were significant differences noted between the data collection methods for many measures. No single method of data collection was best for all outcomes. For most measures of technical quality of care, chart audit was the most accurate method of data collection. Patient surveys were more accurate for immunizations, chronic disease advice/information dispensed, some general health promotion items and possibly for medication use. Administrative data appear useful for indicators including chronic disease diagnosis and osteoporosis/breast screening. Conclusions Multiple data collection methods are required for a comprehensive assessment of performance in primary care practices. The choice of which methods are best for any one particular study or quality improvement initiative requires careful consideration of the biases that each method might introduce into the results.
In this study, both patients and providers were willing to participate in and

  20. Methods for Cancer Epigenome Analysis

    PubMed Central

    Nagarajan, Raman P.; Fouse, Shaun D.; Bell, Robert J.A.; Costello, Joseph F.

    2014-01-01

    Accurate detection of epimutations in tumor cells is crucial for understanding the molecular pathogenesis of cancer. Alterations in DNA methylation in cancer are functionally important and clinically relevant, but even this well-studied area is continually re-evaluated in light of unanticipated results, including a strong connection between aberrant DNA methylation in adult tumors and polycomb group profiles in embryonic stem cells, cancer-associated genetic mutations in epigenetic regulators such as DNMT3A and TET family genes, and the discovery of abundant 5-hydroxymethylcytosine, a product of TET proteins acting on 5-methylcytosine, in human tissues. The abundance and distribution of covalent histone modifications in primary cancer tissues relative to normal cells is a largely uncharted area, although there is good evidence for a mechanistic role of cancer-specific alterations in epigenetic marks in tumor etiology, drug response and tumor progression. Meanwhile, the discovery of new epigenetic marks continues, and there are many useful methods for epigenome analysis applicable to primary tumor samples, in addition to cancer cell lines. For DNA methylation and hydroxymethylation, next-generation sequencing allows increasingly inexpensive and quantitative whole-genome profiling. Similarly, the refinement and maturation of chromatin immunoprecipitation with next-generation sequencing (ChIP-seq) has made possible genome-wide mapping of histone modifications, open chromatin and transcription factor binding sites. Computational tools have been developed apace with these epigenome methods to better enable the accuracy and interpretation of the data from the profiling methods. PMID:22956508

  1. Practical application of fault tree analysis

    SciTech Connect

    Prugh, R.W.

    1980-01-01

A detailed survey of standard and novel approaches to Fault Tree construction, based on recent developments at Du Pont, covers the effect-to-cause procedure for control systems as in process plants; the effect-to-cause procedure for processes; source-of-hazard analysis, as in pressure vessel rupture; use of the "fire triangle" in a Fault Tree; critical combinations of safeguard failures; action points for automatic or operator control of a process; situations involving hazardous reactant ratios; failure-initiating and failure-enabling events and intervention by the operator; "daisy-chain" hazards, e.g., in batch processes and ship accidents; combining batch and continuous operations in a Fault Tree; possible future structure-development procedures for fault-tree construction; and the use of quantitative results (calculated frequencies of Top-Event occurrence) to restructure the Fault Tree after improving the process to an acceptable risk level.

  2. Flow analysis system and method

    NASA Technical Reports Server (NTRS)

    Hill, Wayne S. (Inventor); Barck, Bruce N. (Inventor)

    1998-01-01

    A non-invasive flow analysis system and method wherein a sensor, such as an acoustic sensor, is coupled to a conduit for transmitting a signal which varies depending on the characteristics of the flow in the conduit. The signal is amplified and there is a filter, responsive to the sensor signal, and tuned to pass a narrow band of frequencies proximate the resonant frequency of the sensor. A demodulator generates an amplitude envelope of the filtered signal and a number of flow indicator quantities are calculated based on variations in amplitude of the amplitude envelope. A neural network, or its equivalent, is then used to determine the flow rate of the flow in the conduit based on the flow indicator quantities.

  3. Reducing alcohol consumption. Comparing three brief methods in family practice.

    PubMed Central

    McIntosh, M. C.; Leigh, G.; Baldwin, N. J.; Marmulak, J.

    1997-01-01

    OBJECTIVE: To compare the effects of three brief methods of reducing alcohol consumption among family practice patients. DESIGN: Patients randomly assigned to one of three interventions were assessed initially and at 3-, 6-, and 12-month follow-up appointments. SETTING: Family practice clinic composed of 12 primary care physicians seeing approximately 6000 adults monthly in a small urban community, population 40,000. PARTICIPANTS: Through a screening questionnaire, 134 men and 131 women were identified as hazardous drinkers (five or more drinks at least once monthly) during an 11-month screening of 1420 patients. Of 265 patients approached, 180 agreed to participate and 159 (83 men and 76 women) actually participated in the study. INTERVENTIONS: Three interventions were studied: brief physician advice (5 minutes), two 30-minute sessions with a physician using cognitive behavioural strategies or two 30-minute sessions with a nurse practitioner using identical strategies. MAIN OUTCOME MEASURES: Quantity and frequency (QF) of drinking were used to assess reduction in hazardous drinking and problems related to drinking over 12 months of follow up. RESULTS: No statistical difference between groups was found. The QF of monthly drinking was reduced overall by 66% (among men) and 74% (among women) for those reporting at least one hazardous drinking day weekly at assessment (N = 96). Men reported drinking significantly more than women. CONCLUSIONS: These results indicated that offering brief, specific advice can motivate patients to reduce their alcohol intake. There was no difference in effect between brief advice from their own physician or brief intervention by a physician or a nurse. PMID:9386883

  4. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  5. Trends in sensitivity analysis practice in the last decade.

    PubMed

    Ferretti, Federico; Saltelli, Andrea; Tarantola, Stefano

    2016-10-15

The majority of published sensitivity analyses (SAs) are either local or one-factor-at-a-time (OAT) analyses, relying on unjustified assumptions of model linearity and additivity. Global approaches to sensitivity analysis (GSA), which would obviate these shortcomings, are applied by a minority of researchers. By reviewing the academic literature on SA, we present a bibliometric analysis of the trends in different SA practices over the last decade. The review was conducted both on some top-ranking journals (Nature and Science) and through an extended analysis of Elsevier's Scopus database of scientific publications. After correcting for the global growth in publications, the number of papers performing a generic SA has notably increased over the last decade. Although OAT is still the most widely used technique in SA, there is a clear increase in the use of GSA, with a preference for regression- and variance-based techniques. Even after adjusting for the growth of publications in the modelling field alone, to which SA and GSA normally apply, the trend is confirmed. Data about regions of origin and disciplines are also briefly discussed. The results above are confirmed when zooming in on only the articles published in chemical modelling, a field historically proficient in the use of SA methods. PMID:26934843
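The contrast between OAT and variance-based GSA can be made concrete on a toy model. The brute-force sketch below (not taken from any of the reviewed papers) estimates first-order variance-based indices for a two-input model on a grid; an OAT sweep around a single nominal point would rank the inputs by local slopes only, whereas these indices account for variation over the whole input range.

```python
def first_order_indices(f, n=200):
    # Brute-force first-order (variance-based) sensitivity indices for a
    # two-input model with independent uniform inputs on [0, 1]
    grid = [(i + 0.5) / n for i in range(n)]            # midpoint grid
    ys = [[f(x1, x2) for x2 in grid] for x1 in grid]
    mean = sum(sum(row) for row in ys) / n ** 2
    var = sum((y - mean) ** 2 for row in ys for y in row) / n ** 2
    e_x1 = [sum(row) / n for row in ys]                                 # E[f | x1]
    e_x2 = [sum(ys[i][j] for i in range(n)) / n for j in range(n)]      # E[f | x2]
    s1 = sum((e - mean) ** 2 for e in e_x1) / n / var   # Var(E[f|x1]) / Var(f)
    s2 = sum((e - mean) ** 2 for e in e_x2) / n / var
    return s1, s2

# Toy nonlinear model f(x1, x2) = x1 + x2^2
# Analytic indices: S1 = (1/12)/(1/12 + 4/45), roughly 0.484 and 0.516
s1, s2 = first_order_indices(lambda x1, x2: x1 + x2 ** 2)
```

Production GSA studies would use Monte Carlo or Sobol sequence estimators rather than a full grid, which scales poorly beyond a few inputs.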

  6. Diagnostic Methods for Bile Acid Malabsorption in Clinical Practice

    PubMed Central

    Vijayvargiya, Priya; Camilleri, Michael; Shin, Andrea; Saenger, Amy

    2013-01-01

    Altered bile acid (BA) concentrations in the colon may cause diarrhea or constipation. BA malabsorption (BAM) accounts for >25% of patients with irritable bowel syndrome (IBS) with diarrhea and chronic diarrhea in Western countries. As BAM is increasingly recognized, proper diagnostic methods are desired in clinical practice to help direct the most effective treatment course for the chronic bowel dysfunction. This review appraises the methodology, advantages and disadvantages of 4 tools that directly measure BAM: 14C-glycocholate breath and stool test, 75Selenium HomotauroCholic Acid Test (SeHCAT), 7 α-hydroxy-4-cholesten-3-one (C4) and fecal BAs. 14C-glycocholate is a laborious test no longer widely utilized. 75SeHCAT is validated, but not available in the United States. Serum C4 is a simple, accurate method that is applicable to a majority of patients, but requires further clinical validation. Fecal measurements to quantify total and individual fecal BAs are technically cumbersome and not widely available. Regrettably, none of these tests are routinely available in the U.S., and a therapeutic trial with a BA binder is used as a surrogate for diagnosis of BAM. Recent data suggest there is an advantage to studying fecal excretion of the individual BAs and their role in BAM; this may constitute a significant advantage of the fecal BA method over the other tests. Fecal BA test could become a routine addition to fecal fat measurement in patients with unexplained diarrhea. In summary, availability determines the choice of test among C4, SeHCAT and fecal BA; more widespread availability of such tests would enhance clinical management of these patients. PMID:23644387

  7. Extended morphological processing: a practical method for automatic spot detection of biological markers from microscopic images

    PubMed Central

    2010-01-01

Background A reliable extraction technique for resolving multiple spots in light or electron microscopic images is essential in investigations of the spatial distribution and dynamics of specific proteins inside cells and tissues. Currently, automatic spot extraction and characterization in complex microscopic images poses many challenges to conventional image processing methods. Results A new method to extract closely located, small target spots from biological images is proposed. This method starts with a simple but practical operation based on the extended morphological top-hat transformation to subtract an uneven background. The core of our novel approach is the following: first, the original image is rotated in an arbitrary direction and each rotated image is opened with a single straight line-segment structuring element. Second, the opened images are unified and then subtracted from the original image. To evaluate these procedures, model images of simulated spots with closely located targets were created and the efficacy of our method was compared to that of conventional morphological filtering methods. The results showed the better performance of our method. Spots in real microscope images can also be quantified, confirming that the method is applicable in practice. Conclusions Our method achieved effective spot extraction under various image conditions, including aggregated target spots, poor signal-to-noise ratio, and large variations in the background intensity. Furthermore, it has no restrictions with respect to the shape of the extracted spots. The features of our method allow its broad application in biological and biomedical image information analysis. PMID:20615231
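The rotated line-opening idea can be sketched in a simplified two-orientation form. The paper unifies openings over many rotation angles; here, as an illustration only, horizontal and vertical length-3 line elements are used, out-of-bounds pixels are ignored at the borders, and images are plain nested lists. Elongated structures survive at least one opening, so the residue keeps only the small spots.

```python
def _line_filter(img, horizontal, length, op):
    # Grayscale erosion (op=min) or dilation (op=max) with a 1-D line element
    half = length // 2
    rows, cols = len(img), len(img[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            vals = []
            for k in range(-half, half + 1):
                rr, cc = (r, c + k) if horizontal else (r + k, c)
                if 0 <= rr < rows and 0 <= cc < cols:
                    vals.append(img[rr][cc])
            out[r][c] = op(vals)
    return out

def spot_residue(img, length=3):
    # Open (erode, then dilate) with a line element in two orientations,
    # take the pixelwise union of the openings, subtract from the original.
    openings = []
    for horizontal in (True, False):
        eroded = _line_filter(img, horizontal, length, min)
        openings.append(_line_filter(eroded, horizontal, length, max))
    rows, cols = len(img), len(img[0])
    union = [[max(o[r][c] for o in openings) for c in range(cols)] for r in range(rows)]
    return [[img[r][c] - union[r][c] for c in range(cols)] for r in range(rows)]

# A bright horizontal line (row 0) and an isolated bright spot at (2, 2):
image = [[5, 5, 5, 5, 5],
         [0, 0, 0, 0, 0],
         [0, 0, 10, 0, 0],
         [0, 0, 0, 0, 0],
         [0, 0, 0, 0, 0]]
residue = spot_residue(image)  # the line is suppressed; the spot remains
```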

  8. [Methodical approaches to usage of complex anthropometric methods in clinical practice].

    PubMed

    Bukavneva, N S; Pozdniakov, A L; Nikitiuk, D B

    2007-01-01

A new methodical approach to complex anthropometric study in clinical practice is proposed for evaluating nutritional state and for the diagnostics and assessment of the effectiveness of diet therapy in patients with alimentary-dependent pathology. The technique for measuring the body's volumetric dimensions, skinfold thickness by means of a caliper, and extremity diameters is described, allowing more precise data to be obtained during patient examinations. Formulas for calculating bone, muscle, and adipose mass are provided. PMID:18219935

  9. Practical Framework: Implementing OEE Method in Manufacturing Process Environment

    NASA Astrophysics Data System (ADS)

    Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.

    2016-02-01

A manufacturing process environment requires reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to operate and produce a quality product at its maximum designed capability. However, for various reasons, a machine is usually unable to achieve the desired performance. Since this shortfall affects the productivity of the system, a measurement technique should be applied. Overall Equipment Effectiveness (OEE) is a good method for measuring machine performance, and the reliable results it produces can then be used to propose suitable corrective actions. Many published papers discuss the purpose and benefits of OEE, covering the what and why factors; however, the how factor has not yet been addressed, especially the implementation of OEE in a manufacturing process environment. Thus, this paper presents a practical framework for implementing OEE, and a case study is discussed to explain each proposed step in detail. The proposed framework is beneficial to engineers, especially beginners, who want to start measuring machine performance and later improve it.
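The standard OEE calculation multiplies availability, performance, and quality rates. The sketch below uses invented shift figures purely for illustration and follows the common textbook definitions, not necessarily the exact formulation of the paper's framework.

```python
def oee(planned_time, downtime, ideal_cycle_time, total_count, good_count):
    # Overall Equipment Effectiveness = Availability x Performance x Quality
    run_time = planned_time - downtime
    availability = run_time / planned_time                   # share of planned time actually run
    performance = ideal_cycle_time * total_count / run_time  # actual vs. ideal production speed
    quality = good_count / total_count                       # share of good parts
    return availability * performance * quality

# Hypothetical shift: 480 planned minutes, 60 minutes of downtime,
# 1-minute ideal cycle, 350 parts produced, 330 of them good
value = oee(480.0, 60.0, 1.0, 350, 330)
# 0.875 * 0.833... * 0.943... = 0.6875
```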

  10. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed methods, applied to the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. PMID:26556680

  11. Practical Method for Transient Stability with Unbalanced Condition based on Symmetric Coordinates

    NASA Astrophysics Data System (ADS)

    Fujiwara, Shuhei; Kono, Yoshiyuki; Kitayama, Masashi; Goda, Tadahiro

Symmetric coordinates are a very popular method for modelling unbalanced faults in power system analysis. The method not only handles a single fault easily, but can also be extended to multiple faults. It is not, however, easy to model situations in which the unbalanced condition changes continuously, such as an SVC (Static Var Compensator) operating during an unbalanced fault, or an unbalanced nonlinear load. For these situations, we propose a practical multiple-fault calculation method based on symmetric coordinates that can handle such unbalanced conditions.
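The symmetric-coordinate (Fortescue) decomposition the method builds on can be sketched as follows; the paper's multiple-fault calculation itself is not shown, and the phasor values are illustrative:

```python
import cmath

def symmetric_components(va, vb, vc):
    """Fortescue transform: decompose three phase phasors into
    zero-, positive-, and negative-sequence components."""
    a = cmath.exp(2j * cmath.pi / 3)   # 120-degree rotation operator
    v0 = (va + vb + vc) / 3
    v1 = (va + a * vb + a**2 * vc) / 3
    v2 = (va + a**2 * vb + a * vc) / 3
    return v0, v1, v2

# A balanced set maps entirely onto the positive sequence:
a = cmath.exp(2j * cmath.pi / 3)
v0, v1, v2 = symmetric_components(1, a**2, a)   # Va=1∠0°, Vb=1∠-120°, Vc=1∠120°
print(abs(v0), abs(v1), abs(v2))  # ≈ 0, 1, 0
```

An unbalanced fault shows up as nonzero negative- and/or zero-sequence components, which is what makes the decomposition convenient for fault analysis.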

  12. Practical limitations of the slope assisted BOTDA method in dynamic strain sensing

    NASA Astrophysics Data System (ADS)

    Minardo, A.; Catalano, E.; Zeni, L.

    2016-05-01

Analysis of the operation of the slope assisted Brillouin Optical Time-Domain Analysis (BOTDA) method shows that the acquisition rate is practically limited by two fundamental factors: the polarization scrambling frequency and the phase noise of the laser. As regards polarization scrambling, we show experimentally that the scrambling frequency poses a limit on the maximum acquisition rate for a given averaging factor. As regards phase noise, we show numerically and experimentally that the slope assisted method is particularly sensitive to laser phase noise, due to the specific positioning of the pump-probe frequency shift on the Brillouin Gain Spectrum (BGS).

  13. Systems analysis and design methodologies: practicalities and use in today's information systems development efforts.

    PubMed

    Jerva, M

    2001-05-01

    Historically, systems analysis and design methodologies have been used as a guide in software development. Such methods provide structure to software engineers in their efforts to create quality solutions in the real world of information systems. This article looks at the elements that constitute a systems analysis methodology and examines the historical development of systems analysis in software development. It concludes with observations on the strengths and weaknesses of four methodologies and the state of the art of practice today. PMID:11378979

  14. Probabilistic methods for structural response analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Burnside, O. H.; Cruse, T. A.

    1988-01-01

    This paper addresses current work to develop probabilistic structural analysis methods for integration with a specially developed probabilistic finite element code. The goal is to establish distribution functions for the structural responses of stochastic structures under uncertain loadings. Several probabilistic analysis methods are proposed covering efficient structural probabilistic analysis methods, correlated random variables, and response of linear system under stationary random loading.

  15. Report on the National Art Therapy Practice Analysis Survey.

    ERIC Educational Resources Information Center

    Knapp, Joan E.; And Others

    1994-01-01

Art therapy practice analysis surveys were completed by 1,125 Registered Art Therapists. Most respondents were female, Caucasian, and graduates of master's degree programs in art therapy. Respondents rated "creating a therapeutic environment" as the most important major responsibility of entry-level art therapists. (NB)

  16. [Practice analysis: culture shock and adaptation at work].

    PubMed

    Philippe, Séverine; Didry, Pascale

    2015-12-01

    Constructed as a practice analysis, this personal account presents the reflection undertaken by a student on placement in Ireland thanks to the Erasmus programme. She describes in detail the stages of her adaptation in a hospital setting which is considerably different to her usual environment. PMID:26654501

  17. A Model of Practice in Special Education: Dynamic Ecological Analysis

    ERIC Educational Resources Information Center

    Hannant, Barbara; Lim, Eng Leong; McAllum, Ruth

    2010-01-01

Dynamic Ecological Analysis (DEA) is a model of practice which increases a team's efficacy by enabling the development of more effective interventions through collaboration and collective reflection. This process has proved to be useful in: a) clarifying thinking and problem-solving, b) transferring knowledge and thinking to significant parties,…

  18. Procedural Fidelity: An Analysis of Measurement and Reporting Practices

    ERIC Educational Resources Information Center

    Ledford, Jennifer R.; Wolery, Mark

    2013-01-01

    A systematic analysis was conducted of measurement and reporting practices related to procedural fidelity in single-case research for the past 30 years. Previous reviews of fidelity primarily reported whether fidelity data were collected by authors; these reviews reported that collection was variable, but low across journals and over time. Results…

  19. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    ERIC Educational Resources Information Center

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we…

  20. Perceptions of Weight and Health Practices in Hispanic Children: A Mixed-Methods Study

    PubMed Central

    Foster, Byron Alexander; Hale, Daniel

    2015-01-01

Background. Perception of weight by parents of obese children may be associated with willingness to engage in behavior change. The relationship between parents' perception of their child's weight and their health beliefs and practices is poorly understood, especially among the Hispanic population, which experiences disparities in childhood obesity. This study sought to explore the relationship between perceptions of weight and health beliefs and practices in a Hispanic population. Methods. A cross-sectional, mixed-methods approach was used, with semistructured interviews conducted with parent-child (2–5 years old) dyads in a primarily Hispanic, low-income population. Parents were queried on their perceptions of their child's health, health practices, activities, behaviors, and beliefs. A grounded theory approach was used to analyze participants' discussion of health practices and behaviors. Results. Forty parent-child dyads completed the interview. Most (58%) of the parents of overweight and obese children misclassified their child's weight status. The qualitative analysis showed that accurate perception of weight was associated with internal motivation and more concrete ideas of what healthy meant for their child. Conclusions. The qualitative data suggest there may be populations at different stages of readiness for change among parents of overweight and obese children; incorporating this understanding should be considered when designing interventions. PMID:26379715

  1. A practical and sensitive method of quantitating lymphangiogenesis in vivo.

    PubMed

    Majumder, Mousumi; Xin, Xiping; Lala, Peeyush K

    2013-07-01

To address the inadequacy of current assays, we developed a directed in vivo lymphangiogenesis assay (DIVLA) by modifying an established directed in vivo angiogenesis assay. Silicon tubes (angioreactors) were implanted in the dorsal flanks of nude mice. Tubes contained either growth factor-reduced basement membrane extract (BME) alone (negative control), BME containing vascular endothelial growth factor (VEGF)-D (positive control for lymphangiogenesis) or FGF-2/VEGF-A (positive control for angiogenesis), a high VEGF-D-expressing breast cancer cell line MDA-MB-468LN (468-LN), or VEGF-D-silenced 468LN. Lymphangiogenesis was detected superficially with Evans Blue dye tracing and measured in the cellular contents of angioreactors by multiple approaches: lymphatic vessel endothelial hyaluronan receptor-1 (Lyve1) protein (immunofluorescence) and mRNA (qPCR) expression, and a visual scoring of lymphatic vs blood capillaries with dual Lyve1 (or PROX-1 or Podoplanin)/Cd31 immunostaining in cryosections. Lymphangiogenesis was absent with BME, high with VEGF-D or VEGF-D-producing 468LN cells, and low with VEGF-D-silenced 468LN. Angiogenesis was absent with BME, high with FGF-2/VEGF-A, moderate with 468LN or VEGF-D, and low with VEGF-D-silenced 468LN. The method was reproduced in a syngeneic murine C3L5 tumor model in C3H/HeJ mice with dual Lyve1/Cd31 immunostaining. Thus, DIVLA presents a practical and sensitive assay of lymphangiogenesis, validated with multiple approaches and markers. It is highly suited to identifying pro- and anti-lymphangiogenic agents, as well as shared or distinct mechanisms regulating lymphangiogenesis vs angiogenesis, and is widely applicable to research in vascular/tumor biology. PMID:23711825

  2. Practicing oncology in provincial Mexico: a narrative analysis.

    PubMed

    Hunt, L M

    1994-03-01

This paper examines the discourse of oncologists treating cancer in a provincial capital of southern Mexico. Based on an analysis of both formal interviews and observations of everyday clinical practice, it examines a set of narrative themes they used to maintain a sense of professionalism and possibility as they endeavored to apply a highly technologically dependent biomedical model in a resource-poor context. They moved between coexisting narrative frameworks as they addressed their formidable problems of translating between theory and practice. In a biomedical narrative frame, they drew on biomedical theory to produce a model of cellular dysfunction and of clinical intervention. However, limited availability of diagnostic and treatment techniques, and patients' inability or unwillingness to comply, presented serious constraints to the application of this model. They used a practical narrative frame to discuss the socio-economic issues they understood to be underlying these limitations to their clinical practice. They did not experience the incongruity between theory and practice as a continual challenge to their biomedical model, nor to their professional competency. Instead, through a reconciling narrative frame, they mediated this conflict. In this frame, they drew on culturally specific concepts of moral rightness and order to produce accounts that minimized the problem, exculpated themselves, and cast blame for failed diagnosis and treatment. By invoking these multiple, coexisting narrative themes, the oncologists sustained an open vision of their work in which deficiencies and impotency were vindicated and did not stand in the way of clinical practice. PMID:8184335

  3. Ethnographic Analysis of Instructional Method.

    ERIC Educational Resources Information Center

    Brooks, Douglas M.

    1980-01-01

    Instructional methods are operational exchanges between participants within environments that attempt to produce a learning outcome. The classroom teacher's ability to produce a learning outcome is the measure of instructional competence within that learning method. (JN)

  4. Methods of Building Cost Analysis.

    ERIC Educational Resources Information Center

    Building Research Inst., Inc., Washington, DC.

    Presentation of symposium papers includes--(1) a study describing techniques for economic analysis of building designs, (2) three case studies of analysis techniques, (3) procedures for measuring the area and volume of buildings, and (4) an open forum discussion. Case studies evaluate--(1) the thermal economics of building enclosures, (2) an…

  5. Methods of stability analysis in nonlinear mechanics

    SciTech Connect

    Warnock, R.L.; Ruth, R.D.; Gabella, W.; Ecklund, K.

    1989-01-01

We review our recent work on methods to study stability in nonlinear mechanics, especially for the problems of particle accelerators, and compare our ideas to those of other authors. We emphasize methods that (1) show promise as practical design tools, (2) are effective when the nonlinearity is large, and (3) have a strong theoretical basis. 24 refs., 2 figs., 2 tabs.

  6. A Meta-Analysis of Published School Social Work Practice Studies: 1980-2007

    ERIC Educational Resources Information Center

    Franklin, Cynthia; Kim, Johnny S.; Tripodi, Stephen J.

    2009-01-01

    Objective: This systematic review examined the effectiveness of school social work practices using meta-analytic techniques. Method: Hierarchical linear modeling software was used to calculate overall effect size estimates as well as test for between-study variability. Results: A total of 21 studies were included in the final analysis.…

  7. Adapting the Six Category Intervention Analysis To Promote Facilitative Type Supervisory Feedback in Teaching Practice.

    ERIC Educational Resources Information Center

    Hamid, Bahiyah Abdul; Azman, Hazita

A discussion of the supervision of preservice language teacher trainees focuses on supervisory methods designed to facilitate clear, useful, enabling feedback to the trainee. Specifically, it looks at use of the Six Category Intervention Analysis, a model for interpersonal skills training, for supervision of teaching practice. The model is seen here…

  8. Adapting Job Analysis Methodology to Improve Evaluation Practice

    ERIC Educational Resources Information Center

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  9. Content Analysis as a Best Practice in Technical Communication Research

    ERIC Educational Resources Information Center

    Thayer, Alexander; Evans, Mary; McBride, Alicia; Queen, Matt; Spyridakis, Jan

    2007-01-01

    Content analysis is a powerful empirical method for analyzing text, a method that technical communicators can use on the job and in their research. Content analysis can expose hidden connections among concepts, reveal relationships among ideas that initially seem unconnected, and inform the decision-making processes associated with many technical…

  10. Prognostic Analysis System and Methods of Operation

    NASA Technical Reports Server (NTRS)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical system health applicable to mechanical, electrical, chemical and optical systems and methods of operating the system are described herein.

  11. Coal Field Fire Fighting - Practiced methods, strategies and tactics

    NASA Astrophysics Data System (ADS)

    Wündrich, T.; Korten, A. A.; Barth, U. H.

    2009-04-01

achieved. For effective and efficient fire fighting, optimal tactics are required; these can be divided into four fundamental tactics for controlling fire hazards: - Defense (digging away the coal so that it cannot begin to burn, or forming a barrier so that the fire cannot reach the unburnt coal), - Rescue the coal (mining coal from a seam that is not burning), - Attack (active and direct cooling of the burning seam), - Retreat (monitoring only, until self-extinction of the burning seam). The last is used when a fire exceeds the organizational and/or technical scope of a mission. In other words, "to control a coal fire" does not automatically and in all situations mean "to extinguish a coal fire". Best-practice tactics, or a combination of them, can be selected to control a particular coal fire. For the extinguishing work, different extinguishing agents are available. They can be applied by different application techniques and with varying operating expenses. One application method may be the drilling of boreholes from the surface, or covering the surface with low-permeability soils. The main extinguishing agents used for coal field fires are: water (with or without additives), slurry, foaming mud/slurry, inert gases, dry chemicals and materials, and cryogenic agents. Because of its tremendous dimension and complexity, the worldwide challenge of coal fires is absolutely unique; it can only be solved with functional application methods, best-fitting strategies and tactics, organisation and research, as well as the dedication of the involved fire fighters, who work under extreme individual risk on the burning coal fields.

  12. Practical Aspects of the Equation-Error Method for Aircraft Parameter Estimation

    NASA Technical Reports Server (NTRS)

Morelli, Eugene A.

    2006-01-01

    Various practical aspects of the equation-error approach to aircraft parameter estimation were examined. The analysis was based on simulated flight data from an F-16 nonlinear simulation, with realistic noise sequences added to the computed aircraft responses. This approach exposes issues related to the parameter estimation techniques and results, because the true parameter values are known for simulation data. The issues studied include differentiating noisy time series, maximum likelihood parameter estimation, biases in equation-error parameter estimates, accurate computation of estimated parameter error bounds, comparisons of equation-error parameter estimates with output-error parameter estimates, analyzing data from multiple maneuvers, data collinearity, and frequency-domain methods.

  13. Convergence analysis of combinations of different methods

    SciTech Connect

    Kang, Y.

    1994-12-31

    This paper provides a convergence analysis for combinations of different numerical methods for solving systems of differential equations. The author proves that combinations of two convergent linear multistep methods or Runge-Kutta methods produce a new convergent method of which the order is equal to the smaller order of the two original methods.
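The stated result (the combined method inherits the smaller of the two orders) can be illustrated numerically. The sketch below is not from the paper: it alternates Euler (order 1) and Heun (order 2) steps on the test equation y' = y and estimates the observed convergence order, which comes out close to 1, the smaller order. The choice of test equation and step counts is illustrative:

```python
import math

def euler_step(f, t, y, h):
    # first-order explicit Euler step
    return y + h * f(t, y)

def heun_step(f, t, y, h):
    # second-order Heun (explicit trapezoidal) step
    k1 = f(t, y)
    k2 = f(t + h, y + h * k1)
    return y + h * (k1 + k2) / 2

def combined_solve(f, y0, t_end, n):
    """Alternate Euler and Heun steps over n steps on [0, t_end]."""
    h, y = t_end / n, y0
    for i in range(n):
        step = euler_step if i % 2 == 0 else heun_step
        y = step(f, i * h, y, h)
    return y

f = lambda t, y: y                                  # y' = y, exact solution e^t
err = lambda n: abs(combined_solve(f, 1.0, 1.0, n) - math.e)
order = math.log2(err(200) / err(400))              # observed convergence order
print(round(order, 2))  # close to 1, the smaller of the two orders
```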

  14. Efficient methods and practical guidelines for simulating isotope effects.

    PubMed

    Ceriotti, Michele; Markland, Thomas E

    2013-01-01

    The shift in chemical equilibria due to isotope substitution is frequently exploited to obtain insight into a wide variety of chemical and physical processes. It is a purely quantum mechanical effect, which can be computed exactly using simulations based on the path integral formalism. Here we discuss how these techniques can be made dramatically more efficient, and how they ultimately outperform quasi-harmonic approximations to treat quantum liquids not only in terms of accuracy, but also in terms of computational cost. To achieve this goal we introduce path integral quantum mechanics estimators based on free energy perturbation, which enable the evaluation of isotope effects using only a single path integral molecular dynamics trajectory of the naturally abundant isotope. We use as an example the calculation of the free energy change associated with H/D and (16)O/(18)O substitutions in liquid water, and of the fractionation of those isotopes between the liquid and the vapor phase. In doing so, we demonstrate and discuss quantitatively the relative benefits of each approach, thereby providing a set of guidelines that should facilitate the choice of the most appropriate method in different, commonly encountered scenarios. The efficiency of the estimators we introduce and the analysis that we perform should in particular facilitate accurate ab initio calculation of isotope effects in condensed phase systems. PMID:23298033

  15. Measurement Practices: Methods for Developing Content-Valid Student Examinations.

    ERIC Educational Resources Information Center

    Bridge, Patrick D.; Musial, Joseph; Frank, Robert; Roe, Thomas; Sawilowsky, Shlomo

    2003-01-01

    Reviews the fundamental principles associated with achieving a high level of content validity when developing tests for students. Suggests that the short-term efforts necessary to develop and integrate measurement theory into practice will lead to long-term gains for students, faculty, and academic institutions. (Includes 21 references.)…

  16. Learning by the Case Method: Practical Approaches for Community Leaders.

    ERIC Educational Resources Information Center

    Stenzel, Anne K.; Feeney, Helen M.

    This supplement to Volunteer Training and Development: A Manual for Community Groups, provides practical guidance in the selection, writing, and adaptation of effective case materials for specific educational objectives, and develops suitable cases for use by analyzing concrete situations and by offering illustrations of various types. An…

  17. Parallel Processable Cryptographic Methods with Unbounded Practical Security.

    ERIC Educational Resources Information Center

    Rothstein, Jerome

    Addressing the problem of protecting confidential information and data stored in computer databases from access by unauthorized parties, this paper details coding schemes which present such astronomical work factors to potential code breakers that security breaches are hopeless in any practical sense. Two procedures which can be used to encode for…

  18. Convex geometry analysis method of hyperspectral data

    NASA Astrophysics Data System (ADS)

    Gong, Yanjun; Wang, XiChang; Qi, Hongxing; Yu, BingXi

    2003-06-01

We present a matrix expression of the convex geometry analysis method for hyperspectral data based on the linear mixing model, and establish a mathematical model of endmembers. A 30-band remote sensing image is used to test the model. The results of the analysis reveal that the method can resolve mixed-pixel problems. Targets smaller than a single ground-surface pixel can be identified by applying the method.
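The linear mixing model underlying such convex-geometry methods treats each pixel spectrum as a convex combination of endmember spectra. A minimal two-endmember unmixing sketch with hypothetical 4-band spectra (the paper's endmember-extraction step itself is not shown):

```python
def unmix_two(pixel, e1, e2):
    """Estimate the abundance x of endmember e1 in
    pixel = x*e1 + (1-x)*e2 (abundances sum to one) by least squares."""
    d = [a - b for a, b in zip(e1, e2)]        # e1 - e2
    r = [p - b for p, b in zip(pixel, e2)]     # pixel - e2
    x = sum(di * ri for di, ri in zip(d, r)) / sum(di * di for di in d)
    return min(max(x, 0.0), 1.0)               # clamp to the abundance simplex

veg  = [0.05, 0.08, 0.45, 0.50]   # hypothetical 4-band endmember spectra
soil = [0.20, 0.25, 0.30, 0.35]
mixed = [0.4 * v + 0.6 * s for v, s in zip(veg, soil)]
print(round(unmix_two(mixed, veg, soil), 2))  # → 0.4
```

Recovering a sub-pixel abundance in this way is what allows targets smaller than a single ground pixel to be detected.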

  19. Methods of DNA methylation analysis.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The purpose of this review was to provide guidance for investigators who are new to the field of DNA methylation analysis. Epigenetics is the study of mitotically heritable alterations in gene expression potential that are not mediated by changes in DNA sequence. Recently, it has become clear that n...

  20. Governance of professional nursing practice in a hospital setting: a mixed methods study

    PubMed Central

    dos Santos, José Luís Guedes; Erdmann, Alacoque Lorenzini

    2015-01-01

    Objective: to elaborate an interpretative model for the governance of professional nursing practice in a hospital setting. Method: a mixed methods study with concurrent triangulation strategy, using data from a cross-sectional study with 106 nurses and a Grounded Theory study with 63 participants. The quantitative data were collected through the Brazilian Nursing Work Index - Revised and underwent descriptive statistical analysis. Qualitative data were obtained from interviews and analyzed through initial, selective and focused coding. Results: based on the results obtained with the Brazilian Nursing Work Index - Revised, it is possible to state that nurses perceived that they had autonomy, control over the environment, good relationships with physicians and organizational support for nursing governance. The governance of the professional nursing practice is based on the management of nursing care and services carried out by the nurses. To perform these tasks, nurses aim to get around the constraints of the organizational support and develop management knowledge and skills. Conclusion: it is important to reorganize the structures and processes of nursing governance, especially the support provided by the organization for the management practices of nurses. PMID:26625992

  1. Hybrid methods for cybersecurity analysis :

    SciTech Connect

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  2. Practical design methods for barrier pillars. Information circular/1995

    SciTech Connect

    Koehler, J.R.; Tadolini, S.C.

    1995-11-01

    Effective barrier pillar design is essential for safe and productive underground coal mining. This U.S. Bureau of Mines report presents an overview of available barrier pillar design methodologies that incorporate sound engineering principles while remaining practical for everyday usage. Nomographs and examples are presented to assist in the determination of proper barrier pillar sizing. Additionally, performance evaluation techniques and criteria are included to assist in determining the effectiveness of selected barrier pillar configurations.

  3. Analysis methods for photovoltaic applications

    SciTech Connect

    1980-01-01

    Because photovoltaic power systems are being considered for an ever-widening range of applications, it is appropriate for system designers to have knowledge of and access to photovoltaic power systems simulation models and design tools. This brochure gives brief descriptions of a variety of such aids and was compiled after surveying both manufacturers and researchers. Services available through photovoltaic module manufacturers are outlined, and computer codes for systems analysis are briefly described. (WHK)

  4. A Monte Carlo method for combined segregation and linkage analysis.

    PubMed Central

    Guo, S W; Thompson, E A

    1992-01-01

    We introduce a Monte Carlo approach to combined segregation and linkage analysis of a quantitative trait observed in an extended pedigree. In conjunction with the Monte Carlo method of likelihood-ratio evaluation proposed by Thompson and Guo, the method provides for estimation and hypothesis testing. The greatest attraction of this approach is its ability to handle complex genetic models and large pedigrees. Two examples illustrate the practicality of the method. One is of simulated data on a large pedigree; the other is a reanalysis of published data previously analyzed by other methods. PMID:1415253

  5. A Mixed-Method Approach to Investigating the Adoption of Evidence-Based Pain Practices in Nursing Homes

    PubMed Central

    Ersek, Mary; Jablonski, Anita

    2014-01-01

    This mixed methods study examined perceived facilitators and obstacles to adopting evidence-based pain management protocols vis-a-vis documented practice changes that were measured using a chart audit tool. This analysis used data from a subgroup of four nursing homes that participated in a clinical trial. Focus group interviews with staff yielded qualitative data about perceived factors that affected their willingness and ability to use the protocols. Chart audits determined whether pain assessment and management practices changed over time in light of these reported facilitators and barriers. Reported facilitators included administrative support, staff consistency, and policy and procedure changes. Barriers were staff attitudes, regulatory issues, and provider mistrust of nurses’ judgment. Overall, staff reported improvements in pain practices. These reports were corroborated by modest but significant increases in adherence to recommended practices. Change in clinical practice is complex and requires attention to both structural and process aspects of care. PMID:24640959

  6. A mixed-methods approach to investigating the adoption of evidence-based pain practices in nursing homes.

    PubMed

    Ersek, Mary; Jablonski, Anita

    2014-07-01

    This mixed methods study examined perceived facilitators and obstacles to adopting evidence-based pain management protocols vis-a-vis documented practice changes that were measured using a chart audit tool. This analysis used data from a subgroup of four nursing homes that participated in a clinical trial. Focus group interviews with staff yielded qualitative data about perceived factors that affected their willingness and ability to use the protocols. Chart audits determined whether pain assessment and management practices changed over time in light of these reported facilitators and barriers. Reported facilitators included administrative support, staff consistency, and policy and procedure changes. Barriers were staff attitudes, regulatory issues, and provider mistrust of nurses' judgment. Overall, staff reported improvements in pain practices. These reports were corroborated by modest but significant increases in adherence to recommended practices. Change in clinical practice is complex and requires attention to both structural and process aspects of care. PMID:24640959

  7. Microparticle analysis system and method

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    2007-01-01

    A device for analyzing microparticles is provided which includes a chamber with an inlet and an outlet for respectively introducing and dispensing a flowing fluid comprising microparticles, a light source for providing light through the chamber and a photometer for measuring the intensity of light transmitted through individual microparticles. The device further includes an imaging system for acquiring images of the fluid. In some cases, the device may be configured to identify and determine a quantity of the microparticles within the fluid. Consequently, a method for identifying and tracking microparticles in motion is contemplated herein. The method involves flowing a fluid comprising microparticles in laminar motion through a chamber, transmitting light through the fluid, measuring the intensities of the light transmitted through the microparticles, imaging the fluid a plurality of times and comparing at least some of the intensities of light between different images of the fluid.

  8. Current status of methods for shielding analysis

    SciTech Connect

    Engle, W.W.

    1980-01-01

    Current methods used in shielding analysis and recent improvements in those methods are discussed. The status of methods development is discussed based on needs cited at the 1977 International Conference on Reactor Shielding. Additional areas where methods development is needed are discussed.

  9. Comparison of Manual Versus Automated Data Collection Method for an Evidence-Based Nursing Practice Study

    PubMed Central

    Byrne, M.D.; Jordan, T.R.; Welle, T.

    2013-01-01

    Objective The objective of this study was to investigate and improve the use of automated data collection procedures for nursing research and quality assurance. Methods A descriptive, correlational study analyzed 44 orthopedic surgical patients who were part of an evidence-based practice (EBP) project examining post-operative oxygen therapy at a Midwestern hospital. The automation work attempted to replicate a manually-collected data set from the EBP project. Results Automation was successful in replicating data collection for study data elements that were available in the clinical data repository. The automation procedures identified 32 “false negative” patients who met the inclusion criteria described in the EBP project but were not selected during the manual data collection. Automating data collection for certain data elements, such as oxygen saturation, proved challenging because of workflow and practice variations and the reliance on disparate sources for data abstraction. Automation also revealed instances of human error including computational and transcription errors as well as incomplete selection of eligible patients. Conclusion Automated data collection for analysis of nursing-specific phenomenon is potentially superior to manual data collection methods. Creation of automated reports and analysis may require initial up-front investment with collaboration between clinicians, researchers and information technology specialists who can manage the ambiguities and challenges of research and quality assurance work in healthcare. PMID:23650488

  10. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  11. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2012-05-29

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  12. [Dental hygiene indices for dental practice (methods and experiences)].

    PubMed

    Hiltbold, B

    1976-10-01

    An oral hygiene recording sheet for clinical practices with a dental hygienist is described. The recording sheets allow an easy and clear survey of the present oral hygiene status as well as progress or negligence in the performance of oral hygiene procedures. Several oral hygiene indices are described, three of which are recommended for routine examinations: The plaque index of Silness/Löe, the sulcus bleeding index of Mühlemann/Son and the calculus surface index of Ennever et al. The experience of a 3-year use of the oral hygiene recording sheets is described. PMID:1070805

  13. A Practical Guide to Interpretation of Large Collections of Incident Narratives Using the QUORUM Method

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W.

    1997-01-01

    Analysis of incident reports plays an important role in aviation safety. Typically, a narrative description, written by a participant, is a central part of an incident report. Because there are so many reports, and the narratives contain so much detail, it can be difficult to efficiently and effectively recognize patterns among them. Recognizing and addressing recurring problems, however, is vital to continuing safety in commercial aviation operations. A practical way to interpret large collections of incident narratives is to apply the QUORUM method of text analysis, modeling, and relevance ranking. In this paper, QUORUM text analysis and modeling are surveyed, and QUORUM relevance ranking is described in detail with many examples. The examples are based on several large collections of reports from the Aviation Safety Reporting System (ASRS) database, and a collection of news stories describing the disaster of TWA Flight 800, the Boeing 747 which exploded in mid-air and crashed near Long Island, New York, on July 17, 1996. Reader familiarity with this disaster should make the relevance-ranking examples more understandable. The ASRS examples illustrate the practical application of QUORUM relevance ranking.
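
    The relevance-ranking idea can be illustrated with a minimal stand-in. This is not the actual QUORUM algorithm (which builds relational models from term co-occurrence); it simply scores narratives against a query by word-frequency cosine similarity, with invented report texts.

```python
# Hedged sketch in the spirit of relevance ranking over incident
# narratives (not the QUORUM method itself): score each narrative
# against a query by cosine similarity of word-frequency vectors.
import math
from collections import Counter

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank(narratives, query):
    q = Counter(query.lower().split())
    scored = [(cosine(Counter(n.lower().split()), q), n) for n in narratives]
    return sorted(scored, reverse=True)

reports = [
    "engine fire warning during climb",
    "runway incursion during taxi",
    "fire warning light illuminated on approach",
]
for score, text in rank(reports, "fire warning"):
    print(f"{score:.2f}  {text}")
```

    Real QUORUM goes well beyond raw word counts; cosine over term frequencies is only the simplest possible stand-in for ranking narratives by relevance to a model.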

  14. Analysis of swimming performance: perceptions and practices of US-based swimming coaches.

    PubMed

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid

    2016-06-01

    In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using these methods at least monthly, with analyses being mainly qualitative in nature rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors such as sources of information on swimming performance and analysis and control over service provision are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies. PMID:26359951

  15. Canonical Correlation Analysis: An Explanation with Comments on Correct Practice.

    ERIC Educational Resources Information Center

    Thompson, Bruce

    This paper briefly explains the logic underlying the basic calculations employed in canonical correlation analysis. A small hypothetical data set is employed to illustrate that canonical correlation analysis subsumes both univariate and multivariate parametric methods. Several real data sets are employed to illustrate other themes. Three common…
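
    The claim that canonical correlation subsumes other parametric methods can be checked numerically: with a single variable on one side, the first canonical correlation equals the multiple correlation R from ordinary least-squares regression. A minimal sketch with arbitrary simulated data (all numbers below are invented):

```python
# Hedged numerical check: canonical correlation analysis reduces to
# multiple regression when one variable set is univariate.
import numpy as np

def canonical_correlations(X, Y):
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    Cxx, Cyy, Cxy = X.T @ X, Y.T @ Y, X.T @ Y
    # Whiten each set, then take singular values of the cross-covariance:
    # these singular values are the canonical correlations.
    Wx = np.linalg.inv(np.linalg.cholesky(Cxx))
    Wy = np.linalg.inv(np.linalg.cholesky(Cyy))
    return np.linalg.svd(Wx @ Cxy @ Wy.T, compute_uv=False)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -0.5, 0.2]) + rng.normal(size=50)
Y = y.reshape(-1, 1)

rho = canonical_correlations(X, Y)[0]
# Multiple correlation R from least-squares regression of y on X:
Xc, yc = X - X.mean(0), y - y.mean()
beta, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
R = np.corrcoef(Xc @ beta, yc)[0, 1]
print(np.isclose(rho, R))  # True: CCA reduces to regression here
```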

  16. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures.

    PubMed

    Rock, Adam J; Coventry, William L; Morgan, Methuen I; Loi, Natasha M

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology. PMID:27014147

  17. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures

    PubMed Central

    Rock, Adam J.; Coventry, William L.; Morgan, Methuen I.; Loi, Natasha M.

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology. PMID:27014147

  18. Schwarz Preconditioners for Krylov Methods: Theory and Practice

    SciTech Connect

    Szyld, Daniel B.

    2013-05-10

    Several numerical methods were produced and analyzed. The main thrust of the work relates to inexact Krylov subspace methods for the solution of linear systems of equations arising from the discretization of partial differential equations. These are iterative methods, i.e., an approximation is obtained and improved at each step. Usually, a matrix-vector product is needed at each iteration. In the inexact methods, this product (or the application of a preconditioner) can be done inexactly. Schwarz methods, based on domain decompositions, are excellent preconditioners for these systems. We contributed towards their understanding from an algebraic point of view, developed new ones, and studied their performance in the inexact setting. We also worked on combinatorial problems to help define the algebraic partition of the domains, with the needed overlap, as well as PDE-constrained optimization using the above-mentioned inexact Krylov subspace methods.
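
    A one-level additive Schwarz preconditioner can be sketched in a few lines. This is an assumed minimal setup, not code from the report: block-Jacobi (non-overlapping additive Schwarz) on two subdomains of a 1D Laplacian, applied inside a preconditioned Richardson iteration rather than a full Krylov method.

```python
# Minimal sketch (assumed setup): one-level additive Schwarz as a
# preconditioner, with exact local solves on two subdomains.
import numpy as np

n = 16
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # 1D Laplacian stencil
b = np.ones(n)

# Two non-overlapping subdomains; the preconditioner solves each local
# block exactly and sums the corrections (additive Schwarz).
doms = [np.arange(0, n // 2), np.arange(n // 2, n)]
local_inv = [np.linalg.inv(A[np.ix_(d, d)]) for d in doms]

def apply_M_inv(r):
    z = np.zeros_like(r)
    for d, Ainv in zip(doms, local_inv):
        z[d] += Ainv @ r[d]
    return z

x = np.zeros(n)
for _ in range(300):                 # preconditioned Richardson iteration
    x += apply_M_inv(b - A @ x)
print(np.allclose(A @ x, b))  # True once the iteration has converged
```

    In practice the same M⁻¹ application would be handed to GMRES or CG as a preconditioner, with overlap between subdomains; Richardson is used here only to keep the sketch self-contained.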

  19. Practical methods for detecting mendacity: a case study.

    PubMed

    Hirsch, A R; Wolf, C J

    2001-01-01

    This study demonstrates the concurrence between objective verbal and nonverbal signs and lying. President Clinton's Grand Jury Testimony of August 17, 1998, was examined for the presence of 23 clinically practical signs of dissimulation selected from 64 peer-reviewed articles and 20 books on mendacity. A segment of his testimony that was subsequently found to be false was compared with a control period during the same testimony (internal control). A fund-raising speech to a sympathetic crowd served as a second control (external control). The frequencies of the 23 signs in the mendacious speech were compared with their frequencies during the control periods, and the differences were analyzed for statistical significance. No clinical examination was performed nor diagnosis assigned. During the mendacious speech, the subject markedly increased the frequency of 20 out of 23 signs compared with their frequency during the fund-raising control speech (p < .0005). He increased the frequency of 19 signs compared with their frequency during the control period of the same testimony (p < .003). The 23 signs may be useful as indicators of the veracity of videotaped and scripted testimony. If these findings are confirmed through further testing, they could, with practice, be used by psychiatrists conducting interviews. PMID:11785615

  20. The Qualitative Method of Impact Analysis.

    ERIC Educational Resources Information Center

    Mohr, Lawrence B.

    1999-01-01

    Discusses qualitative methods of impact analysis and provides an introductory treatment of one such approach. Combines an awareness of an alternative causal epistemology with current knowledge of qualitative methods of data collection and measurement to produce an approach to the analysis of impacts. (SLD)

  1. Vibration analysis methods for piping

    NASA Astrophysics Data System (ADS)

    Gibert, R. J.

    1981-09-01

    Attention is given to flow vibrations in pipe flow induced by singularity points in the piping system. The types of pressure fluctuations induced by flow singularities are examined, including the intense wideband fluctuations immediately downstream of the singularity and the acoustic fluctuations encountered in the remainder of the circuit, and a theory of noise generation by unsteady flow in internal acoustics is developed. The response of the piping systems to the pressure fluctuations thus generated is considered, and the calculation of the modal characteristics of piping containing a dense fluid in order to obtain the system transfer function is discussed. The TEDEL program, which calculates the vibratory response of a structure composed of straight and curved pipes with variable mechanical characteristics forming a three-dimensional network by a finite element method, is then presented, and calculations of fluid-structural coupling in tubular networks are illustrated.

  2. Estimating free-living human energy expenditure: Practical aspects of the doubly labeled water method and its applications

    PubMed Central

    Ishikawa-Takata, Kazuko; Kim, Eunkyung; Kim, Jeonghyun; Yoon, Jinsook

    2014-01-01

    The accuracy and noninvasive nature of the doubly labeled water (DLW) method makes it ideal for the study of human energy metabolism in free-living conditions. However, the DLW method is not always practical in many developing and Asian countries because of the high costs of isotopes and equipment for isotope analysis as well as the expertise required for analysis. This review provides information about the theoretical background and practical aspects of the DLW method, including optimal dose, basic protocols of two- and multiple-point approaches, experimental procedures, and isotopic analysis. We also introduce applications of DLW data, such as determining the equations of estimated energy requirement and validation studies of energy intake. PMID:24944767
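
    The two-point approach can be sketched numerically. This is a hedged, simplified form, essentially the original Lifson-McClintock relation rCO2 ≈ (N/2)(kO − kH) with isotopic fractionation corrections ignored; all enrichment values, the body-water pool size, and the energy conversion factor below are invented for illustration.

```python
# Simplified two-point DLW sketch (not a validated protocol): estimate
# CO2 production from the difference in 18O and 2H elimination rates.
import math

def elimination_rate(enrich_start, enrich_end, days):
    """Isotope elimination rate k (per day) from two enrichment samples,
    assuming mono-exponential decline above baseline."""
    return (math.log(enrich_start) - math.log(enrich_end)) / days

# Hypothetical post-dose and day-14 enrichments (above baseline):
k_oxygen = elimination_rate(300.0, 80.0, 14)    # 18O leaves via water and CO2
k_hydrogen = elimination_rate(250.0, 95.0, 14)  # 2H leaves via water only

N = 2500.0                                      # assumed body water pool, mol
r_co2 = (N / 2) * (k_oxygen - k_hydrogen)       # mol CO2/day, simplified form
kcal_per_day = r_co2 * 22.4 * 5.0               # rough: L CO2/day x ~5 kcal/L
print(f"kO={k_oxygen:.4f}/d  kH={k_hydrogen:.4f}/d  rCO2={r_co2:.1f} mol/d")
```

    Real DLW calculations use fractionation-corrected dilution spaces and an energy equivalent of CO2 that depends on the food quotient; the point here is only the structure of the two-point rate calculation.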

  3. An analysis of revenues and expenses in a hospital-based ambulatory pediatric practice.

    PubMed

    Berkelhamer, J E; Rojek, K J

    1988-05-01

    We developed a method of analyzing revenues and expenses in a hospital-based ambulatory pediatric practice. Results of an analysis of the Children's Medical Group (CMG) at the University of Chicago Medical Center demonstrate how changes in collection rates, practice expenses, and hospital underwriting contribute to the financial outcome of the practice. In this analysis, certain programmatic goals of the CMG are achieved at a level of just under 12,000 patient visits per year. At this activity level, pediatric residency program needs are met and income to the CMG physicians is maximized. An ethical problem from the physician's perspective is created by seeking profit maximization. To accomplish this end, the CMG physicians would have to restrict their personal services to only the better-paying patients. This study serves to underscore the importance of hospital-based physicians and hospital administrators structuring fiscal incentives for physicians that mutually meet the institutional goals for the hospital and its physicians. PMID:3358399

  4. [Pedagogical practices in nursing teaching: a study from the perspective of institutional analysis].

    PubMed

    Pereira, Wilza Rocha; Tavares, Cláudia Mara Melo

    2010-12-01

    The general objective of this study was to learn about the pedagogical practices that are already in use in nursing teaching in order to identify and analyze those that have brought changes and innovation. This field study used a qualitative and comparative approach, and the subjects were nursing professors and students. The data was collected through individual interviews and focal groups. Data analysis was based on the Institutional Analysis method. Several pedagogical practices were recognized, from the most traditional to those considered innovative, and it was noticed that changes are already present and are part of a set of elements caused by the obsolescence of values that are now considered to be insufficient or inappropriate by professors themselves. The study revealed that the activity of teaching and the qualification of the pedagogical practices are always desired by professors. PMID:21337793

  5. Assessing Student Perception of Practice Evaluation Knowledge in Introductory Research Methods

    ERIC Educational Resources Information Center

    Baker, Lisa R.; Pollio, David E.; Hudson, Ashley

    2011-01-01

    The authors explored the use of the Practice Evaluation Knowledge Scale (PEKS) to assess student perception of acquisition and retention of practice evaluation knowledge from an undergraduate research methods class. The authors sampled 2 semesters of undergraduate social work students enrolled in an introductory research methods course.…

  6. Researching into Teaching Methods in Colleges and Universities. Practical Research Series.

    ERIC Educational Resources Information Center

    Bennett, Clinton; And Others

    This practical guide is one of a series aimed at assisting academics in higher education in researching specific aspects of their work. Focusing on small-scale insider research in colleges and universities, the handbook covers contemporary issues, research methods, and existing practice and values in the area of teaching methods. Strategies for…

  7. Learning Practice-Based Research Methods: Capturing the Experiences of MSW Students

    ERIC Educational Resources Information Center

    Natland, Sidsel; Weissinger, Erika; Graaf, Genevieve; Carnochan, Sarah

    2016-01-01

    The literature on teaching research methods to social work students identifies many challenges, such as dealing with the tensions related to producing research relevant to practice, access to data to teach practice-based research, and limited student interest in learning research methods. This is an exploratory study of the learning experiences of…

  8. Strength-based Supervision: Frameworks, Current Practice, and Future Directions A Wu-wei Method.

    ERIC Educational Resources Information Center

    Edwards, Jeffrey K.; Chen, Mei-Whei

    1999-01-01

    Discusses a method of counseling supervision similar to the wu-wei practice in Zen and Taoism. Suggests that this strength-based method and an understanding of isomorphy in supervisory relationships are the preferred practice for the supervision of family counselors. States that this model of supervision potentiates the person-of-the-counselor.…

  9. Investigating the Efficacy of Practical Skill Teaching: A Pilot-Study Comparing Three Educational Methods

    ERIC Educational Resources Information Center

    Maloney, Stephen; Storr, Michael; Paynter, Sophie; Morgan, Prue; Ilic, Dragan

    2013-01-01

    Effective education of practical skills can alter clinician behaviour, positively influence patient outcomes, and reduce the risk of patient harm. This study compares the efficacy of two innovative practical skill teaching methods, against a traditional teaching method. Year three pre-clinical physiotherapy students consented to participate in a…

  10. Probabilistic structural analysis by extremum methods

    NASA Technical Reports Server (NTRS)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.
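
    The kinematic (upper-bound) approach can be illustrated with a toy calculation: each assumed collapse mechanism yields a load factor equal to internal plastic work over external work, and the governing mechanism is the one with the minimum factor. The frame sections, mechanisms, and numbers below are invented for illustration.

```python
# Toy kinematic upper-bound calculation for a rigid-plastic frame
# (invented data): the collapse load factor is the minimum over
# candidate mechanisms of plastic work / external work.

def load_factor(plastic_moments, hinge_rotations, external_work):
    internal = sum(m * abs(t) for m, t in zip(plastic_moments, hinge_rotations))
    return internal / external_work

Mp = [100.0, 100.0, 150.0, 150.0]  # plastic moment capacity at 4 sections

mechanisms = {
    # mechanism name: (hinge rotations, external work per unit load)
    "beam":     ([0.0, 1.0, 2.0, 1.0], 60.0),
    "sway":     ([1.0, 1.0, 1.0, 1.0], 40.0),
    "combined": ([1.0, 0.0, 3.0, 2.0], 100.0),
}

factors = {name: load_factor(Mp, rots, w)
           for name, (rots, w) in mechanisms.items()}
critical = min(factors, key=factors.get)
print(factors, "->", critical)
```

    In the paper's LP formulation the static (lower-bound) problem is the dual of this kinematic one, so the two approaches bracket the true collapse load.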

  11. Text analysis methods, text analysis apparatuses, and articles of manufacture

    DOEpatents

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  12. Overhead analysis in a surgical practice: a brief communication.

    PubMed

    Frezza, Eldo E

    2006-08-01

    Evaluating overhead is an essential part of any business, including that of the surgeon. By examining each component of overhead, the surgeon will have a better grasp of the profitability of his or her practice. The overhead discussed in this article includes health insurance, overtime, supply costs, rent, advertising and marketing, telephone costs, and malpractice insurance. While the importance of evaluating and controlling overhead in a business is well understood, few know that overhead increases do not always imply increased expenses. National standards have been provided by the Medical Group Management Association. One method of evaluating overhead is to calculate the amount spent in terms of percent of net revenue. Net revenue includes income from patients, from interest, and from insurers less refunds. Another way for surgeons to evaluate their practice is to calculate income and expenses for two years, then calculate the variance between the two years and the percentage of variance to see where they stand. PMID:16968190
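
    The two evaluations described, overhead as a percentage of net revenue and the year-over-year variance, reduce to simple arithmetic. A sketch with invented figures:

```python
# Overhead evaluation as described above (all dollar figures invented).

def overhead_pct(expenses, net_revenue):
    """Overhead expressed as a percent of net revenue."""
    return 100.0 * expenses / net_revenue

def variance_pct(year1, year2):
    """Year-over-year percentage variance of an income or expense line."""
    return 100.0 * (year2 - year1) / year1

net_revenue = 900_000.0   # patient + interest + insurer income, less refunds
expenses = 540_000.0      # rent, staff, malpractice, supplies, ...
print(f"overhead = {overhead_pct(expenses, net_revenue):.1f}% of net revenue")

# Year-over-year comparison of one expense line, e.g. malpractice premiums:
print(f"variance = {variance_pct(48_000, 54_000):.1f}%")
```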

  13. Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions

    NASA Astrophysics Data System (ADS)

    Xie, Jigang; Song, Wenyun

    The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example of the risk analysis of Chinese listed firms’ mergers and acquisitions is provided to demonstrate the feasibility and practicality of the method.
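
    The general ME principle can be illustrated independently of the paper's risk model: among all distributions on a finite support with a given mean, the maximum entropy distribution is exponential-family, p_i ∝ exp(λx_i), with λ chosen to satisfy the mean constraint. A sketch with invented loss levels:

```python
# Hedged illustration of the maximum entropy principle (not the paper's
# M&A risk model): solve for the ME distribution on a finite support
# subject to a mean constraint, finding lambda by bisection.
import math

def max_entropy_dist(support, target_mean, lo=-50.0, hi=50.0):
    def mean_for(lam):
        w = [math.exp(lam * x) for x in support]
        z = sum(w)
        return sum(x * wi for x, wi in zip(support, w)) / z
    for _ in range(200):           # bisection; mean_for is monotone in lam
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * x) for x in support]
    z = sum(w)
    return [wi / z for wi in w]

# Loss levels (arbitrary units) with only the mean loss known:
p = max_entropy_dist([0, 1, 2, 3], target_mean=1.2)
print([round(pi, 4) for pi in p])  # probabilities sum to 1, mean = 1.2
```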

  14. On Practical Results of the Differential Power Analysis

    NASA Astrophysics Data System (ADS)

    Breier, Jakub; Kleja, Marcel

    2012-03-01

    This paper describes practical differential power analysis attacks. Successful and unsuccessful attack attempts are presented, together with a description of the attack methodology. It provides relevant information about oscilloscope settings, optimization possibilities, and fundamental attack principles, which are important when realizing this type of attack. The attack was conducted on the PIC18F2420 microcontroller, using the AES cryptographic algorithm in ECB mode with a 128-bit key. We used two implementations of this algorithm: one in the C programming language and one in assembler.

  15. Analysis of flight equipment purchasing practices of representative air carriers

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The process through which representative air carriers decide whether or not to purchase flight equipment was investigated as well as their practices and policies in retiring surplus aircraft. An analysis of the flight equipment investment decision process in ten airlines shows that for the airline industry as a whole, the flight equipment investment decision is in a state of transition from a wholly informal process in earliest years to a much more organized and structured process in the future. Individual air carriers are in different stages with respect to the formality and sophistication associated with the flight equipment investment decision.

  16. Finite-key analysis of a practical decoy-state high-dimensional quantum key distribution

    NASA Astrophysics Data System (ADS)

    Bao, Haize; Bao, Wansu; Wang, Yang; Zhou, Chun; Chen, Ruike

    2016-05-01

    Compared with two-level quantum key distribution (QKD), high-dimensional QKD enables two distant parties to share a secret key at a higher rate. We provide a finite-key security analysis for the recently proposed practical high-dimensional decoy-state QKD protocol based on time-energy entanglement. We employ two methods to estimate the statistical fluctuation of the postselection probability and give a tighter bound on the secure-key capacity. By numerical evaluation, we show the finite-key effect on the secure-key capacity in different conditions. Moreover, our approach could be used to optimize parameters in practical implementations of high-dimensional QKD.

  17. Theory, Method and Practice of Neuroscientific Findings in Science Education

    ERIC Educational Resources Information Center

    Liu, Chia-Ju; Chiang, Wen-Wei

    2014-01-01

    This report provides an overview of neuroscience research that is applicable for science educators. It first offers a brief analysis of empirical studies in educational neuroscience literature, followed by six science concept learning constructs based on the whole brain theory: gaining an understanding of brain function; pattern recognition and…

  18. Investigating the efficacy of practical skill teaching: a pilot-study comparing three educational methods.

    PubMed

    Maloney, Stephen; Storr, Michael; Paynter, Sophie; Morgan, Prue; Ilic, Dragan

    2013-03-01

    Effective education of practical skills can alter clinician behaviour, positively influence patient outcomes, and reduce the risk of patient harm. This study compares the efficacy of two innovative practical skill teaching methods, against a traditional teaching method. Year three pre-clinical physiotherapy students consented to participate in a randomised controlled trial, with concealed allocation and blinded participants and outcome assessment. Each of the three randomly allocated groups were exposed to a different practical skills teaching method (traditional, pre-recorded video tutorial or student self-video) for two specific practical skills during the semester. Clinical performance was assessed using an objective structured clinical examination (OSCE). The students were also administered a questionnaire to gain the participants level of satisfaction with the teaching method, and their perceptions of the teaching methods educational value. There were no significant differences in clinical performance between the three practical skill teaching methods as measured in the OSCE, or for student ratings of satisfaction. A significant difference existed between the methods for the student ratings of perceived educational value, with the teaching approaches of pre-recorded video tutorial and student self-video being rated higher than 'traditional' live tutoring. Alternative teaching methods to traditional live tutoring can produce equivalent learning outcomes when applied to the practical skill development of undergraduate health professional students. The use of alternative practical skill teaching methods may allow for greater flexibility for both staff and infrastructure resource allocation. PMID:22354336

  19. Aural Image in Practice: A Multicase Analysis of Instrumental Practice in Middle School Learners

    ERIC Educational Resources Information Center

    Oare, Steve

    2016-01-01

    This multiple case study examined six adolescent band students engaged in self-directed practice. The students' practice sessions were videotaped. Students provided verbal reports during their practice and again retrospectively while reviewing their video immediately after practice. Students were asked to discuss their choice of practice…

  20. Focus Group Method And Methodology: Current Practice And Recent Debate

    ERIC Educational Resources Information Center

    Parker, Andrew; Tritter, Jonathan

    2006-01-01

    This paper considers the contemporary use of focus groups as a method of data collection within qualitative research settings. The authors draw upon their own experiences of using focus groups in educational and "community" user-group environments in order to provide an overview of recent issues and debates surrounding the deployment of focus…

  1. Practice and Progression in Second Language Research Methods

    ERIC Educational Resources Information Center

    Mackey, Alison

    2014-01-01

    Since its inception, the field of second language research has utilized methods from a number of areas, including general linguistics, psychology, education, sociology, anthropology and, recently, neuroscience and corpus linguistics. As the questions and objectives expand, researchers are increasingly pushing methodological boundaries to gain a…

  2. Methods in Educational Research: From Theory to Practice

    ERIC Educational Resources Information Center

    Lodico, Marguerite G.; Spaulding, Dean T.; Voegtle, Katherine H.

    2006-01-01

    Written for students, educators, and researchers, "Methods in Educational Research" offers a refreshing introduction to the principles of educational research. Designed for the real world of educational research, the book's approach focuses on the types of problems likely to be encountered in professional experiences. Reflecting the importance of…

  3. Practical method of diffusion-welding steel plate in air

    NASA Technical Reports Server (NTRS)

    Holko, K. H.; Moore, T. J.

    1971-01-01

    Method is ideal for critical service requirements where parent metal properties are equaled in notch toughness, stress rupture and other characteristics. Welding technique variations may be used on a variety of materials, such as carbon steels, alloy steels, stainless steels, ceramics, and reactive and refractory materials.

  4. Case Methods as a Bridge between Standards and Classroom Practice.

    ERIC Educational Resources Information Center

    Shulman, Judith H.

    This paper examines the function of cases and case methods in teacher education and professional development, hypothesizing that educators and administrators can better make sense of educational standards and link them to their daily school and classroom lives if they can identify cases in which those standards are inherent. One National…

  5. A report on the CCNA 2007 professional practice analysis.

    PubMed

    Muckle, Timothy J; Apatov, Nathaniel M; Plaus, Karen

    2009-06-01

    The purpose of this column is to present the results of the 2007 Professional Practice Analysis (PPA) of the field of nurse anesthesia, conducted by the Council on Certification of Nurse Anesthetists. The PPA used survey and rating scale methodologies to collect data regarding the relative emphasis of various aspects of the nurse anesthesia knowledge domain and competencies. A total of 3,805 survey responses were analyzed using the Rasch rating scale model, which aggregates and transforms ordinal (rating scale) responses into linear measures of relative importance and frequency. Summaries of respondent demographics and educational and professional background are provided, as well as descriptions of how the survey results are used to develop test specifications. The results of this analysis provide evidence for the content outline and test specifications (content percentages) and thus serve as a basis of content validation for the National Certification Examination. PMID:19645167
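The ordinal-to-linear transformation at the heart of the Rasch rating scale model can be illustrated, in a deliberately simplified form, with the log-odds (logit) transform. This is a sketch of the underlying idea only, not the Council's actual analysis; the task names and endorsement proportions are hypothetical.

```python
import math

def logit(p):
    """Log-odds transform: maps a proportion in (0, 1) onto a linear logit scale."""
    return math.log(p / (1.0 - p))

# Hypothetical proportions of respondents rating each knowledge area "important or higher"
endorsement = {
    "airway management": 0.95,
    "regional anesthesia": 0.70,
    "research methods": 0.40,
}

# Higher endorsement maps to a higher linear measure of relative importance (in logits)
measures = {task: logit(p) for task, p in endorsement.items()}
```

In the full Rasch model these measures are estimated jointly with rating-category thresholds, but the logit is the step that turns ordinal ratings into an interval-like scale.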

  6. Parenting Practices and Child Misbehavior: A Mixed Method Study of Italian Mothers and Children

    PubMed Central

    Bombi, Anna Silvia; Di Norcia, Anna; Di Giunta, Laura; Pastorelli, Concetta; Lansford, Jennifer E.

    2015-01-01

    Objective The present study uses a mixed qualitative and quantitative method to examine three main research questions: What are the practices that mothers report they use when trying to correct their children’s misbehaviors? Are there common patterns of these practices? Are the patterns that emerge related to children’s well-being? Design Italian mother-child dyads (N=103) participated in the study (when children were 8 years of age). At Time 1 (T1), mothers answered open-ended questions about discipline; in addition, measures of maternal physical discipline and rejection and child aggression were assessed in mothers and children at T1, one year later (T2), and two years later (T3). Results Mothers’ answers to open-ended questions about what they would do in three disciplinary situations were classified in six categories: physical or psychological punishment, control, mix of force and reasoning, reasoning, listening, and permissiveness. Cluster analysis yielded 3 clusters: Group 1, Induction (predominant use of reasoning and listening; 74%); Group 2, Punishment (punitive practices and no reasoning; 16%); Group 3, Mixed practices (combination of reasoning and punishment, as well as high control and no listening; 10%). Multiple-group latent growth curves of maternal physical discipline, maternal rejection, and child aggression were implemented to evaluate possible differences in the developmental trends from T1 to T3, as a function of cluster. Conclusions Qualitative data deepen understanding of parenting because they shed light on what parents think about themselves; their self-descriptions, in turn, help to identify ways of parenting that may have long-lasting consequences for children’s adjustment. PMID:26877716
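The clustering step can be sketched with a plain k-means over each mother's response-category proportions. This is an illustrative stand-in (the study used its own cluster analysis, and the data here are invented), but it shows how response profiles can be grouped into parenting patterns.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means on tuples of floats (standard library only)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        new_centers = []
        for c, cl in enumerate(clusters):
            if cl:
                new_centers.append(tuple(sum(xs) / len(cl) for xs in zip(*cl)))
            else:
                new_centers.append(centers[c])  # keep old center if a cluster empties
        centers = new_centers
    return centers, clusters

# Hypothetical per-mother proportions of (reasoning, punitive) responses
mothers = [(0.9, 0.1), (0.8, 0.05), (0.85, 0.1), (0.1, 0.8), (0.2, 0.9), (0.5, 0.5)]
centers, clusters = kmeans(mothers, 2)
```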

  7. A deliberate practice approach to teaching phylogenetic analysis.

    PubMed

    Hobbs, F Collin; Johnson, Daniel J; Kearns, Katherine D

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

  8. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    PubMed Central

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or “one-shot,” in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

  9. Practical flight test method for determining reciprocating engine cooling requirements

    NASA Technical Reports Server (NTRS)

    Ward, D. T.; Miley, S. J.

    1984-01-01

    It is pointed out that efficient and effective cooling of air-cooled reciprocating aircraft engines is a continuing problem for the general aviation industry. Miley et al. (1981) have reported results of a study regarding the controlling variables for cooling and installation aerodynamics. The present investigation is concerned with experimental methods which were developed to determine cooling requirements of an instrumented prototype or production aircraft, taking into account a flight test procedure which has been refined and further verified with additional testing. It is shown that this test procedure represents a straightforward means of determining cooling requirements with minimal instrumentation. Attention is given to some background information, the development history of the NACA cooling correlation method, and the proposed modification of the NACA cooling correlation.

  10. Theories, methods, and practice on the National Atlases of China

    NASA Astrophysics Data System (ADS)

    Qi, Qingwen

    2007-06-01

    The history of editing national atlases around the world is summarized first, followed by China's achievements in editing the 1st and 2nd editions of The National Atlas of China (NAC), which reflected, at multiple levels, China's development in science and technology, society and economy, resources and environment, etc., from the 1950s to the 1980s. From the previous editions of the NAC, systematic theories and methods are summarized and concluded, including comprehensive and statistical mapping theory, design principles for electronic atlases, and the new methods and technologies involved in the NAC. The New Century Edition of the NAC is then designed, including its orientation, technological system, volume arrangement, and the key scientific and technological problems to be resolved.

  11. Engaging Direct Care Providers in Improving Infection Prevention and Control Practices Using Participatory Visual Methods.

    PubMed

    Backman, Chantal; Bruce, Natalie; Marck, Patricia; Vanderloo, Saskia

    2016-01-01

    The purpose of this quality improvement project was to determine the feasibility of using provider-led participatory visual methods to scrutinize 4 hospital units' infection prevention and control practices. Methods included provider-led photo walkabouts, photo elicitation sessions, and postimprovement photo walkabouts. Nurses readily engaged in using the methods to examine and improve their units' practices and reorganize their work environment. PMID:26681499

  12. Articulating current service development practices: a qualitative analysis of eleven mental health projects

    PubMed Central

    2014-01-01

    Background The utilisation of good design practices in the development of complex health services is essential to improving quality. Healthcare organisations, however, are often seriously out of step with modern design thinking and practice. As a starting point to encourage the uptake of good design practices, it is important to understand the context of their intended use. This study aims to do that by articulating current health service development practices. Methods Eleven service development projects carried out in a large mental health service were investigated through in-depth interviews with six operation managers. The critical decision method in conjunction with diagrammatic elicitation was used to capture descriptions of these projects. Stage-gate design models were then formed to visually articulate, classify and characterise different service development practices. Results Projects were grouped into three categories according to design process patterns: new service introduction and service integration; service improvement; service closure. Three common design stages (problem exploration, idea generation, and solution evaluation) were then compared across the design process patterns. Consistent across projects were a top-down, policy-driven approach to exploration, underexploited idea generation and implementation-based evaluation. Conclusions This study provides insight into where and how good design practices can contribute to the improvement of current service development practices. Specifically, the following suggestions for future service development practices are made: genuine user needs analysis for exploration; divergent thinking and innovative culture for idea generation; and fail-safe evaluation prior to implementation. Better training for managers through partnership working with design experts and researchers could be beneficial. PMID:24438471

  13. Triangle area water supply monitoring project, October 1988 through September 2001, North Carolina -- description of the water-quality network, sampling and analysis methods, and quality-assurance practices

    USGS Publications Warehouse

    Oblinger, Carolyn J.

    2004-01-01

    The Triangle Area Water Supply Monitoring Project was initiated in October 1988 to provide long-term water-quality data for six area water-supply reservoirs and their tributaries. In addition, the project provides data that can be used to determine the effectiveness of large-scale changes in water-resource management practices, document differences in water quality among water-supply types (large multiuse reservoir, small reservoir, run-of-river), and provide tributary-loading and in-lake data for water-quality modeling of Falls and Jordan Lakes. By September 2001, the project had progressed through four phases and included as many as 34 sites (in 1991). Most sites were sampled and analyzed by the U.S. Geological Survey. Some sites were already a part of the North Carolina Division of Water Quality statewide ambient water-quality monitoring network and were sampled by the Division of Water Quality. The network has provided data on streamflow, physical properties, and concentrations of nutrients, major ions, metals, trace elements, chlorophyll, total organic carbon, suspended sediment, and selected synthetic organic compounds. Project quality-assurance activities include written procedures for sample collection, record management and archive, collection of field quality-control samples (blank samples and replicate samples), and monitoring the quality of field supplies. In addition to project quality-assurance activities, the quality of laboratory analyses was assessed through laboratory quality-assurance practices and an independent laboratory quality-control assessment provided by the U.S. Geological Survey Branch of Quality Systems through the Blind Inorganic Sample Project and the Organic Blind Sample Project.

  14. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    DOEpatents

    Sanfilippo, Antonio P.; Cowell, Andrew J.; Gregory, Michelle L.; Baddeley, Robert L.; Paulson, Patrick R.; Tratz, Stephen C.; Hohimer, Ryan E.

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
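The described flow (indicator, evidence association, weighting, then an accuracy estimate) can be sketched as a simple weighted vote. The scoring rule below is an assumption for illustration; the patent does not specify a particular formula.

```python
def hypothesis_score(evidence):
    """Combine weighted evidence into a net-support score in [-1, 1].

    evidence: list of (weight, indicator) pairs, where indicator is +1 if the
    item supports the hypothesis and -1 if it refutes it."""
    total_weight = sum(w for w, _ in evidence)
    if total_weight == 0:
        return 0.0
    return sum(w * s for w, s in evidence) / total_weight

# Two supporting items and one refuting item, with hypothetical weights
score = hypothesis_score([(0.9, +1), (0.4, +1), (0.3, -1)])
```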

  15. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  16. Practical method for diffusion welding of steel plate in air.

    NASA Technical Reports Server (NTRS)

    Moore, T. J.; Holko, K. H.

    1972-01-01

    Description of a simple and easily applied method of diffusion welding steel plate in air which does not require a vacuum furnace or hot press. The novel feature of the proposed welding method is that diffusion welds are made in air with deadweight loading. In addition, the use of an autogenous (self-generated) surface-cleaning principle (termed 'auto-vac cleaning') to reduce the effects of surface oxides that normally hinder diffusion welding is examined. A series of nine butt joints were diffusion welded in thick sections of AISI 1020 steel plate. Diffusion welds were attempted at three welding temperatures (1200, 1090, and 980 C) using a deadweight pressure of 34,500 N/sq m (5 psi) and a two-hour hold time at temperature. Auto-vac cleaning operations prior to welding were also studied for the same three temperatures. Results indicate that sound welds were produced at the two higher temperatures when the joints were previously fusion seal welded completely around the periphery. Also, auto-vac cleaning at 1200 C for 2-1/2 hours prior to diffusion welding was highly beneficial, particularly when subsequent welding was accomplished at 1090 C.
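The deadweight loading figure can be turned into an equipment-level number: the mass that must rest on a joint of a given area to produce the reported 34,500 N/sq m. The joint area below is hypothetical.

```python
G = 9.81  # standard gravity, m/s^2

def deadweight_mass_kg(pressure_pa, joint_area_m2):
    """Mass required to apply a given deadweight pressure over a joint area."""
    return pressure_pa * joint_area_m2 / G

# 34,500 N/sq m (5 psi) over a hypothetical 0.01 m^2 (100 cm^2) lap joint
mass = deadweight_mass_kg(34_500, 0.01)
```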

  17. Imaging Laser Analysis of Building Materials: Practical Examples

    NASA Astrophysics Data System (ADS)

    Wilsch, G.; Schaurich, D.; Wiggenhauser, H.

    2011-06-01

    Laser-induced breakdown spectroscopy (LIBS) is a supplement to, and an extension of, standard chemical methods and SEM or micro-RFA applications for the evaluation of building materials. As a laboratory method, LIBS is used to produce color-coded images representing the composition and distribution of characteristic ions and/or the ingress of damaging substances. To create a depth profile of element concentration, a core must be taken and split along the core axis. LIBS has been proven able to detect all important elements in concrete, e.g., chlorine, sodium, or sulfur, which are responsible for certain degradation mechanisms, and also light elements such as lithium or hydrogen. Practical examples are given and a mobile system for on-site measurements is presented.

  18. Imaging laser analysis of building materials - practical examples

    SciTech Connect

    Wilsch, G.; Schaurich, D.; Wiggenhauser, H.

    2011-06-23

    Laser-induced breakdown spectroscopy (LIBS) is a supplement to, and an extension of, standard chemical methods and SEM or micro-RFA applications for the evaluation of building materials. As a laboratory method, LIBS is used to produce color-coded images representing the composition and distribution of characteristic ions and/or the ingress of damaging substances. To create a depth profile of element concentration, a core must be taken and split along the core axis. LIBS has been proven able to detect all important elements in concrete, e.g., chlorine, sodium, or sulfur, which are responsible for certain degradation mechanisms, and also light elements such as lithium or hydrogen. Practical examples are given and a mobile system for on-site measurements is presented.

  19. Laboratory theory and methods for sediment analysis

    USGS Publications Warehouse

    Guy, Harold P.

    1969-01-01

    The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help ensure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.
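Both the evaporation and filtration methods end in the same arithmetic: the dried-sediment mass divided by the sample volume. A minimal sketch (strictly, the USGS procedure computes parts per million by mass and converts; the volumetric form here is the common simplification, and the names are ours):

```python
def suspended_sediment_mg_per_l(dry_sediment_g, sample_volume_ml):
    """Suspended-sediment concentration from the final weighing, in mg/L."""
    sediment_mg = dry_sediment_g * 1000.0
    volume_l = sample_volume_ml / 1000.0
    return sediment_mg / volume_l

# 0.025 g of dried sediment recovered from a 500 mL sample
conc = suspended_sediment_mg_per_l(0.025, 500.0)
```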

  20. Testing the quasi-absolute method in photon activation analysis

    SciTech Connect

    Sun, Z. J.; Wells, D.; Starovoitova, V.; Segebade, C.

    2013-04-19

    In photon activation analysis (PAA), relative methods are widely used because of their accuracy and precision. Absolute methods, which are conducted without any assistance from calibration materials, are seldom applied because of the difficulty of determining the photon flux during measurements. This research attempts a new absolute approach in PAA - the quasi-absolute method - by retrieving the photon flux in the sample through Monte Carlo simulation. With the simulated photon flux and a database of experimental cross sections, it is possible to calculate the concentration of target elements in the sample directly. The QA/QC procedures that support the research are discussed in detail. Our results show that the accuracy of the method for certain elements is close to a practically useful level. Furthermore, future results from the quasi-absolute method can also serve as a validation technique for experimental cross-section data. The quasi-absolute method looks promising.
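The final step the abstract describes (simulated flux plus experimental cross sections yielding an element concentration) rests on the standard activation equation. The sketch below inverts it for the target mass; all numerical inputs are illustrative, not the paper's.

```python
import math

AVOGADRO = 6.02214076e23  # atoms per mole

def target_mass_g(activity_bq, cross_section_cm2, flux_per_cm2_s,
                  half_life_s, irradiation_s, molar_mass_g):
    """Invert A = N * sigma * phi * (1 - exp(-lambda * t_irr)) for the target mass.

    Decay between end of irradiation and counting is ignored for brevity."""
    lam = math.log(2.0) / half_life_s
    saturation = 1.0 - math.exp(-lam * irradiation_s)
    n_atoms = activity_bq / (cross_section_cm2 * flux_per_cm2_s * saturation)
    return n_atoms * molar_mass_g / AVOGADRO

# Illustrative numbers only: 100 Bq, 1 mb cross section, 1e13 photons/cm^2/s
m = target_mass_g(100.0, 1e-27, 1e13, 3600.0, 1800.0, 60.0)
```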

  1. Practical Methods for Locating Abandoned Wells in Populated Areas

    SciTech Connect

    Veloski, G.A.; Hammack, R.W.; Lynn, R.J.

    2007-09-01

    An estimated 12 million wells have been drilled during the 150 years of oil and gas production in the United States. Many old oil and gas fields are now populated areas where the presence of improperly plugged wells may constitute a hazard to residents. Natural gas emissions from wells have forced people from their houses and businesses and have caused explosions that injured or killed people and destroyed property. To mitigate this hazard, wells must be located and properly plugged, a task made more difficult by the presence of houses, businesses, and associated utilities. This paper describes well finding methods conducted by the National Energy Technology Laboratory (NETL) that were effective at two small towns in Wyoming and in a suburb of Pittsburgh, Pennsylvania.

  2. Documenting Elementary Teachers' Sustainability of Instructional Practices: A Mixed Method Case Study

    NASA Astrophysics Data System (ADS)

    Cotner, Bridget A.

    School reform programs focus on making educational changes; however, research on interventions past the funded implementation phase to determine what was sustained is rarely done (Beery, Senter, Cheadle, Greenwald, Pearson, et al., 2005). This study adds to the research on sustainability by determining what instructional practices, if any, of the Teaching SMART™ professional development program that was implemented from 2005--2008 in elementary schools with teachers in grades three through eight were continued, discontinued, or adapted five years post-implementation (in 2013). Specifically, this study sought to answer the following questions: What do teachers who participated in Teaching SMART™ and district administrators share about the sustainability of Teaching SMART™ practices in 2013? What teaching strategies do teachers who participated in the program (2005--2008) use in their science classrooms five years post-implementation (2013)? What perceptions about the roles of females in science, technology, engineering, and mathematics (STEM) do teachers who participated in the program (2005--2008) have five years later (2013)? And, what classroom management techniques do the teachers who participated in the program (2005--2008) use five years post-implementation (2013)? A mixed method approach was used to answer these questions. Quantitative teacher survey data from 23 teachers who participated in 2008 and 2013 were analyzed in SAS v. 9.3. Descriptive statistics were reported and paired t-tests were conducted to determine mean differences by survey factors identified from an exploratory factor analysis, principal axis factoring, and parallel analysis conducted with teacher survey baseline data (2005). Individual teacher change scores (2008 and 2013) for identified factors were computed using the Reliable Change Index statistic. Qualitative data consisted of interviews with two district administrators and three teachers who responded to the survey in both
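The Reliable Change Index mentioned above is, in the common Jacobson-Truax form, the pre-post difference divided by the standard error of the difference. The numbers below are invented for illustration; the study's exact variant may differ.

```python
import math

def reliable_change_index(pre, post, sd_baseline, reliability):
    """Jacobson-Truax RCI: change score over the standard error of the difference."""
    se_measurement = sd_baseline * math.sqrt(1.0 - reliability)
    se_difference = se_measurement * math.sqrt(2.0)
    return (post - pre) / se_difference

# A teacher moving from 3.2 to 4.1 on a survey factor (hypothetical scale statistics)
rci = reliable_change_index(3.2, 4.1, sd_baseline=0.8, reliability=0.85)
# |RCI| > 1.96 is conventionally read as change beyond measurement error
```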

  3. Practical optimization of Steiner trees via the cavity method

    NASA Astrophysics Data System (ADS)

    Braunstein, Alfredo; Muntoni, Anna

    2016-07-01

    The optimization version of the cavity method for single instances, called Max-Sum, has been applied in the past to the minimum Steiner tree problem on graphs and variants. Max-Sum has been shown experimentally to give asymptotically optimal results on certain types of weighted random graphs, and to give good solutions in short computation times for some types of real networks. However, the hypotheses behind the formulation and the cavity method itself limit substantially the class of instances on which the approach gives good results (or even converges). Moreover, in the standard model formulation, the diameter of the tree solution is limited by a predefined bound, that affects both computation time and convergence properties. In this work we describe two main enhancements to the Max-Sum equations to be able to cope with optimization of real-world instances. First, we develop an alternative ‘flat’ model formulation that allows the relevant configuration space to be reduced substantially, making the approach feasible on instances with large solution diameter, in particular when the number of terminal nodes is small. Second, we propose an integration between Max-Sum and three greedy heuristics. This integration allows Max-Sum to be transformed into a highly competitive self-contained algorithm, in which a feasible solution is given at each step of the iterative procedure. Part of this development participated in the 2014 DIMACS Challenge on Steiner problems, and we report the results here. The performance on the challenge of the proposed approach was highly satisfactory: it maintained a small gap to the best bound in most cases, and obtained the best results on several instances in two different categories. We also present several improvements with respect to the version of the algorithm that participated in the competition, including new best solutions for some of the instances of the challenge.
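Max-Sum itself is beyond a short sketch, but the kind of greedy heuristic the authors integrate with it can be illustrated by the classical shortest-path (metric-closure) 2-approximation for Steiner trees. This is a stand-in for illustration, not the paper's algorithm.

```python
import heapq

def dijkstra(adj, source):
    """Shortest paths from source; adj maps node -> list of (neighbor, weight)."""
    dist, prev, heap = {source: 0.0}, {}, [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return dist, prev

def steiner_heuristic(adj, terminals):
    """Grow an MST over the terminals' metric closure, then unroll each
    closure edge into its real shortest path in the graph."""
    paths = {t: dijkstra(adj, t) for t in terminals}
    connected = {terminals[0]}
    tree_edges = set()
    while len(connected) < len(terminals):
        # Prim step: cheapest closure edge leaving the connected terminal set
        s, t = min(((a, b) for a in connected for b in terminals if b not in connected),
                   key=lambda e: paths[e[0]][0].get(e[1], float("inf")))
        _, prev = paths[s]
        node = t
        while node != s:  # unroll the shortest s-t path into graph edges
            tree_edges.add(tuple(sorted((node, prev[node]))))
            node = prev[node]
        connected.add(t)
    return tree_edges

# Tiny example: the direct a-c edge (weight 3) loses to the a-b-c path (weight 2)
adj = {"a": [("b", 1), ("c", 3)], "b": [("a", 1), ("c", 1)],
       "c": [("b", 1), ("a", 3), ("d", 1)], "d": [("c", 1)]}
tree = steiner_heuristic(adj, ["a", "c"])
```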

  4. Method of analysis and quality-assurance practices by the U.S. Geological Survey Organic Geochemistry Research Group; determination of geosmin and methylisoborneol in water using solid-phase microextraction and gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Zimmerman, L.R.; Ziegler, A.C.; Thurman, E.M.

    2002-01-01

    A method for the determination of two common odor-causing compounds in water, geosmin and 2-methylisoborneol, was modified and verified by the U.S. Geological Survey's Organic Geochemistry Research Group in Lawrence, Kansas. The optimized method involves the extraction of odor-causing compounds from filtered water samples using a divinylbenzene-carboxen-polydimethylsiloxane cross-link coated solid-phase microextraction (SPME) fiber. Detection of the compounds is accomplished using capillary-column gas chromatography/mass spectrometry (GC/MS). Precision and accuracy were demonstrated using reagent-water, surface-water, and ground-water samples. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 35 nanograms per liter ranged from 60 to 123 percent for geosmin and from 90 to 96 percent for 2-methylisoborneol. Method detection limits were 1.9 nanograms per liter for geosmin and 2.0 nanograms per liter for 2-methylisoborneol in 45-milliliter samples. Typically, concentrations of 30 and 10 nanograms per liter of geosmin and 2-methylisoborneol, respectively, can be detected by the general public. The calibration range for the method is equivalent to concentrations from 5 to 100 nanograms per liter without dilution. The method is valuable for acquiring information about the production and fate of these odor-causing compounds in water.
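The two quality metrics reported, percent recovery of spiked samples and the method detection limit, follow standard formulas. The EPA-style MDL below (Student's t times the standard deviation of low-level replicates) is an assumption about the exact procedure, and the replicate values are invented.

```python
import statistics

def percent_recovery(measured_ng_l, spiked_ng_l):
    """Accuracy as a percentage of the true (spiked) concentration."""
    return 100.0 * measured_ng_l / spiked_ng_l

def method_detection_limit(low_level_replicates, t_value):
    """EPA-style MDL: one-sided 99% Student's t (n-1 df) times the replicate SD."""
    return t_value * statistics.stdev(low_level_replicates)

# Seven hypothetical low-level geosmin replicates, ng/L; t(6 df, 99%) = 3.143
mdl = method_detection_limit([9.1, 10.4, 9.8, 10.9, 9.5, 10.1, 9.7], 3.143)
recovery = percent_recovery(9.0, 10.0)
```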

  5. Professional Suitability for Social Work Practice: A Factor Analysis

    ERIC Educational Resources Information Center

    Tam, Dora M. Y.; Coleman, Heather; Boey, Kam-Wing

    2012-01-01

    Objective: The purpose of this study was to identify the underlying dimensions of professional suitability. Method: Data were collected from a province-wide mail-out questionnaire surveying 341 participants from a random sample of registered social workers. Results: The use of an exploratory factor analysis identified a 5-factor solution on…

  6. Dynamic mechanical analysis: A practical introduction to techniques and applications

    SciTech Connect

    Menard, K.

    1999-01-01

    This book introduces DMA, its history, and its current position as part of thermal analysis on polymers. It discusses major types of instrumentation, including oscillatory rotational, oscillatory axial, and torsional pendulum. It also describes analytical techniques in terms of utility, quality of data, methods of calibration, and suitability for different types of materials and assesses applications for thermoplastics, thermosetting systems, and thermosets.

  7. Autoethnography as a Method for Reflexive Research and Practice in Vocational Psychology

    ERIC Educational Resources Information Center

    McIlveen, Peter

    2008-01-01

    This paper overviews the qualitative research method of autoethnography and its relevance to research in vocational psychology and practice in career development. Autoethnography is a reflexive means by which the researcher-practitioner consciously embeds himself or herself in theory and practice, and by way of intimate autobiographic account,…

  8. Cross-Continental Reflections on Evaluation Practice: Methods, Use, and Valuing

    ERIC Educational Resources Information Center

    Kallemeyn, Leanne M.; Hall, Jori; Friche, Nanna; McReynolds, Clifton

    2015-01-01

    The evaluation theory tree typology reflects the following three components of evaluation practice: (a) methods, (b) use, and (c) valuing. The purpose of this study was to explore how evaluation practice is conceived as reflected in articles published in the "American Journal of Evaluation" ("AJE") and "Evaluation," a…

  9. Perceived Barriers and Facilitators to School Social Work Practice: A Mixed-Methods Study

    ERIC Educational Resources Information Center

    Teasley, Martell; Canifield, James P.; Archuleta, Adrian J.; Crutchfield, Jandel; Chavis, Annie McCullough

    2012-01-01

    Understanding barriers to practice is a growing area within school social work research. Using a convenience sample of 284 school social workers, this study replicates the efforts of a mixed-method investigation designed to identify barriers and facilitators to school social work practice within different geographic locations. Time constraints and…

  10. Low hardness organisms: Culture methods, sensitivities, and practical applications

    SciTech Connect

    DaCruz, A.; DaCruz, N.; Bird, M.

    1995-12-31

    EPA regulations require biomonitoring of permitted effluent and stormwater runoff. Several permit locations in Virginia were studied that have supply water and/or stormwater runoff ranging in hardness from 5--30 mg/L. Ceriodaphnia dubia (dubia) and Pimephales promelas (fathead minnow) were tested in reconstituted water with hardnesses from 5--30 mg/L. Results indicated osmotic stresses present in the acute tests with the fathead minnow as well as in the chronic tests for the dubia and the fathead minnow. Culture methods were developed for both organism types in soft (30 mg/L) reconstituted freshwater. Reproduction and development for each organism type meets or exceeds EPA testing requirements for moderately hard organisms. Sensitivities were measured over an 18-month interval using cadmium chloride as a reference toxicant. Additionally, sensitivities were charted in contrast with those of organisms cultured in moderately hard water. The comparison showed that the sensitivities of both the dubia and the fathead minnow cultured in 30 mg/L water increased, but were within two standard deviations of the sensitivities of organisms cultured in moderately hard water. Latitude for use of organisms cultured in 30 mg/L water was documented for waters ranging in hardness from 10--100 mg/L with no acclimation period required. The stability of the organism sensitivity was also validated. The application was most helpful in stormwater runoff and in effluents where the hardness was 30 mg/L or less.
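Charting reference-toxicant sensitivities and checking that new results fall within two standard deviations of the historical mean is a standard control-chart practice. A minimal sketch with invented LC50 values:

```python
import statistics

def control_limits(reference_lc50s):
    """Mean +/- 2 SD warning limits for a reference-toxicant control chart."""
    mean = statistics.mean(reference_lc50s)
    sd = statistics.stdev(reference_lc50s)
    return mean - 2.0 * sd, mean + 2.0 * sd

def within_limits(new_lc50, reference_lc50s):
    low, high = control_limits(reference_lc50s)
    return low <= new_lc50 <= high

# Hypothetical cadmium chloride LC50s (mg/L) from repeated reference tests
history = [1.0, 1.2, 0.9, 1.1, 1.0]
```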

  11. Methods for Analysis of Outdoor Performance Data (Presentation)

    SciTech Connect

    Jordan, D.

    2011-02-01

The ability to accurately predict power delivery over time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted to power and how this conversion evolves over time. Accurate knowledge of power decline over time, also known as the degradation rate, is essential to all stakeholders--utility companies, integrators, investors, and scientists alike. Different methods to determine degradation rates, from both discrete and continuous data, are presented, and some general best-practice methods are outlined. In addition, historical degradation rates and some preliminary analysis with respect to climate are given.
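The degradation-rate concept in this abstract can be illustrated with a toy calculation: fit a line to a performance-ratio time series and annualize the slope. This is a minimal sketch on synthetic data, not one of the analysis methods the presentation itself compares.

```python
import numpy as np

# Synthetic monthly performance-ratio data for a PV system (hypothetical values):
# a 0.8%/year decline plus measurement noise.
rng = np.random.default_rng(0)
months = np.arange(120)                      # 10 years of monthly data
true_rate = -0.008 / 12                      # -0.8%/year expressed per month
ratio = 1.0 + true_rate * months + rng.normal(0, 0.005, months.size)

# Degradation rate from an ordinary least-squares fit: slope relative to the
# intercept, annualized and expressed in %/year.
slope, intercept = np.polyfit(months, ratio, 1)
rate_pct_per_year = 100 * 12 * slope / intercept
print(f"estimated degradation rate: {rate_pct_per_year:.2f} %/year")
```

With a long, low-noise series like this one, the linear fit recovers the underlying rate closely; real outdoor data need the seasonality and filtering considerations the presentation discusses.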

  12. A mixed methods exploration of the team and organizational factors that may predict new graduate nurse engagement in collaborative practice.

    PubMed

    Pfaff, Kathryn A; Baxter, Pamela E; Ploeg, Jenny; Jack, Susan M

    2014-03-01

    Although engagement in collaborative practice is reported to support the role transition and retention of new graduate (NG) nurses, it is not known how to promote collaborative practice among these nurses. This mixed methods study explored the team and organizational factors that may predict NG nurse engagement in collaborative practice. A total of 514 NG nurses from Ontario, Canada completed the Collaborative Practice Assessment Tool. Sixteen NG nurses participated in follow-up interviews. The team and organizational predictors of NG engagement in collaborative practice were as follows: satisfaction with the team (β = 0.278; p = 0.000), number of team strategies (β = 0.338; p = 0.000), participation in a mentorship or preceptorship experience (β = 0.137; p = 0.000), accessibility of manager (β = 0.123; p = 0.001), and accessibility and proximity of educator or professional practice leader (β = 0.126; p = 0.001 and β = 0.121; p = 0.002, respectively). Qualitative analysis revealed the team facilitators to be respect, team support and face-to-face interprofessional interactions. Organizational facilitators included supportive leadership, participation in a preceptorship or mentorship experience and time. Interventions designed to facilitate NG engagement in collaborative practice should consider these factors. PMID:24195680

  13. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2002-01-01

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
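The core idea of the hybrid step can be sketched in a few lines: augment the matrix of calibrated spectral shapes with the shape of a non-calibrated effect before solving the least-squares estimation. The spectra and amounts below are synthetic and purely illustrative of the patent's general idea, not its actual algorithm.

```python
import numpy as np

# Hypothetical pure-component spectra over 50 spectral channels.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 50)
comp_a = np.exp(-((x - 0.3) ** 2) / 0.01)    # calibrated component A
comp_b = np.exp(-((x - 0.6) ** 2) / 0.01)    # calibrated component B
drift = x                                     # un-calibrated linear baseline drift

# Measured mixture: 2.0*A + 0.5*B plus a drift that was absent at calibration.
mixture = 2.0 * comp_a + 0.5 * comp_b + 0.3 * drift + rng.normal(0, 0.001, 50)

# Least squares with only the calibrated shapes is biased by the drift;
# augmenting the shape matrix with the drift spectrum (the "hybrid" step)
# removes that bias.
K_cal = np.vstack([comp_a, comp_b]).T
K_hyb = np.vstack([comp_a, comp_b, drift]).T
amounts_cal, *_ = np.linalg.lstsq(K_cal, mixture, rcond=None)
amounts_hyb, *_ = np.linalg.lstsq(K_hyb, mixture, rcond=None)
print("without drift shape:", amounts_cal[:2])
print("with drift shape:   ", amounts_hyb[:2])
```

The augmented solve recovers the two component amounts (2.0 and 0.5) and the drift amplitude, whereas the un-augmented solve absorbs the drift into the component estimates.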

  14. Measuring solar reflectance - Part II: Review of practical methods

    SciTech Connect

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul

    2010-09-15

A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R{sub g,0} can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R{sub g,0} to within 0.01 for surface slopes up to 5:12 [23°], and to within 0.02 for surface slopes up to 12:12 [45°]. An alternative pyranometer method minimizes shadow errors and can be used to measure R{sub g,0} of a surface as small as 1 m in diameter. The accuracy with which it can measure R{sub g,0} is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R{sub g,0}{sup *}, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R{sub g,0}{sup *} matches R{sub g,0} to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R{sub g,0}{sup *} by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R{sub g,0}{sup *} to within about 0.01. (author)
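The conventional pyranometer technique reduces, at its core, to a ratio of two irradiance readings; a minimal sketch with hypothetical readings (omitting the shadow, background and instrument corrections the article analyzes):

```python
# Minimal sketch of the conventional pyranometer technique: solar reflectance
# is the ratio of reflected to incident global irradiance (W/m^2). The readings
# below are hypothetical; real measurements require the shadow and background
# corrections discussed in the article.
incident_wm2 = 980.0    # up-facing pyranometer: global horizontal irradiance
reflected_wm2 = 245.0   # down-facing pyranometer over the test surface

solar_reflectance = reflected_wm2 / incident_wm2
print(f"R_g,0 ~= {solar_reflectance:.3f}")
```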

  15. Measuring solar reflectance Part II: Review of practical methods

    SciTech Connect

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul

    2010-05-14

    A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R{sub g,0} can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R{sub g,0} to within 0.01 for surface slopes up to 5:12 [23{sup o}], and to within 0.02 for surface slopes up to 12:12 [45{sup o}]. An alternative pyranometer method minimizes shadow errors and can be used to measure R{sub g,0} of a surface as small as 1 m in diameter. The accuracy with which it can measure R{sub g,0} is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R*{sub g,0}, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R*{sub g,0} matches R{sub g,0} to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R*{sub g,0} by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R*{sub g,0} to within about 0.01.

  16. Skill analysis part 3: improving a practice skill.

    PubMed

    Price, Bob

    In this, the third and final article in a series on practice skill analysis, attention is given to imaginative ways of improving a practice skill. Having analysed and evaluated a chosen skill in the previous two articles, it is time to look at new ways to proceed. Creative people are able to be analytical and imaginative. The process of careful reasoning involved in analysing and evaluating a skill will not necessarily be used to improve it. To advance a skill, there is a need to engage in more imaginative, free-thinking processes that allow the nurse to think afresh about his or her chosen skill. Suggestions shared in this article are not exhaustive, but the material presented does illustrate measures that in the author's experience seem to have potential. Consideration is given to how the improved skill might be envisaged (an ideal skill in use). The article is illustrated using the case study of empathetic listening, which has been used throughout this series. PMID:22356066

  17. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  18. A practical method of estimating standard error of age in the fission track dating method

    USGS Publications Warehouse

    Johnson, N.M.; McGee, V.E.; Naeser, C.W.

    1979-01-01

A first-order approximation formula for the propagation of error in the fission track age equation is given by P_A = C[P_s^2 + P_i^2 + P_Φ^2 - 2rP_sP_i]^(1/2), where P_A, P_s, P_i, and P_Φ are the percentage errors of age, of spontaneous track density, of induced track density, and of neutron dose, respectively, and C is a constant. The correlation, r, between spontaneous and induced track densities is a crucial element in the error analysis, acting generally to improve the standard error of age. In addition, the correlation parameter r is instrumental in specifying the level of neutron dose, a controlled variable, which will minimize the standard error of age. The results from the approximation equation agree closely with the results from an independent statistical model for the propagation of errors in the fission-track dating method.
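The error-propagation formula can be evaluated directly; the sketch below uses hypothetical percentage errors and C = 1 to show how a positive correlation r between the two track densities reduces the standard error of age.

```python
from math import sqrt

def age_error_pct(p_s, p_i, p_phi, r, C=1.0):
    """First-order percentage standard error of a fission-track age.

    p_s, p_i, p_phi: percentage errors of spontaneous track density, induced
    track density, and neutron dose; r: correlation between the spontaneous
    and induced densities; C: constant from the age equation (1 here for
    illustration).
    """
    return C * sqrt(p_s**2 + p_i**2 + p_phi**2 - 2 * r * p_s * p_i)

# Hypothetical inputs: 8% and 6% track-density errors, 3% dose error.
print(age_error_pct(8.0, 6.0, 3.0, r=0.0))   # uncorrelated densities
print(age_error_pct(8.0, 6.0, 3.0, r=0.7))   # positively correlated densities
```

A positive r subtracts from the sum under the radical, so the correlated case yields a noticeably smaller P_A, which is the "improvement" the abstract refers to.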

  19. Practice.

    PubMed

    Chambers, David W

    2008-01-01

    Practice refers to a characteristic way professionals use common standards to customize solutions to a range of problems. Practice includes (a) standards for outcomes and processes that are shared with one's colleagues, (b) a rich repertoire of skills grounded in diagnostic acumen, (c) an ability to see the actual and the ideal and work back and forth between them, (d) functional artistry, and (e) learning by doing that transcends scientific rationality. Communities of practice, such as dental offices, are small groups that work together in interlocking roles to achieve these ends. PMID:19413050

  20. Recruitment ad analysis offers new opportunities to attract GPs to short-staffed practices.

    PubMed

    Hemphill, Elizabeth; Kulik, Carol T

    2013-01-01

    As baby-boomer practitioners exit the workforce, physician shortages present new recruitment challenges for practices seeking GPs. This article reports findings from two studies examining GP recruitment practice. GP recruitment ad content analysis (Study 1) demonstrated that both Internet and print ads emphasize job attributes but rarely present family or practice attributes. Contacts at these medical practices reported that their practices offer distinctive family and practice attributes that could be exploited in recruitment advertising (Study 2). Understaffed medical practices seeking to attract GPs may differentiate their job offerings in a crowded market by incorporating family and/or practice attributes into their ads. PMID:23697854

  1. Method and apparatus for ceramic analysis

    DOEpatents

    Jankowiak, Ryszard J.; Schilling, Chris; Small, Gerald J.; Tomasik, Piotr

    2003-04-01

The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method provides for analyzing the density of a ceramic by exciting a component on a surface/subsurface of the ceramic through exposure to excitation energy. The method may further include the step of obtaining a measurement of the energy emitted from the component. The method may additionally include comparing the measurement of the emitted energy with a predetermined reference measurement so as to obtain a density for said ceramic.

  2. Transonic wing analysis using advanced computational methods

    NASA Technical Reports Server (NTRS)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    This paper discusses the application of three-dimensional computational transonic flow methods to several different types of transport wing designs. The purpose of these applications is to evaluate the basic accuracy and limitations associated with such numerical methods. The use of such computational methods for practical engineering problems can only be justified after favorable evaluations are completed. The paper summarizes a study of both the small-disturbance and the full potential technique for computing three-dimensional transonic flows. Computed three-dimensional results are compared to both experimental measurements and theoretical results. Comparisons are made not only of pressure distributions but also of lift and drag forces. Transonic drag rise characteristics are compared. Three-dimensional pressure distributions and aerodynamic forces, computed from the full potential solution, compare reasonably well with experimental results for a wide range of configurations and flow conditions.

  3. [Analysis of an intercultural clinical practice in a judicial setting].

    PubMed

    Govindama, Yolande

    2007-01-01

This article analyses an intercultural clinical practice in a judicial setting from an anthropological and psychoanalytical perspective, demonstrating the reorganizations the framework makes necessary. Because the culture of the host country and its founding myth are implicit in the judicial framework, the practitioner introduces psychoanalytical references, particularly totemic principles and the symbolic father, by using genealogy, a universal object of transmission that guarantees the fundamental taboos of humanity. The metacultural perspective of this approach integrates the ethnopsychoanalytical principles put forth by Devereux, as well as his method, although the latter has been adapted to the framework. This approach makes it possible to re-examine Devereux's ethnopsychoanalytical principles by opening the debate from a psychoanalytical as well as a psychiatric perspective. PMID:18253668

  4. Short communication: Practical issues in implementing volatile metabolite analysis for identifying mastitis pathogens.

    PubMed

    Hettinga, Kasper A; de Bok, Frank A M; Lam, Theo J G M

    2015-11-01

Several parameters for improving headspace gas chromatography-mass spectrometry (GC-MS) analysis of volatile metabolites were evaluated in the framework of identifying mastitis-causing pathogens. Previous research showed that the results of such volatile metabolite analysis were comparable with those based on bacteriological culturing. The aim of this study was to evaluate the effect of several method changes on the applicability and potential implementation of this method in practice. The use of a relatively polar column is advantageous, resulting in a faster and less complex chromatographic setup with a higher resolving power, yielding higher-quality data. Before volatile metabolite analysis is applied, a minimum incubation of 8 h is advised, as reducing incubation time leads to less reliable pathogen identification. Application of GC-MS remained favorable compared with regular gas chromatography. However, the complexity and cost of a GC-MS system limit the application of the method in practice for identification of mastitis-causing pathogens. PMID:26342985

  5. Comparison of three evidence-based practice learning assessment methods in dental curricula.

    PubMed

    Al-Ansari, Asim A; El Tantawi, Maha M A

    2015-02-01

Incorporating evidence-based practice (EBP) training in dental curricula is now an accreditation requirement for dental schools, but questions remain about the most effective ways to assess learning outcomes. The purpose of this study was to evaluate and compare three assessment methods for EBP training and to assess their relation to students' overall course grades. Participants in the study were dental students from two classes who received training in appraising randomized controlled trials (RCTs) and systematic reviews in 2013 at the University of Dammam, Saudi Arabia. Repeated measures analysis of variance was used to compare students' scores on appraisal assignments, scores on multiple-choice question (MCQ) exams in which EBP concepts were applied to clinical scenarios, and scores for self-reported efficacy in appraisal. Regression analysis was used to assess the relationship among the three assessment methods, gender, program level, and overall grade. The instructors had acceptable reliability in scoring the assignments (overall intraclass correlation coefficient=0.60). The MCQ exams had acceptable discrimination indices, although their reliability was less satisfactory (Cronbach's alpha=0.46). Statistically significant differences were observed among the three methods, with the MCQ exams having the lowest overall scores. Variation in the overall course grades was explained by scores on the appraisal assignment and MCQ exams (partial eta-squared=0.52 and 0.24, respectively), whereas the score on the self-efficacy questionnaire was not significantly associated with the overall grade. The results suggest that self-reported efficacy is not a valid method for assessing dental students' RCT appraisal skills; instructor-graded appraisal assignments explained a greater portion of variation in grade and had inherent validity and acceptable consistency, while MCQ exams had good construct validity but low internal consistency. PMID:25640619
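For reference, the Cronbach's alpha statistic quoted above (0.46 for the MCQ exams) measures internal consistency across exam items; a minimal sketch with hypothetical item scores:

```python
import numpy as np

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
# The binary item scores below are hypothetical, purely to illustrate the
# statistic the study reports.
def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)   # shape: (respondents, items)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

scores = [[1, 1, 0, 1], [1, 0, 0, 1], [0, 0, 0, 0], [1, 1, 1, 1], [1, 0, 1, 1]]
print(round(cronbach_alpha(scores), 3))
```

Values near 0.46, as reported for the MCQ exams, indicate that the items do not hang together well as a single scale.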

  6. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2004-03-23

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.

  7. Advanced reliability method for fatigue analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Wirsching, P. H.

    1984-01-01

    When design factors are considered as random variables and the failure condition cannot be expressed by a closed form algebraic inequality, computations of risk (or probability of failure) may become extremely difficult or very inefficient. This study suggests using a simple and easily constructed second degree polynomial to approximate the complicated limit state in the neighborhood of the design point; a computer analysis relates the design variables at selected points. Then a fast probability integration technique (i.e., the Rackwitz-Fiessler algorithm) can be used to estimate risk. The capability of the proposed method is demonstrated in an example of a low cycle fatigue problem for which a computer analysis is required to perform local strain analysis to relate the design variables. A comparison of the performance of this method is made with a far more costly Monte Carlo solution. Agreement of the proposed method with Monte Carlo is considered to be good.
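The response-surface idea described above can be illustrated with a toy problem: replace an "expensive" limit state with a fitted second-degree polynomial, then estimate the failure probability on the surrogate. For simplicity this sketch uses plain Monte Carlo instead of the Rackwitz-Fiessler fast probability integration, and the limit state function is hypothetical.

```python
import numpy as np

# Failure is g(u) <= 0, with standard normal design variables u ~ N(0, I).
rng = np.random.default_rng(2)

def g(u):  # stands in for an expensive computer analysis
    return 3.0 - u[..., 0] - 0.5 * u[..., 1] + 0.1 * u[..., 0] ** 2

# Evaluate g at a handful of selected points and fit a second-degree
# polynomial surrogate by least squares.
pts = rng.normal(0, 1.5, size=(50, 2))
A = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                     pts[:, 0] ** 2, pts[:, 1] ** 2, pts[:, 0] * pts[:, 1]])
coef, *_ = np.linalg.lstsq(A, g(pts), rcond=None)

def g_hat(u):  # cheap quadratic approximation of the limit state
    return (coef[0] + coef[1] * u[..., 0] + coef[2] * u[..., 1]
            + coef[3] * u[..., 0] ** 2 + coef[4] * u[..., 1] ** 2
            + coef[5] * u[..., 0] * u[..., 1])

# Compare surrogate-based Monte Carlo with direct Monte Carlo.
u = rng.normal(size=(200_000, 2))
pf_direct = np.mean(g(u) <= 0)
pf_surrogate = np.mean(g_hat(u) <= 0)
print(pf_direct, pf_surrogate)
```

Because this toy limit state is itself quadratic, the surrogate reproduces it essentially exactly; in the paper's setting the quadratic is fitted locally near the design point and the payoff is that each surrogate evaluation avoids a costly structural analysis.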

  8. Advanced analysis methods in particle physics

    SciTech Connect

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider, have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  9. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2009-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  10. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2010-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  11. NOA: a novel Network Ontology Analysis method

    PubMed Central

    Wang, Jiguang; Huang, Qiang; Liu, Zhi-Ping; Wang, Yong; Wu, Ling-Yun; Chen, Luonan; Zhang, Xiang-Sun

    2011-01-01

Gene ontology analysis has become a popular and important tool in bioinformatics, but current ontology analyses are mainly conducted on individual genes or gene lists. However, recent molecular network analysis reveals that the same list of genes with different interactions may perform different functions. Therefore, it is necessary to consider molecular interactions to correctly and specifically annotate biological networks. Here, we propose a novel Network Ontology Analysis (NOA) method to perform gene ontology enrichment analysis on biological networks. Specifically, NOA first defines link ontology that assigns functions to interactions based on the known annotations of joint genes via optimizing two novel indexes, 'Coverage' and 'Diversity'. Then, NOA generates two alternative reference sets to statistically rank the enriched functional terms for a given biological network. We compare NOA with traditional enrichment analysis methods in several biological networks and find that: (i) NOA can capture the change of functions not only in dynamic transcription regulatory networks but also in rewiring protein interaction networks while the traditional methods cannot, and (ii) NOA can find more relevant and specific functions than traditional methods in different types of static networks. Furthermore, a freely accessible web server for NOA has been developed at http://www.aporc.org/noa/. PMID:21543451
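For context, the "traditional enrichment analysis" that NOA is compared against is typically a hypergeometric test on a gene list; a minimal sketch with hypothetical counts (this is not NOA's link-ontology algorithm):

```python
from math import comb

# Hypergeometric tail probability that k or more of the n study genes carry a
# GO term annotated to K of the N background genes. All counts are hypothetical.
def enrichment_pvalue(N, K, n, k):
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# 20000 background genes, 200 annotated with the term, 50 study genes,
# 8 of which carry the term: far more than the ~0.5 expected by chance.
p = enrichment_pvalue(20000, 200, 50, 8)
print(f"p = {p:.2e}")
```

NOA replaces the gene list in this kind of test with annotated interactions (links), which is what lets it distinguish two networks built from the same genes.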

  12. Chromatographic methods for analysis of triazine herbicides.

    PubMed

    Abbas, Hana Hassan; Elbashir, Abdalla A; Aboul-Enein, Hassan Y

    2015-01-01

Gas chromatography (GC) and high-performance liquid chromatography (HPLC), coupled to different detectors and combined with different sample extraction methods, are the most widely used techniques for analysis of triazine herbicides in environmental samples. Nowadays, many variations and modifications of extraction and sample preparation methods such as solid-phase microextraction (SPME), hollow fiber-liquid phase microextraction (HF-LPME), stir bar sorptive extraction (SBSE), headspace-solid phase microextraction (HS-SPME), dispersive liquid-liquid microextraction (DLLME), dispersive liquid-liquid microextraction based on solidification of floating organic droplet (DLLME-SFO), ultrasound-assisted emulsification microextraction (USAEME), and others have been introduced and developed to obtain sensitive and accurate methods for the analysis of these hazardous compounds. In this review, several analytical properties such as linearity, sensitivity, repeatability, and accuracy are discussed for each developed method; excellent results have been obtained for most of the developed methods combined with GC and HPLC techniques for the analysis of triazine herbicides. This review gives an overview of recent publications on the application of GC and HPLC for analysis of triazine herbicide residues in various samples. PMID:25849823

  13. Simplified method for nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Kaufman, A.

    1983-01-01

    A simplified inelastic analysis computer program was developed for predicting the stress-strain history of a thermomechanically cycled structure from an elastic solution. The program uses an iterative and incremental procedure to estimate the plastic strains from the material stress-strain properties and a simulated plasticity hardening model. The simplified method was exercised on a number of problems involving uniaxial and multiaxial loading, isothermal and nonisothermal conditions, and different materials and plasticity models. Good agreement was found between these analytical results and nonlinear finite element solutions for these problems. The simplified analysis program used less than 1 percent of the CPU time required for a nonlinear finite element analysis.

  14. Probabilistic structural analysis methods development for SSME

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1988-01-01

The development of probabilistic structural analysis methods is a major part of the SSME Structural Durability Program and consists of three program elements: composite load spectra, probabilistic finite element structural analysis, and probabilistic structural analysis applications. Recent progress includes: (1) the effects of the uncertainties of several factors on the HPFP blade temperature, pressure, and torque; (2) the evaluation of the cumulative distribution function of structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results obtained demonstrate that the structural durability of critical SSME components can be probabilistically evaluated.

  15. Design analysis, robust methods, and stress classification

    SciTech Connect

    Bees, W.J.

    1993-01-01

    This special edition publication volume is comprised of papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25--29, 1993 in Denver, Colorado. The papers were prepared for presentations in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

  16. A Renormalisation Group Method. IV. Stability Analysis

    NASA Astrophysics Data System (ADS)

    Brydges, David C.; Slade, Gordon

    2015-05-01

This paper is the fourth in a series devoted to the development of a rigorous renormalisation group method for lattice field theories involving boson fields, fermion fields, or both. The third paper in the series presents a perturbative analysis of a supersymmetric field theory which represents the continuous-time weakly self-avoiding walk on Z^4. We now present an analysis of the relevant interaction functional of the supersymmetric field theory, which permits a nonperturbative analysis to be carried out in the critical dimension d = 4. The results in this paper include: proof of stability of the interaction, estimates which enable control of Gaussian expectations involving both boson and fermion fields, estimates which bound the errors in the perturbative analysis, and a crucial contraction estimate to handle irrelevant directions in the flow of the renormalisation group. These results are essential for the analysis of the general renormalisation group step in the fifth paper in the series.

  17. Practical Performance Analysis for Multiple Information Fusion Based Scalable Localization System Using Wireless Sensor Networks.

    PubMed

    Zhao, Yubin; Li, Xiaofan; Zhang, Sha; Meng, Tianhui; Zhang, Yiwen

    2016-01-01

    In practical localization system design, researchers must consider several aspects to make positioning efficient and effective, e.g., the available auxiliary information, sensing devices, equipment deployment and the environment. These practical concerns translate into technical problems, e.g., sequential position state propagation, the target-anchor geometry effect, non-line-of-sight (NLOS) identification and the related prior information. It is necessary to construct an efficient framework that can exploit multiple sources of available information and guide the system design. In this paper, we propose a scalable method to analyze system performance based on the Cramér-Rao lower bound (CRLB), which can fuse all of the information adaptively. Firstly, we use an abstract function to represent all of the wireless localization system models. Then, the unknown vector of the CRLB consists of two parts: the first part is the estimated vector, and the second part is the auxiliary vector, which helps improve the estimation accuracy. Accordingly, the Fisher information matrix is divided into two parts: the state matrix and the auxiliary matrix. Unlike a purely theoretical analysis, our CRLB can serve as a practical fundamental limit for a system that fuses multiple sources of information in a complicated environment, e.g., recursive Bayesian estimation based on the hidden Markov model, the map matching method, and NLOS identification and mitigation methods. Thus, the theoretical results come closer to the real case. In addition, our method is more adaptable than other CRLBs when more unknown important factors are considered. We use the proposed method to analyze a wireless sensor network-based indoor localization system. The influence of hybrid LOS/NLOS channels, the building layout information and the relative height differences between the target and anchors are analyzed. It is demonstrated that our method exploits all of the available information for
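    The partition of the Fisher information matrix into a state block and an auxiliary block can be sketched as follows. The matrix values and the interpretation of the auxiliary parameter (an NLOS bias) are assumptions for illustration; the effective bound for the state part is the Schur complement of the auxiliary block.

```python
import numpy as np

# Illustrative Fisher information for theta = [x, y | b], where (x, y) is the
# estimated position and b is an auxiliary parameter (e.g., an NLOS bias).
# The matrix values are made up for illustration, not from a real channel model.
J = np.array([
    [4.0, 0.5, 1.0],
    [0.5, 3.0, 0.8],
    [1.0, 0.8, 2.0],
])

J_ss = J[:2, :2]            # state information (position block)
J_sa = J[:2, 2:]            # cross information
J_aa = J[2:, 2:]            # auxiliary information

# Effective information for the state: Schur complement of J_aa in J.
# Jointly estimating b "costs" information, so J_eff <= J_ss.
J_eff = J_ss - J_sa @ np.linalg.inv(J_aa) @ J_sa.T

crlb_known_b   = np.trace(np.linalg.inv(J_ss))   # bound if b were known
crlb_unknown_b = np.trace(np.linalg.inv(J_eff))  # bound with b estimated

print(f"CRLB (b known)   = {crlb_known_b:.4f}")
print(f"CRLB (b unknown) = {crlb_unknown_b:.4f}")
```

    The comparison shows the expected ordering: having to estimate the auxiliary parameter can only raise (never lower) the position error bound.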

  18. Lost to the NHS: a mixed methods study of why GPs leave practice early in England

    PubMed Central

    Doran, Natasha; Fox, Fiona; Rodham, Karen; Taylor, Gordon; Harris, Michael

    2016-01-01

    Background The loss of GPs in the early stages of their careers is contributing to the GP workforce crisis. Recruitment in the UK remains below the numbers needed to support the demand for GP care. Aim To explore the reasons why GPs leave general practice early. Design and setting A mixed methods study using online survey data triangulated with qualitative interviews. Method Participants were GPs aged <50 years who had left the English Medical Performers List in the last 5 years (2009–2014). A total of 143 early GP leavers participated in an online survey, of whom 21 took part in recorded telephone interviews. Survey data were analysed using descriptive statistics, and qualitative data using thematic analysis techniques. Results Reasons for leaving were cumulative and multifactorial. Organisational changes to the NHS have led to an increase in administrative tasks and overall workload that is perceived by GP participants to have fundamentally changed the doctor–patient relationship. Lack of time with patients has compromised the ability to practise more patient-centred care, and, with it, GPs’ sense of professional autonomy and values, resulting in diminished job satisfaction. In this context, the additional pressures of increased patient demand and the negative media portrayal left many feeling unsupported and vulnerable to burnout and ill health, and, ultimately, to the decision to leave general practice. Conclusion To improve retention of young GPs, the pace of administrative change needs to be minimised and the time spent by GPs on work that is not face-to-face patient care reduced. PMID:26740606

  19. Protein-protein interactions: methods for detection and analysis.

    PubMed Central

    Phizicky, E M; Fields, S

    1995-01-01

    The function and activity of a protein are often modulated by other proteins with which it interacts. This review is intended as a practical guide to the analysis of such protein-protein interactions. We discuss biochemical methods such as protein affinity chromatography, affinity blotting, coimmunoprecipitation, and cross-linking; molecular biological methods such as protein probing, the two-hybrid system, and phage display; and genetic methods such as the isolation of extragenic suppressors, synthetic mutants, and unlinked noncomplementing mutants. We next describe how binding affinities can be evaluated by techniques including protein affinity chromatography, sedimentation, gel filtration, fluorescence methods, solid-phase sampling of equilibrium solutions, and surface plasmon resonance. Finally, three examples of well-characterized domains involved in multiple protein-protein interactions are examined. The emphasis of the discussion is on variations in the approaches, concerns in evaluating the results, and advantages and disadvantages of the techniques. PMID:7708014

  20. Mix of methods is needed to identify adverse events in general practice: A prospective observational study

    PubMed Central

    Wetzels, Raymond; Wolters, René; van Weel, Chris; Wensing, Michel

    2008-01-01

    Background The validity and usefulness of incident reporting and other methods for identifying adverse events remains unclear. This study aimed to compare five methods in general practice. Methods In a prospective observational study with five general practitioners, five methods were applied and compared: physician-reported adverse events, pharmacist-reported adverse events, patients' experiences of adverse events, assessment of a random sample of medical records, and assessment of all deceased patients. Results A total of 68 events were identified using these methods. The patient survey accounted for the highest number of events and the pharmacist reports for the lowest number. No overlap between the methods was detected. Conclusion A mix of methods is needed to identify adverse events in general practice. PMID:18554418

  1. A mixed methods study of food safety knowledge, practices and beliefs in Hispanic families with young children.

    PubMed

    Stenger, Kristen M; Ritter-Gooder, Paula K; Perry, Christina; Albrecht, Julie A

    2014-12-01

    Children are at a higher risk for foodborne illness. The objective of this study was to explore food safety knowledge, beliefs and practices among Hispanic families with young children (≤10 years of age) living within a Midwestern state. A convergent mixed methods design collected qualitative and quantitative data in parallel. Food safety knowledge surveys were administered (n = 90) prior to exploration of beliefs and practices among six focus groups (n = 52) conducted by bilingual interpreters in community sites in five cities/towns. Descriptive statistics determined knowledge scores and thematic coding unveiled beliefs and practices. Data sets were merged to assess concordance. Participants were female (96%), 35.7 (±7.6) years of age, from Mexico (69%), with the majority having a low education level. Food safety knowledge was low (56% ± 11). Focus group themes were: Ethnic dishes popular, Relating food to illness, Fresh food in home country, Food safety practices, and Face to face learning. Mixed method analysis revealed high self confidence in preparing food safely with low safe food handling knowledge and the presence of some cultural beliefs. On-site Spanish classes and materials were preferred venues for food safety education. Bilingual food safety messaging targeting common ethnic foods and cultural beliefs and practices is indicated to lower the risk of foodborne illness in Hispanic families with young children. PMID:25178898

  2. Spatial dynamics of farming practices in the Seine basin: methods for agronomic approaches on a regional scale.

    PubMed

    Mignolet, C; Schott, C; Benoît, M

    2007-04-01

    A research procedure is proposed which aims to analyse the agricultural spatial dynamics during the last thirty years using two levels of organisation of farming activity: the agricultural production system and the cropping system. Based on methods of statistical mapping and data mining, this procedure involves modelling the diversity of production systems and cropping systems (crop successions and sequences of cultural practices for each crop) in the form of classes independently of their localisation within the basin. It identifies homogeneous regions made up of groups of contiguous agricultural districts which exhibit similar combinations of production systems, crop successions or cultural practices during a given period of time. The results show a major increase in arable farms since 1970 at the expense of dairy farms and mixed cropping/livestock. This trend however appeared to be greatly spatially differentiated according to the agricultural districts, since livestock remained important on the edges of the basin, whereas it practically disappeared in its centre. The crop successions practiced in the basin and the cultural practices used on them also appear to be spatially differentiated, although the link to the production systems is not always clear. Thus it appears pertinent to combine the analysis of the two levels of organisation of the agriculture (methods of land use described by the concept of cropping system, and also the production systems into which the cropping systems fit) in the context of an environmental problem. PMID:17316763

  3. Beginning secondary science teachers' classroom roles and instructional methods: An exploratory study of conflicts within practical theories

    NASA Astrophysics Data System (ADS)

    Rearden, Kristin Theresa

    There are a myriad of factors that influence a teacher's classroom behaviors. Taken together, these factors are referred to as a teacher's practical theory. Some of the elements of practical theories are perceptions regarding classroom role, impressions of student abilities, reflection on experiences, and content knowledge. First-year teachers, or beginning teachers, are faced with many new challenges as they embark on their endeavor to facilitate the learning of their students. The congruence of the elements within their practical theories of teaching can provide the foundation for consistency within their classroom practices. The researcher investigated two aspects of the practical theories of beginning secondary science teachers. The first aspect was teachers' perceptions of their roles in the classroom. The second aspect was teachers' intended instructional methods. Interview data from 27 beginning secondary science teachers who earned their teacher certification from one of three institutions were used for the study. The interviews were analyzed for information regarding the aforementioned aspects. An interview theme analysis (Hewson, Kerby, & Cook, 1995) was completed for each teacher. The characterization of each teacher's role was based on three categories outlined by Fenstermacher and Soltis (1986): Executive, Therapist, and Liberationist. In describing their classroom role, most of the teachers alluded to an Executive-type approach to teaching, in which their concerns regarding conveyance of content, processes or skills were paramount. In many cases, they mentioned the use of more than one instructional method; topics and variability in student learning styles accounted for the implementation of multiple methods. Methods usually included activities or hands-on experiences. Some teachers mentioned a certain "feel" of the classroom that was necessary for student learning. More than two-thirds of the teachers either expressed conflicts in their interview or

  4. Improving educational environment in medical colleges through transactional analysis practice of teachers

    PubMed Central

    Rajan, Marina

    2012-01-01

    Context: A FAIMER (Foundation for Advancement in International Medical Education and Research) fellow organized a comprehensive faculty development program to improve faculty awareness, resulting in changed teaching practices and better teacher-student relationships, using Transactional Analysis (TA). Practicing TA tools helps develop 'awareness' of intrapersonal and interpersonal processes. Objectives: To improve self-awareness among medical educators; to bring about self-directed change in practices among medical educators; and to assess the usefulness of TA tools for the same. Methods: An experienced trainer conducted a basic course (12 hours) in TA for faculty members. The PAC model of personality structure, the functional fluency model of personal functioning, stroke theory on motivation, and the passivity and script theories of adult functional styles were taught experientially, with examples from the medical education scenario. Self-reported improvement in awareness and changes in practices were assessed immediately after, at three months, and one year after training. Findings: The mean improvement in self-awareness was 13.3% (95% CI 9.3-17.2) among nineteen participants, and this persisted one year after training. Changes in practices within a year included collecting feedback, new teaching styles and better relationships with students. Discussion and Conclusions: These findings demonstrate sustainable and measurable improvement in self-awareness through the practice of TA tools. Improvement in faculty self-awareness resulted in self-directed changes in teaching practices. Medical faculty judged the TA tools effective for improving self-awareness leading to self-directed change. PMID:24358808

  5. Knowledge, illness perceptions and stated clinical practice behaviour in management of gout: a mixed methods study in general practice.

    PubMed

    Spaetgens, Bart; Pustjens, Tobias; Scheepers, Lieke E J M; Janssens, Hein J E M; van der Linden, Sjef; Boonen, Annelies

    2016-08-01

    The objective of the present study is to explore knowledge, illness perceptions and stated practice behaviour in relation to gout in primary care. This is a mixed methods study among 32 general practitioners (GPs). The quantitative assessment included the Gout Knowledge Questionnaire (GKQ; range 0-10, higher = better knowledge) and the Brief Illness Perceptions Questionnaire (BIPQ; nine items, each range 0-10, higher = stronger perception). Structured individual interviews obtained further qualitative insight into knowledge and perceptions in the context of daily practice. Among the 32 GPs, 18 (56.3 %) were male; mean age was 44.4 years (SD 9.6) and mean working experience 17.1 years (SD 9.7). The median score [interquartile range (IQR)] on the GKQ was 7.8 [6.7-8.9] and 9.0 [8.0-10.0] when presented as open or multiple-choice questions, respectively. The BIPQ (median [IQR]) revealed that gout was seen as a chronic disease (8.0 [7.0-9.0]), affecting life and emotions moderately (6.5 [5.0-7.0]), having many severe symptoms (8.0 [7.0-9.0]) and one in which treatment could be very helpful (8.0 [7.0-9.0]). Further interviews revealed large variation in specific aspects of knowledge, and gaps concerning indications for uric acid-lowering therapy (UALT), duration of UALT, target serum uric acid (sUA) level and duration of prophylactic treatment. Finally, patients' adherence was not checked systematically. Specific knowledge gaps and discrepancies between perceptions and stated practice behaviour were identified, which might hamper effective management of this well-treatable disease. Improving evidence on the rationale and effectiveness of treatment targets and adherence interventions, tailoring guidelines to general practice, and intensifying the implementation of guidelines in primary health care seem to be needed. PMID:26898982

  6. Methods to enhance compost practices as an alternative to waste disposal

    SciTech Connect

    Stuckey, H.T.; Hudak, P.F.

    1998-12-31

    Creating practices that are ecologically friendly, economically profitable, and ethically sound is a concept that is slowly beginning to unfold in modern society. In developing such practices, the authors challenge long-lived human behavior patterns and environmental management practices. In this paper, they trace the history of human waste production, describe problems associated with such waste, and explore regional coping mechanisms. Composting projects in north central Texas demonstrate new methods for waste disposal. The authors studied projects conducted by municipalities, schools, agricultural organizations, and individual households. These efforts were examined within the context of regional and statewide solid waste plans. They conclude that: (1) regional composting in north central Texas will substantially reduce the waste stream entering landfills; (2) public education is paramount to establishing alternative waste disposal practices; and (3) new practices for compost will catalyze widespread and efficient production.

  7. Methods for Chemical Analysis of Fresh Waters.

    ERIC Educational Resources Information Center

    Golterman, H. L.

    This manual, one of a series prepared for the guidance of research workers conducting studies as part of the International Biological Programme, contains recommended methods for the analysis of fresh water. The techniques are grouped in the following major sections: Sample Taking and Storage; Conductivity, pH, Oxidation-Reduction Potential,…

  8. Systems and methods for sample analysis

    SciTech Connect

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-01-13

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  9. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

    Methods and apparatus are disclosed for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.

  10. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M.; Ng, Esmond G.

    1998-01-01

    Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.
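    The steps listed above (acquire, digitize, compute nonlinear measures, trend them, compare) can be sketched on a toy system. The logistic map and the Lyapunov-exponent estimate used here are illustrative stand-ins, not the patented method's actual measures.

```python
import math

def logistic_series(r, x0, n):
    """Generate ('acquire') a time series from the logistic map x_{k+1} = r*x*(1-x)."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

def lyapunov_estimate(xs, r):
    """A simple nonlinear measure: average log-derivative (Lyapunov exponent estimate)."""
    return sum(math.log(abs(r * (1.0 - 2.0 * x))) for x in xs) / len(xs)

# Two similar-but-different states of the same process.
chaotic  = logistic_series(4.0, 0.2, 5000)   # chaotic regime
periodic = logistic_series(3.2, 0.2, 5000)   # period-2 regime

lam_chaotic  = lyapunov_estimate(chaotic, 4.0)
lam_periodic = lyapunov_estimate(periodic, 3.2)

# Time-serial trend of the measure over sliding windows, as in the trend step.
window = 500
trend = [lyapunov_estimate(chaotic[i:i + window], 4.0)
         for i in range(0, len(chaotic) - window + 1, window)]

# A positive exponent flags the chaotic state, a negative one the regular state.
print(f"chaotic lambda ~ {lam_chaotic:.3f}, periodic lambda ~ {lam_periodic:.3f}")
```

    Comparing the windowed trend of the measure between two recordings is the comparison step: a persistent shift in the measure signals a change of state.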

  11. Analysis methods for tocopherols and tocotrienols

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Tocopherols and tocotrienols, sometimes called tocochromanols or tocols, are also collectively termed Vitamin E. Vitamins A, D, E, and K, are referred to as fat soluble vitamins. Since the discovery of Vitamin E in 1922, many methods have been developed for the analysis of tocopherols and tocotrie...

  12. Systems and methods for sample analysis

    SciTech Connect

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-10-20

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  13. Multiple predictor smoothing methods for sensitivity analysis.

    SciTech Connect

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
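    A minimal sketch of the idea, using a hand-rolled locally weighted regression in place of the full stepwise procedure: a smoother-based R² ranks a non-monotonic input correctly where plain linear regression does not. The toy model and all parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def loess_fit(x, y, frac=0.3):
    """Simple locally weighted regression (tricube kernel), fitted at each x."""
    n = len(x)
    k = max(2, int(frac * n))
    yhat = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]                      # k nearest neighbours
        w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3  # tricube weights
        A = np.vstack([np.ones(k), x[idx]]).T        # local linear fit
        W = np.diag(w)
        beta = np.linalg.lstsq(W @ A, W @ y[idx], rcond=None)[0]
        yhat[i] = beta[0] + beta[1] * x[i]
    return yhat

def r_squared(y, yhat):
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

# Toy model: y depends strongly but non-monotonically on x1, weakly on x2.
n = 300
x1 = rng.uniform(-1, 1, n)
x2 = rng.uniform(-1, 1, n)
y = x1 ** 2 + 0.1 * x2 + rng.normal(0, 0.02, n)

r2_smooth = {name: r_squared(y, loess_fit(xv, y))
             for name, xv in [("x1", x1), ("x2", x2)]}
r2_linear = {name: np.corrcoef(xv, y)[0, 1] ** 2
             for name, xv in [("x1", x1), ("x2", x2)]}

print("smoothing R^2:", r2_smooth)   # x1 dominates
print("linear R^2:   ", r2_linear)   # misses the quadratic effect of x1
```

    Because y is quadratic in x1, the linear correlation of x1 with y is near zero, while the smoother attributes most of the output variance to x1, which is the kind of nonlinear relationship the abstract says traditional regression-based procedures miss.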

  14. Female genital cutting and other intra-vaginal practices: implications for TwoDay Method use.

    PubMed

    Aksel, Sarp; Sinai, Irit; Yee, Kimberly Aumack

    2012-09-01

    This report examines the implications of female genital cutting and other intra-vaginal practices for offering the TwoDay Method® of family planning. This fertility awareness-based method relies on the identification of cervicovaginal secretions to identify the fertile window. Female genital cutting and traditional vaginal practices, such as the use of desiccants, may affect the presence or absence of secretions and therefore the woman’s perception of her fertility. These issues and their implications for service delivery of the method are discussed. PMID:23016158

  15. Simplified Analysis Methods for Primary Load Designs at Elevated Temperatures

    SciTech Connect

    Carter, Peter; Jetter, Robert I; Sham, Sam

    2011-01-01

    The use of simplified (reference stress) analysis methods is discussed and illustrated for primary load high temperature design. Elastic methods are the basis of the ASME Section III, Subsection NH primary load design procedure. There are practical drawbacks with this approach, particularly for complex geometries and temperature gradients. The paper describes an approach that addresses these difficulties through the use of temperature-dependent elastic-perfectly plastic analysis. Correction factors are defined to address difficulties traditionally associated with discontinuity stresses, inelastic strain concentrations and multiaxiality. A procedure is identified to provide insight into how this approach could be implemented, but additional work remains to define and clarify the procedural steps before it could be adopted into Code language.

  16. Methods for Evaluating Practice Change Toward a Patient-Centered Medical Home

    PubMed Central

    Jaén, Carlos Roberto; Crabtree, Benjamin F.; Palmer, Raymond F.; Ferrer, Robert L.; Nutting, Paul A.; Miller, William L.; Stewart, Elizabeth E.; Wood, Robert; Davila, Marivel; Stange, Kurt C.

    2010-01-01

    PURPOSE Understanding the transformation of primary care practices to patient-centered medical homes (PCMHs) requires making sense of the change process, multilevel outcomes, and context. We describe the methods used to evaluate the country’s first national demonstration project of the PCMH concept, with an emphasis on the quantitative measures and lessons for multimethod evaluation approaches. METHODS The National Demonstration Project (NDP) was a group-randomized clinical trial of facilitated and self-directed implementation strategies for the PCMH. An independent evaluation team developed an integrated package of quantitative and qualitative methods to evaluate the process and outcomes of the NDP for practices and patients. Data were collected by an ethnographic analyst and a research nurse who visited each practice, and from multiple data sources including a medical record audit, patient and staff surveys, direct observation, interviews, and text review. Analyses aimed to provide real-time feedback to the NDP implementation team and lessons that would be transferable to the larger practice, policy, education, and research communities. RESULTS Real-time analyses and feedback appeared to be helpful to the facilitators. Medical record audits provided data on process-of-care outcomes. Patient surveys contributed important information about patient-rated primary care attributes and patient-centered outcomes. Clinician and staff surveys provided important practice experience and organizational data. Ethnographic observations supplied insights about the process of practice development. Most practices were not able to provide detailed financial information. CONCLUSIONS A multimethod approach is challenging, but feasible and vital to understanding the process and outcome of a practice development process. Additional longitudinal follow-up of NDP practices and their patients is needed. PMID:20530398

  17. Best Practices in Teaching Statistics and Research Methods in the Behavioral Sciences [with CD-ROM

    ERIC Educational Resources Information Center

    Dunn, Dana S., Ed.; Smith, Randolph A., Ed.; Beins, Barney, Ed.

    2007-01-01

    This book provides a showcase for "best practices" in teaching statistics and research methods in two- and four-year colleges and universities. A helpful resource for teaching introductory, intermediate, and advanced statistics and/or methods, the book features coverage of: (1) ways to integrate these courses; (2) how to promote ethical conduct;…

  18. [Application of information technology in orthodontics. 3. Practical method for computer aided measurements of orthodontic models].

    PubMed

    Reinhardt, H; Haffner, T; Ifert, F; Malsch, J; Schneider, P

    1989-09-01

    A practical method for computer-aided measurement of orthodontic models is introduced. Measurement values are captured with an incremental encoder combined with a traditional sliding caliper. The advantages of this method lead to higher efficiency in model measurement. PMID:2636504

  19. A Survey of Functional Behavior Assessment Methods Used by Behavior Analysts in Practice

    ERIC Educational Resources Information Center

    Oliver, Anthony C.; Pratt, Leigh A.; Normand, Matthew P.

    2015-01-01

    To gather information about the functional behavior assessment (FBA) methods behavior analysts use in practice, we sent a web-based survey to 12,431 behavior analysts certified by the Behavior Analyst Certification Board. Ultimately, 724 surveys were returned, with the results suggesting that most respondents regularly use FBA methods, especially…

  20. Botulinum toxin type A treatment to the upper face: retrospective analysis of daily practice

    PubMed Central

    Prager, Welf; Huber-Vorländer, Jürgen; Taufig, A Ziah; Imhof, Matthias; Kühne, Ulrich; Weissberg, Ruth; Kuhr, Lars-Peter; Rippmann, Volker; Philipp-Dormston, Wolfgang G; Proebstle, Thomas M; Roth, Claudia; Kerscher, Martina; Ulmann, Claudius; Pavicic, Tatjana

    2012-01-01

    Background Botulinum toxin type A treatment has been used for over 20 years to enhance the appearance of the face. There are several commercially available botulinum toxin type A products used in aesthetic clinical practice. The aim of this retrospective analysis was to compare the clinical efficacy of the most commonly used botulinum toxin type A preparations in daily practice. Methods Physicians from 21 centers in Germany completed questionnaires based on an inspection of subject files for subjects 18 years of age or over who had received at least two, but not more than three, consecutive treatments with incobotulinumtoxinA, onabotulinumtoxinA, or abobotulinumtoxinA within a 12-month period in the previous 2 years. Data on subject and physician satisfaction, treatment intervals, dosages, and safety were collected from 1256 subjects. Results There were no statistically significant differences between incobotulinumtoxinA and onabotulinumtoxinA with respect to physician and subject satisfaction, dosages, and adverse effects experienced. Both botulinum toxin type A preparations were well tolerated and effective in the treatment of upper facial lines. Due to low treatment numbers, abobotulinumtoxinA was not included in the statistical analysis. Conclusion The results of this retrospective analysis confirm the results of prospective clinical trials by demonstrating that, in daily practice, incobotulinumtoxinA and onabotulinumtoxinA are used at a 1:1 dose ratio and display comparable efficacy and safety. PMID:22791996

  1. Understanding Online Teacher Best Practices: A Thematic Analysis to Improve Learning

    ERIC Educational Resources Information Center

    Corry, Michael; Ianacone, Robert; Stella, Julie

    2014-01-01

    The purpose of this study was to examine brick-and-mortar and online teacher best practice themes using thematic analysis and a newly developed theory-based analytic process entitled Synthesized Thematic Analysis Criteria (STAC). The STAC was developed to facilitate the meaningful thematic analysis of research based best practices of K-12…

  2. Power System Transient Stability Analysis through a Homotopy Analysis Method

    SciTech Connect

    Wang, Shaobu; Du, Pengwei; Zhou, Ning

    2014-04-01

    As an important function of energy management systems (EMSs), online contingency analysis plays an important role in providing power system security warnings of instability. At present, N-1 contingency analysis still relies on time-consuming numerical integration. To save computational cost, this paper proposes a quasi-analytical method to evaluate transient stability through the frequency sensitivities of time-domain periodic solutions with respect to initial values. First, dynamic systems described by classical models are modified into damping-free systems whose solutions are either periodic or expanded (non-convergent). Second, because the sensitivities change sharply when periodic solutions vanish and turn into expanded solutions, transient stability is assessed using these sensitivities. Third, homotopy analysis is introduced to extract frequency information and evaluate the sensitivities from initial values alone, so that time-consuming numerical integration is avoided. Finally, a simple case is presented to demonstrate the application of the proposed method, and simulation results show that the method is promising.
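    The periodic-versus-expanded classification of damping-free solutions can be illustrated with a classical single-machine energy test. This is not the paper's homotopy analysis; it is a textbook energy-function check, with illustrative parameter values, showing when an undamped swing-equation trajectory stays bounded (periodic) rather than expanding.

```python
import math

# Classical single-machine swing equation without damping:
#   M * d2(delta)/dt2 = Pm - Pmax * sin(delta)
# Parameter values are illustrative only.
M, Pm, Pmax = 0.1, 0.8, 2.0

delta_s = math.asin(Pm / Pmax)   # stable equilibrium
delta_u = math.pi - delta_s      # unstable equilibrium (saddle)

def potential(delta):
    """Potential energy of the undamped system."""
    return -Pm * delta - Pmax * math.cos(delta)

def is_periodic(delta0, omega0):
    """Energy test: the undamped solution stays bounded (periodic) iff its
    total energy is below the saddle energy at the unstable equilibrium."""
    energy = 0.5 * M * omega0 ** 2 + potential(delta0)
    return energy < potential(delta_u)

print(is_periodic(delta_s + 0.1, 0.0))   # small perturbation: bounded
print(is_periodic(delta_s + 0.1, 8.0))   # large kinetic energy: expanded
```

    Conservation of energy decides boundedness with no time stepping at all, which mirrors the abstract's motivation for avoiding numerical integration.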

  3. Graphical methods for the sensitivity analysis in discriminant analysis

    SciTech Connect

    Kim, Youngil; Anderson-Cook, Christine M.; Dae-Heung, Jang

    2015-09-30

    Similar to regression, many measures have been developed to detect influential data points in discriminant analysis. Many follow principles similar to the diagnostic measures used in linear regression, applied in the context of discriminant analysis. Here we focus on the impact on the predicted classification posterior probability when a data point is omitted. The new method is intuitive and easily interpretable compared to existing methods. We also propose a graphical display to show the individual movement of the posterior probabilities of the other data points when a specific data point is omitted. This enables the summaries to capture the overall pattern of the change.
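    The leave-one-out posterior-probability idea can be sketched with a crude one-feature, two-class discriminant model. The data, the pooled-variance posterior, and the influence summary below are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-class, one-feature training data (synthetic illustration).
x = np.concatenate([rng.normal(0.0, 1.0, 30), rng.normal(3.0, 1.0, 30)])
g = np.array([0] * 30 + [1] * 30)   # class labels

def posteriors(x_train, g_train, x_eval):
    """Discriminant posteriors from Gaussian class models with a crude
    common variance estimate."""
    post = np.empty((len(x_eval), 2))
    mus = [x_train[g_train == c].mean() for c in (0, 1)]
    var = x_train.var()
    pri = [np.mean(g_train == c) for c in (0, 1)]
    for c in (0, 1):
        post[:, c] = pri[c] * np.exp(-(x_eval - mus[c]) ** 2 / (2 * var))
    return post / post.sum(axis=1, keepdims=True)

base = posteriors(x, g, x)

# Influence of each point: max shift in any posterior when that point is omitted.
influence = np.empty(len(x))
for j in range(len(x)):
    keep = np.arange(len(x)) != j
    loo = posteriors(x[keep], g[keep], x)
    influence[j] = np.abs(loo - base).max()

# The most influential points are the ones whose omission moves the other
# posteriors most, i.e., the candidates for a closer look.
top = np.argsort(influence)[::-1][:3]
print("most influential indices:", top, "influence:", influence[top])
```

    Plotting each column of `loo - base` against the omitted index would give a rough version of the graphical display of individual posterior movements.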

  5. Advanced Analysis Methods in High Energy Physics

    SciTech Connect

    Pushpalatha C. Bhat

    2001-10-03

    During the coming decade, high energy physics experiments at the Fermilab Tevatron and around the globe will use very sophisticated equipment to record unprecedented amounts of data in the hope of making major discoveries that may unravel some of Nature's deepest mysteries. The discovery of the Higgs boson and signals of new physics may be around the corner. The use of advanced analysis techniques will be crucial in achieving these goals. The author discusses some of the novel methods of analysis that could prove to be particularly valuable for finding evidence of any new physics, for improving precision measurements and for exploring parameter spaces of theoretical models.

  6. Probabilistic Computational Methods in Structural Failure Analysis

    NASA Astrophysics Data System (ADS)

    Krejsa, Martin; Kralik, Juraj

    2015-12-01

    Probabilistic methods are used in engineering where a computational model contains random variables. Each random variable in the probabilistic calculations carries uncertainties; typical sources are material properties, production and/or assembly inaccuracies in the geometry, and the environment where the structure is to be located. The paper focuses on methods for calculating failure probabilities in structural failure and reliability analysis, with special attention to a newly developed probabilistic method, Direct Optimized Probabilistic Calculation (DOProC), which is highly efficient in terms of calculation time and solution accuracy. The novelty of the proposed method lies in an optimized numerical integration that does not require any simulation technique. The algorithm has been implemented in dedicated software applications and has been used several times in probabilistic tasks and probabilistic reliability assessments.
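
    The simulation-free integration idea behind such methods can be sketched as follows; the function name, the common grid, and the rectangle-rule quadrature are illustrative assumptions, not the published DOProC algorithm:

```python
import numpy as np

def p_failure_direct(resistance_pdf, load_pdf, grid):
    """Failure probability P(R < S) for independent resistance R and
    load S, by direct numerical integration on a common grid:
        P_f = integral of f_S(s) * F_R(s) ds.
    No simulation technique is involved, only quadrature.
    """
    dx = grid[1] - grid[0]
    cdf_R = np.cumsum(resistance_pdf) * dx   # numerical CDF of R
    return float(np.sum(load_pdf * cdf_R) * dx)
```

    For standard-normal load and unit-variance resistance centered at 2, the exact answer is Phi(-2/sqrt(2)), about 0.079, which the quadrature reproduces on a fine grid.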

  7. Degradation of learned skills: Effectiveness of practice methods on visual approach and landing skill retention

    NASA Technical Reports Server (NTRS)

    Sitterley, T. E.; Zaitzeff, L. P.; Berge, W. A.

    1972-01-01

    Flight control and procedural task skill degradation, and the effectiveness of retraining methods were evaluated for a simulated space vehicle approach and landing under instrument and visual flight conditions. Fifteen experienced pilots were trained and then tested after 4 months either without the benefits of practice or with static rehearsal, dynamic rehearsal or with dynamic warmup practice. Performance on both the flight control and procedure tasks degraded significantly after 4 months. The rehearsal methods effectively countered procedure task skill degradation, while dynamic rehearsal or a combination of static rehearsal and dynamic warmup practice was required for the flight control tasks. The quality of the retraining methods appeared to be primarily dependent on the efficiency of visual cue reinforcement.

  8. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    SciTech Connect

    Ronald Laurids Boring

    2010-11-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  9. Text analysis devices, articles of manufacture, and text analysis methods

    DOEpatents

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

    2013-05-28

    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.

  10. Spectroscopic Chemical Analysis Methods and Apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F.; Reid, Ray D.

    2012-01-01

    This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and the ideal wavelengths and sources needed for this analysis. It employs deep ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three achieved goals for this innovation are to reduce the size (under 20 L), the weight [under 100 lb (45 kg)], and the power consumption (under 100 W). The method can be used in a microscope or macroscope to measure Raman and/or native fluorescence emission spectra, either by point-by-point measurement or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations, the method produces incoherent radiation, while in other implementations it produces laser radiation. In some variations, this object is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. The instrument irradiates a sample with deep UV radiation, and then uses an improved filter for separating wavelengths to be detected, providing a multi-stage analysis of the sample. To avoid the difficulties related to producing deep UV semiconductor sources, a pumping approach has been developed that uses

  11. Situational Analysis: Centerless Systems and Human Service Practices

    ERIC Educational Resources Information Center

    Newbury, Janet

    2011-01-01

    Bronfenbrenner's ecological model is a conceptual framework that continues to contribute to human service practices. In the current article, the author describes the possibilities for practice made intelligible by drawing from this framework. She then explores White's "Web of Praxis" model as an important extension of this approach, and proceeds…

  12. Developing Mentors: An Analysis of Shared Mentoring Practices

    ERIC Educational Resources Information Center

    Bower-Phipps, Laura; Klecka, Cari Van Senus; Sature, Amanda L.

    2016-01-01

    Understanding how experienced teachers share and articulate effective mentoring practices can guide efforts to prepare quality mentors. This qualitative study focused on mentoring practices within a teacher-designed student-teaching program conceptualized while the mentor teachers within the program were students in a graduate-level mentoring…

  13. Researching "Practiced Language Policies": Insights from Conversation Analysis

    ERIC Educational Resources Information Center

    Bonacina-Pugh, Florence

    2012-01-01

    In language policy research, "policy" has traditionally been conceptualised as a notion separate from that of "practice". In fact, language practices were usually analysed with a view to evaluating whether a policy is being implemented or resisted. Recently, however, Spolsky in ("Language policy". Cambridge University Press, Cambridge, 2004;…

  14. Comparison of analysis methods for airway quantification

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.

    2012-03-01

    Diseased airways have been recognized for several years as a possible contributing factor to airflow limitation in Chronic Obstructive Pulmonary Disease (COPD). Quantification of disease severity through the evaluation of airway dimensions - wall thickness and lumen diameter - has gained increased attention, thanks to the availability of multi-slice computed tomography (CT). Novel approaches have focused on automated methods of measurement as a faster and more objective means than the visual assessment routinely employed in the clinic. Since the Full-Width Half-Maximum (FWHM) method of airway measurement was introduced two decades ago [1], several new techniques for quantifying airways have been detailed in the literature, but no approach has truly become a standard for such analysis. Our own research group has presented two alternative approaches for determining airway dimensions, one involving a minimum path and the other active contours [2, 3]. With an increasing number of techniques dedicated to the same goal, we decided to take a step back and analyze the differences among these methods. We therefore put our two methods of analysis and the FWHM approach to the test. We first measured a set of 5 airways from a phantom of known dimensions. Then we compared measurements from the three methods to those of two independent readers, performed on 35 airways in 5 patients. We elaborate on the differences of each approach and suggest conclusions on which could be defined as the best one.
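
    The FWHM idea on a one-dimensional intensity profile can be sketched as below; the sampled profile and the linear interpolation of the half-maximum crossings are illustrative assumptions, not the airway-specific procedure of [1]:

```python
import numpy as np

def fwhm(profile):
    """Full-Width Half-Maximum of a 1-D intensity profile.

    Walks outward from the peak and linearly interpolates the two
    half-maximum crossings; the returned width is in sample units.
    """
    profile = np.asarray(profile, dtype=float)
    peak = int(np.argmax(profile))
    half = profile[peak] / 2.0

    # left crossing: first sample below half-maximum, then interpolate
    i = peak
    while i > 0 and profile[i - 1] >= half:
        i -= 1
    left = i - (profile[i] - half) / (profile[i] - profile[i - 1]) if i > 0 else 0.0

    # right crossing, symmetrically
    j = peak
    while j < len(profile) - 1 and profile[j + 1] >= half:
        j += 1
    if j < len(profile) - 1:
        right = j + (profile[j] - half) / (profile[j] - profile[j + 1])
    else:
        right = float(len(profile) - 1)

    return right - left
```

    On a triangular profile such as [0, 1, 2, 3, 2, 1, 0] the two interpolated crossings sit at 1.5 and 4.5 samples, giving a width of 3.0.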

  15. Integrated Data Collection Analysis (IDCA) Program - SSST Testing Methods

    SciTech Connect

    Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.; Pollard, Colin J.; Warner, Kirstin F.; Remmers, Daniel L.; Sorensen, Daniel N.; Whinnery, LeRoy L.; Phillips, Jason J.; Shelley, Timothy J.; Reyes, Jose A.; Hsu, Peter C.; Reynolds, John G.

    2013-03-25

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the methods used for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis during the IDCA program. These methods changed throughout the Proficiency Test, and the reasons for these changes are documented in this report. The most significant modifications to standard testing methods are: 1) including one specified sandpaper in impact testing among all the participants, 2) diversifying liquid test methods for selected participants, and 3) including sealed sample holders for thermal testing by at least one participant. This effort, funded by the Department of Homeland Security (DHS), is putting the issues of safe handling of these materials in perspective with standard military explosives. The study is adding SSST testing results for a broad suite of different HMEs to the literature. Ultimately the study will suggest new guidelines and methods and possibly establish the SSST testing accuracies needed to develop safe handling practices for HMEs. Each participating testing laboratory uses identical test materials and preparation methods wherever possible. The testing performers involved are Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Indian Head Division, Naval Surface Warfare Center (NSWC IHD), Sandia National Laboratories (SNL), and Air Force Research Laboratory (AFRL/RXQL). These tests are conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to compare results when these testing variables cannot be made consistent.

  16. Measurement methods for human exposure analysis.

    PubMed Central

    Lioy, P J

    1995-01-01

    The general methods used to complete measurements of human exposures are identified and illustrations are provided for the cases of indirect and direct methods used for exposure analysis. The application of the techniques for external measurements of exposure, microenvironmental and personal monitors, are placed in the context of the need to test hypotheses concerning the biological effects of concern. The linkage of external measurements to measurements made in biological fluids is explored for a suite of contaminants. This information is placed in the context of the scientific framework used to conduct exposure assessment. Examples are taken from research on volatile organics and for a large scale problem: hazardous waste sites. PMID:7635110

  17. Forum discussion on probabilistic structural analysis methods

    SciTech Connect

    Rodriguez, E.A.; Girrens, S.P.

    2000-10-01

    The use of Probabilistic Structural Analysis Methods (PSAM) has received much attention over the past several decades, due in part to enhanced reliability theories, computational capabilities, and efficient algorithms. The need for this development was already present and waiting at the doorstep. Automotive design and manufacturing have been greatly enhanced because of PSAM and reliability methods, including reliability-based optimization. This demand was also present in the US Department of Energy (DOE) weapons laboratories in support of the overarching national security responsibility of maintaining the nation's nuclear stockpile in a safe and reliable state.

  18. Best practices: applying management analysis of excellence to immunization.

    PubMed

    Wishner, Amy; Aronson, Jerold; Kohrt, Alan; Norton, Gary

    2005-01-01

    The authors applied business management tools to analyze and promote excellence and to evaluate differences between average and above-average immunization performers in private practices. The authors conducted a pilot study of 10 private practices in Pennsylvania, using tools common in management to assess the practices' organizational climate and managerial style. Authoritative and coaching styles of physician leaders were common to both groups. Managerial styles that emphasized higher levels of clarity and responsibility were evident in the large practices, while rewards and flexibility styles were higher in the small above-average practices. The findings of this pilot study match results seen in high performers in other industries. The study concludes that the authoritative style appears to have the most impact on performance, which has interesting implications for training/behavior change to improve immunization rates, along with traditional medical interventions. PMID:15921143

  19. Application of Stacking Technique in ANA: Method and Practice with PKU Seismological Array

    NASA Astrophysics Data System (ADS)

    Liu, J.; Tang, Y.; Ning, J.; Chen, Y. J.

    2010-12-01

    Cross-correlation of ambient noise records is now routinely used to obtain dispersion curves and perform seismic tomography; however, little attention has been paid to array techniques. We present a spatial-stacking method to obtain high-resolution dispersion curves and demonstrate it with observational data from the PKU seismological array. Empirical Green's functions are generally obtained by correlation between two stations, and the dispersion curves are then obtained from frequency-time analysis (FTAN). The popular way to obtain high-resolution dispersion curves is to use long time records. At the same time, to extract a usable signal, the distance between the two stations must be at least three times the longest wavelength, so both long records and appropriately spaced stations are needed. We use a new method, spatial stacking, which allows a shorter observation period and utilizes observations from a group of closely distributed stations to obtain refined dispersion curves. We correlate the observations of every station in the group with those of a far station, and then stack the results. However, we cannot simply stack them unless the stations in the group lie on a circle centered on the far station, owing to the dispersion characteristics of Rayleigh waves. We therefore apply an anti-dispersion correction to the observations of every station in the array before stacking. We tested the method using theoretical seismic surface-wave records, generated with qseis06 (written by Rongjiang Wang), both with and without noise. For three synthetic stations (spaced 1 degree apart) with the same underground structure and no noise, the center station yielded the same dispersion curve with and without spatial stacking. When noise was added to the theoretical records, the center station's dispersion curves obtained by our method were much closer to the noise-free dispersion curve than the contaminated ones. We can see that our method has improved the resolution of
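
    The correlate-then-stack step can be sketched as below; the function name and signature are illustrative, and the anti-dispersion correction the authors apply before stacking is omitted:

```python
import numpy as np

def stacked_cross_correlation(far_trace, group_traces):
    """Cross-correlate a far-station record with each record of a
    closely spaced station group, then average the correlations.

    Stacking suppresses incoherent noise in the empirical Green's
    function estimate; each input is a 1-D time series of equal length.
    """
    n = len(far_trace)
    stack = np.zeros(2 * n - 1)
    for trace in group_traces:
        stack += np.correlate(trace, far_trace, mode="full")
    return stack / len(group_traces)
```

    With identical traces the stack equals a single cross-correlation; with noisy traces the coherent lag peak survives while the noise averages down.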

  20. Method of analysis and quality-assurance practices for determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry at the U.S. Geological Survey California District Organic Chemistry Laboratory, 1996-99

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Baker, Lucian M.; Kuivila, Kathryn M.

    2000-01-01

    A method of analysis and quality-assurance practices were developed by the U.S. Geological Survey to study the fate and transport of pesticides in the San Francisco Bay-Estuary. Water samples were filtered to remove suspended-particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide, and the pesticides were eluted with three cartridge volumes of hexane:diethyl ether (1:1) solution. The eluates were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for pesticides ranged from 0.002 to 0.025 microgram per liter for 1-liter samples. Recoveries ranged from 44 to 140 percent for 25 pesticides in samples of organic-free reagent water and Sacramento-San Joaquin Delta and Suisun Bay water fortified at 0.05 and 0.50 microgram per liter. The estimated holding time for pesticides after extraction on C-8 solid-phase extraction cartridges ranged from 10 to 257 days.

  1. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution.

    PubMed

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R

    2015-10-01

    This work explains the challenges of the fingerprint-based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and illustrates a practical and robust solution. The PAH data detected in sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns, but it is not able to suggest specific sources. Not all sources determined by positive matrix factorization (PMF) are distinguishable, due to the variability of source fingerprints. However, they constitute useful suggestions for inputs to a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). PMID:26208321
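
    The PCA step of such a workflow can be sketched as below; the function name and the samples-by-compounds layout are illustrative assumptions, and grouping compounds as in the study would inspect the loadings rather than the sample scores:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """PCA of a samples-by-compounds concentration matrix via SVD
    of the standardized data.

    Returns (scores, explained_variance_ratio); the rows of Vt are
    the component loadings over compounds.
    """
    X = np.asarray(X, dtype=float)
    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each compound
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    var = s ** 2 / np.sum(s ** 2)
    return Z @ Vt[:n_components].T, var[:n_components]
```

    For strongly correlated compound concentrations the first component captures nearly all the variance, mirroring why PCA groups compounds but cannot by itself name sources.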

  2. Characterization of polarization-independent phase modulation method for practical plug and play quantum cryptography

    NASA Astrophysics Data System (ADS)

    Kwon, Osung; Lee, Min-Soo; Woo, Min Ki; Park, Byung Kwon; Kim, Il Young; Kim, Yong-Su; Han, Sang-Wook; Moon, Sung

    2015-12-01

    We characterized a polarization-independent phase modulation method, called double phase modulation, for a practical plug and play quantum key distribution (QKD) system. After investigating the theoretical background, we applied the method to a practical QKD system and characterized its performance by comparing single phase modulation (SPM) with double phase modulation. We obtained repeatable and accurate phase modulation, confirmed by high-visibility single-photon interference even for input signals with arbitrary polarization. Further, the results show that only 80% of the bias voltage required for SPM is needed to obtain the target amount of phase modulation.

  3. Probabilistic structural analysis methods and applications

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.

    1988-01-01

    An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.
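
    A minimal sketch of the underlying question (the probability that a limit state is violated under random inputs); plain Monte Carlo is used here for illustration, whereas the cited work uses the more efficient fast probability integration, and all names are assumptions:

```python
import numpy as np

def failure_probability(limit_state, sample_inputs, n_samples=100_000, seed=0):
    """Crude Monte Carlo estimate of a structural failure probability.

    limit_state(x) < 0 denotes failure; sample_inputs(rng, n) draws
    random realizations of loads, geometry, or material properties.
    """
    rng = np.random.default_rng(seed)
    x = sample_inputs(rng, n_samples)
    g = limit_state(x)
    return float(np.mean(g < 0))
```

    For a standard-normal demand exceeding zero capacity, the estimate converges to 0.5, the known answer.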

  4. The use of the SPSA method in ECG analysis.

    PubMed

    Gerencsér, László; Kozmann, György; Vágó, Zsuzsanna; Haraszti, Kristóf

    2002-10-01

    The classification, monitoring, and compression of electrocardiogram (ECG) signals recorded from a single patient over a relatively long period of time is considered. The particular application we have in mind is high-resolution ECG analysis, such as late potential analysis, morphology changes in QRS during arrhythmias, T-wave alternans, or the study of drug effects on ventricular activation. We propose to apply a modification of a classical method of cluster analysis or vector quantization. The novelty of our approach is that we use a new distortion measure to quantify the distance between two ECG cycles, and the class-distortion measure is defined using a min-max criterion. The new class-distortion measure is much more sensitive to outliers than the usual average-distance distortion measures. The price of this practical advantage is that computational complexity is significantly increased. The resulting nonsmooth optimization problem is solved by an adapted version of the simultaneous perturbation stochastic approximation (SPSA) method. The main idea is to generate a smooth approximation by a randomization procedure. The viability of the method is demonstrated on both simulated and real data. An experimental comparison with the widely used correlation method is given on real data. PMID:12374333
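
    The basic SPSA recursion can be sketched as follows; the gain constants and the smooth test function are illustrative assumptions, not the adapted, nonsmooth ECG distortion measure of the paper:

```python
import numpy as np

def spsa_minimize(loss, theta0, n_iter=500, a=0.1, c=0.1,
                  alpha=0.602, gamma=0.101, seed=0):
    """Simultaneous Perturbation Stochastic Approximation.

    Each step approximates the full gradient from only two loss
    evaluations, perturbing all coordinates at once with random
    +/-1 signs; gains follow the standard a_k = a/(k+1)^alpha and
    c_k = c/(k+1)^gamma schedules.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float).copy()
    for k in range(n_iter):
        ak = a / (k + 1) ** alpha
        ck = c / (k + 1) ** gamma
        delta = rng.choice([-1.0, 1.0], size=theta.shape)
        ghat = (loss(theta + ck * delta) - loss(theta - ck * delta)) / (2 * ck * delta)
        theta -= ak * ghat
    return theta
```

    On a simple quadratic the iterates settle near the minimizer despite the noisy two-evaluation gradient estimate.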

  5. Finite Volume Methods: Foundation and Analysis

    NASA Technical Reports Server (NTRS)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semiconductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the satisfaction of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
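
    The simplest instance of a monotone-flux finite volume scheme, first-order upwinding for linear advection on a periodic grid, can be sketched as below (the function name and grid setup are illustrative):

```python
import numpy as np

def advect_upwind(u0, speed, dx, dt, n_steps):
    """First-order finite volume scheme for u_t + a*u_x = 0, a > 0,
    on a periodic grid.

    The upwind numerical flux F_{i+1/2} = a*u_i is monotone, so the
    scheme satisfies a discrete maximum principle whenever the CFL
    condition a*dt/dx <= 1 holds.
    """
    u = np.asarray(u0, dtype=float).copy()
    nu = speed * dt / dx                      # CFL number
    assert 0 < nu <= 1, "CFL condition violated"
    for _ in range(n_steps):
        u = u - nu * (u - np.roll(u, 1))      # flux difference per cell
    return u
```

    At CFL number exactly 1 the scheme shifts the cell averages by one cell per step, and for any admissible CFL number no new maxima or minima are created.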

  6. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)

    2013-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  7. Portraits of Practice: A Cross-Case Analysis of Two First-Grade Teachers and Their Grouping Practices

    ERIC Educational Resources Information Center

    Maloch, Beth; Worthy, Jo; Hampton, Angela; Jordan, Michelle; Hungerford-Kresser, Holly; Semingson, Peggy

    2013-01-01

    This interpretive study provides a cross-case analysis of the literacy instruction of two first-grade teachers, with a particular focus on their grouping practices. One key finding was the way in which these teachers drew upon a district-advocated approach for instruction--an approach to guided reading articulated by Fountas and Pinnell (1996) in…

  8. Missed appointments in general practice: retrospective data analysis from four practices.

    PubMed

    Neal, R D; Lawlor, D A; Allgar, V; Colledge, M; Ali, S; Hassey, A; Portz, C; Wilson, A

    2001-10-01

    Little is known about which patients miss appointments or why they do so. Using routinely collected data from four practices, we aimed to determine whether patients who missed appointments differed in terms of their age, sex, and deprivation scores from those who did not, and to examine differences between the practices with respect to missed appointments. The likelihood of someone missing at least one appointment was independently associated with being female, living in a deprived area, and being a young adult. Living in a deprived area was associated with a threefold increase in the likelihood of missing an appointment, and the extent of this association was the same across all four practices. Interventions aimed at reducing missed appointments need to be based upon these findings. PMID:11677708

  9. BAROS METHOD CRITICAL ANALYSIS (BARIATRIC ANALYSIS AND REPORTING SYSTEM)

    PubMed Central

    NICARETA, Jean Ricardo; de FREITAS, Alexandre Coutinho Teixeira; NICARETA, Sheyla Maris; NICARETA, Cleiton; CAMPOS, Antonio Carlos Ligocki; NASSIF, Paulo Afonso Nunes; MARCHESINI, João Batista

    2015-01-01

    Introduction: Although it has received several criticisms, BAROS is considered the most effective method for global assessment of the surgical treatment of morbid obesity, yet it still needs to be updated. Objective: Critical analysis of the BAROS constitution and method. Method: BAROS was searched as a heading in a literature review using data from the main bariatric surgery journals until 2009. Results: 121 papers containing criticisms of the BAROS constitution and methodology were found and assessed. The instrument has some failures and few studies report results on its use, although it is still considered a standard method. Several authors who used it found imperfections in its methodology and suggested changes aimed at improving its acceptance, showing the need to develop new methods to qualify bariatric surgery results. Conclusion: The BAROS constitution has failures and its methodology needs to be updated. PMID:26537280

  10. Translating Evidence Into Practice via Social Media: A Mixed-Methods Study

    PubMed Central

    Tunnecliff, Jacqueline; Morgan, Prue; Gaida, Jamie E; Clearihan, Lyn; Sadasivan, Sivalal; Davies, David; Ganesh, Shankar; Mohanty, Patitapaban; Weiner, John; Reynolds, John; Ilic, Dragan

    2015-01-01

    Background Approximately 80% of research evidence relevant to clinical practice never reaches the clinicians delivering patient care. A key barrier for the translation of evidence into practice is the limited time and skills clinicians have to find and appraise emerging evidence. Social media may provide a bridge between health researchers and health service providers. Objective The aim of this study was to determine the efficacy of social media as an educational medium to effectively translate emerging research evidence into clinical practice. Methods The study used a mixed-methods approach. Evidence-based practice points were delivered via social media platforms. The primary outcomes of attitude, knowledge, and behavior change were assessed using a preintervention/postintervention evaluation, with qualitative data gathered to contextualize the findings. Results Data were obtained from 317 clinicians from multiple health disciplines, predominantly from the United Kingdom, Australia, the United States, India, and Malaysia. The participants reported an overall improvement in attitudes toward social media for professional development (P<.001). The knowledge evaluation demonstrated a significant increase in knowledge after the training (P<.001). The majority of respondents (136/194, 70.1%) indicated that the education they had received via social media had changed the way they practice, or intended to practice. Similarly, a large proportion of respondents (135/193, 69.9%) indicated that the education they had received via social media had increased their use of research evidence within their clinical practice. Conclusions Social media may be an effective educational medium for improving knowledge of health professionals, fostering their use of research evidence, and changing their clinical behaviors by translating new research evidence into clinical practice. PMID:26503129

  11. New approaches to fertility awareness-based methods: incorporating the Standard Days and TwoDay Methods into practice.

    PubMed

    Germano, Elaine; Jennings, Victoria

    2006-01-01

    Helping clients select and use appropriate family planning methods is a basic component of midwifery care. Many women prefer nonhormonal, nondevice methods, and may be interested in methods that involve understanding their natural fertility. Two new fertility awareness-based methods, the Standard Days Method and the TwoDay Method, meet the need for effective, easy-to-provide, easy-to-use approaches. The Standard Days Method is appropriate for women with most menstrual cycles between 26 and 32 days long. Women using this method are taught to avoid unprotected intercourse on potentially fertile days 8 through 19 of their cycles to prevent pregnancy. They use CycleBeads, a color-coded string of beads representing the menstrual cycle, to monitor their cycle days and cycle lengths. The Standard Days Method is more than 95% effective with correct use. The TwoDay Method is based on the presence or absence of cervical secretions to identify fertile days. To use this method, women are taught to note every day whether they have secretions. If they have secretions on the current day or had them on the previous day, they consider themselves fertile. The TwoDay Method is 96% effective with correct use. Both methods fit well into midwifery practice. PMID:17081938
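The day-window and cycle-length rules above are simple enough to express directly. A minimal sketch; the function names are illustrative, and the 80% threshold used to operationalize "most cycles" is an assumption, not a value from the source:

```python
def cycle_qualifies(cycle_lengths):
    """Candidate check: most recent cycles fall within 26-32 days.

    The 80% cutoff for "most" is an illustrative assumption.
    """
    in_range = sum(1 for n in cycle_lengths if 26 <= n <= 32)
    return in_range >= 0.8 * len(cycle_lengths)

def is_fertile_day(cycle_day):
    """Days 8 through 19 of the cycle are treated as potentially fertile."""
    return 8 <= cycle_day <= 19
```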

  12. An Analysis of State Autism Educational Assessment Practices and Requirements.

    PubMed

    Barton, Erin E; Harris, Bryn; Leech, Nancy; Stiff, Lillian; Choi, Gounah; Joel, Tiffany

    2016-03-01

    States differ in the procedures and criteria used to identify ASD. These differences are likely to impact the prevalence and age of identification for children with ASD. The purpose of the current study was to examine the specific state variations in ASD identification and eligibility criteria requirements. We examined variations by state in autism assessment practices and the proportion of children eligible for special education services under the autism category. Overall, our findings suggest that ASD identification practices vary across states, but most states use federal guidelines, at least in part, to set their requirements. Implications and recommendations for policy and practice are discussed. PMID:26363913

  13. Putting social impact assessment to the test as a method for implementing responsible tourism practice

    SciTech Connect

    McCombes, Lucy; Vanclay, Frank; Evers, Yvette

    2015-11-15

    The discourse on the social impacts of tourism needs to shift from the current descriptive critique of tourism to considering what can be done in actual practice to embed the management of tourism's social impacts into the existing planning, product development and operational processes of tourism businesses. A pragmatic approach for designing research methodologies, social management systems and initial actions, shaped by the real-world operational constraints and existing systems used in the tourism industry, is needed. Our pilot study with a small Bulgarian travel company put social impact assessment (SIA) to the test to see if it could provide this desired approach and assist in implementing responsible tourism development practice, especially in small tourism businesses. Our findings showed that our adapted SIA method has value as a practical method for embedding a responsible tourism approach. While there were some challenges, SIA proved to be effective in assisting the staff of our test case tourism business to better understand their social impacts on their local communities and to identify actions to take. - Highlights: • A pragmatic approach is needed for the responsible management of the social impacts of tourism. • Our adapted Social Impact Assessment (SIA) method has value as a practical method. • SIA can be embedded into tourism businesses' existing ‘ways of doing things’. • We identified challenges and ways to improve our method to better suit the small tourism business context.

  14. Comparison between Two Practical Methods of Light Source Monitoring in Quantum Key Distribution

    NASA Astrophysics Data System (ADS)

    Wang, Gan; Chen, Ziyang; Xu, Bingjie; Li, Zhengyu; Peng, Xiang; Guo, Hong

    2016-05-01

    The practical security of a quantum key distribution (QKD) is a critical issue due to the loopholes opened by the imperfections of practical devices. The untrusted source problem is a fundamental issue that exists in almost every protocol, including the loss-tolerant protocol and the measurement-device-independent protocol. Two practical light source monitoring methods have been proposed: the two-threshold detector scheme and the photon-number-resolving (PNR) detector scheme. In this work, we test the fluctuation level of different gain-switched pulsed lasers, i.e., the ratio between the standard deviation and the mean of the pulse energy (denoted γ), which changes from 1% to 7%. Moreover, we propose an improved practical PNR detector scheme, and discuss which light source monitoring method should be used in which circumstances; generally speaking, when the fluctuation is large, the PNR detector method performs better. This provides guidance for selecting the proper monitoring module for different practical systems. This work is supported by the National Science Fund for Distinguished Young Scholars of China (Grant No. 61225003), the State Key Project of National Natural Science Foundation of China (Grant No. 61531003).
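The fluctuation measure used above, γ = standard deviation / mean of the pulse energy, is straightforward to compute from a record of pulse energies. A minimal sketch; the 4% decision threshold below is a hypothetical illustration of the selection logic, not a value from the paper:

```python
import statistics

def fluctuation_ratio(pulse_energies):
    """gamma = (standard deviation of pulse energy) / (mean pulse energy)."""
    return statistics.pstdev(pulse_energies) / statistics.mean(pulse_energies)

def choose_monitor(gamma, threshold=0.04):
    # Illustrative rule only: the paper reports gamma between 1% and 7%
    # and finds the PNR scheme preferable when fluctuations are large;
    # the 4% cutoff here is an assumed example value.
    return "PNR detector" if gamma > threshold else "two-threshold detector"
```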

  15. Effectiveness of a Motivation and Practical Skills Development Methods on the Oral Hygiene of Orphans Children in Kaunas, Lithuania

    PubMed Central

    Narbutaite, Julija

    2015-01-01

    ABSTRACT Objectives: The aim of this study was to evaluate the effect of motivation and practical skills development methods on the oral hygiene of orphans. Material and Methods: Sixty-eight orphans aged between 7 and 17 years from two orphanages in Kaunas were divided into two groups: a practical application group and a motivation group. Children were clinically examined by determining their oral hygiene status using the Silness-Löe plaque index. A questionnaire was used to estimate oral hygiene knowledge and practices at baseline and after 3 months. Statistical analysis included: Chi-square test (χ2), Fisher's exact test, Student's t-test, nonparametric Mann-Whitney test, Spearman's rho correlation coefficient and Kappa coefficient. Results: All children had plaque on at least one tooth in both groups: motivation 1.14 (SD 0.51), practical application 1.08 (SD 0.4) (P = 0.58). Girls in both groups showed significantly better oral hygiene than boys (P < 0.001). After the 3-month educational program, oral hygiene status improved significantly in both groups: 0.4 (SD 0.35) (P < 0.001). Significantly better oral hygiene was found in the practical application group, 0.19 (SD 0.27), in comparison with the motivation group, 0.55 (SD 0.32) (P < 0.001). Comparison of the first and second questionnaire surveys showed a statistically significant decline in soft drink use in both groups (P = 0.004). Conclusions: Educational programs are effective in improving oral hygiene, especially when they are based on practical skills training. PMID:26539284

  16. Optical methods for the analysis of dermatopharmacokinetics

    NASA Astrophysics Data System (ADS)

    Lademann, Juergen; Weigmann, Hans-Juergen; von Pelchrzim, R.; Sterry, Wolfram

    2002-07-01

    The method of tape stripping in combination with spectroscopic measurements is a simple and noninvasive method for the analysis of dermatopharmacokinetics of cosmetic products and topically applied drugs. The absorbance at 430 nm was used for the characterization of the amount of corneocytes on the tape strips. It was compared to the increase in weight of the tapes after removing them from the skin surface. The penetration profiles of two UV filter substances used in sunscreens were determined. The combined method of tape stripping and spectroscopic measurements can also be used for the investigation of the dermatopharmacokinetics of topically applied drugs passing through the skin. Differences in the penetration profiles of the steroid compound clobetasol, applied in the same concentration in different formulations on the skin, are presented.

  17. Data Analysis Methods for Library Marketing

    NASA Astrophysics Data System (ADS)

    Minami, Toshiro; Kim, Eunja

    Our society is rapidly changing into an information society, in which people's needs and requests regarding information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information. To fulfill this role, libraries have to know the profiles of their patrons. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. We then demonstrate its usefulness through some examples of analysis methods applied to the circulation records of Kyushu University and Guacheon Library, and some implications obtained from the results of these methods. Our research is a first step toward a future in which library marketing is an indispensable tool.

  18. Review of Computational Stirling Analysis Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal to electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in its current designs could be better understood. However, they are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of Stirling loss understanding may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra Hi-Fi technique, is presented in detail.

  19. Impact of pedagogical method on Brazilian dental students' waste management practice.

    PubMed

    Victorelli, Gabriela; Flório, Flávia Martão; Ramacciato, Juliana Cama; Motta, Rogério Heládio Lopes; de Souza Fonseca Silva, Almenara

    2014-11-01

    The purpose of this study was to conduct a qualitative analysis of waste management practices among a group of Brazilian dental students (n=64) before and after implementing two different pedagogical methods: 1) the students attended a two-hour lecture based on World Health Organization standards; and 2) the students applied the lessons learned in an organized group setting aimed toward raising their awareness about socioenvironmental issues related to waste. All eligible students participated, and the students' learning was evaluated through their answers to a series of essay questions, which were quantitatively measured. Afterwards, the impact of the pedagogical approaches was compared by means of qualitative categorization of wastes generated in clinical activities. Waste categorization was performed for a period of eight consecutive days, both before and thirty days after the pedagogical strategies. In the written evaluation, 80 to 90 percent of the students' answers were correct. The qualitative assessment revealed a high frequency of incorrect waste disposal with a significant increase of incorrect disposal inside general and infectious waste containers (p<0.05). Although the students' theoretical learning improved, it was not enough to change behaviors established by cultural values or to encourage the students to adequately segregate and package waste material. PMID:25362694

  20. A numerical method for the stress analysis of stiffened-shell structures under nonuniform temperature distributions

    NASA Technical Reports Server (NTRS)

    Heldenfels, Richard R

    1951-01-01

    A numerical method is presented for the stress analysis of stiffened-shell structures of arbitrary cross section under nonuniform temperature distributions. The method is based on a previously published procedure that is extended to include temperature effects and multicell construction. The application of the method to practical problems is discussed and an illustrative analysis is presented of a two-cell box beam under the combined action of vertical loads and a nonuniform temperature distribution.

  1. Internet Practices of Certified Rehabilitation Counselors and Analysis of Guidelines for Ethical Internet Practices

    ERIC Educational Resources Information Center

    Lehmann, Ilana S.; Crimando, William

    2011-01-01

    The Internet has become an integral part of the practice of rehabilitation counseling. To identify potential ethical issues regarding the use of the Internet by counselors, two studies were conducted. In Study 1, we surveyed a national sample of rehabilitation counselors regarding their use of technology in their work and home settings. Results…

  2. Practical post-calibration uncertainty analysis: Yucca Mountain, Nevada, USA

    NASA Astrophysics Data System (ADS)

    James, S. C.; Doherty, J.; Eddebbarh, A.

    2009-12-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision thus typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is “calibrated.” Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the US’s proposed site for disposal of high-level radioactive waste. Both types of uncertainty analysis are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields the contributions made by each parameter to a prediction’s uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This paper applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. Furthermore, Monte Carlo simulations confirm that hydrogeologic units thought to be flow barriers have probability distributions skewed toward lower permeabilities.
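The linear analysis described above propagates parameter uncertainty into a prediction to first order. A minimal sketch, assuming a sensitivity vector y (derivatives of the prediction with respect to each parameter) and a parameter covariance matrix C; the per-parameter "contribution" shown is the diagonal term only, a simplification that ignores parameter correlations:

```python
import numpy as np

def prediction_variance(y, C):
    """First-order prediction variance: sigma^2 = y^T C y."""
    y = np.asarray(y, dtype=float)
    return float(y @ C @ y)

def diagonal_contributions(y, C):
    """Per-parameter contribution y_i^2 * C_ii (correlations ignored)."""
    y = np.asarray(y, dtype=float)
    return y**2 * np.diag(C)
```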

  3. Analysis of structural perturbations in systems via cost decomposition methods

    NASA Technical Reports Server (NTRS)

    Skelton, R. E.

    1983-01-01

    It has long been common practice to analyze linear dynamic systems by decomposing the total response in terms of individual contributions which are easier to analyze. Examples of this philosophy include the expansion of transfer functions using: (1) the superposition principle, (2) residue theory and partial fraction expansions, (3) Markov parameters, Hankel matrices, and (4) regular and singular perturbations. This paper summarizes a new and different kind of expansion designed to decompose the norm of the response vector rather than the response vector itself. This is referred to as "cost decomposition" of the system. The notable advantages of this type of decomposition are: (a) easy application to multi-input, multi-output systems, (b) natural compatibility with Linear Quadratic Gaussian Theory, (c) applicability to the analysis of more general types of structural perturbations involving inputs, outputs, states, parameters. Property (c) makes the method suitable for problems in model reduction, measurement/actuator selections, and sensitivity analysis.
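The idea of decomposing the norm of the response rather than the response itself can be illustrated for a linear stochastic system x' = Ax + Bw, z = Cx with unit-intensity white noise w: the steady-state output cost tr(C X Cᵀ), where A X + X Aᵀ + B Bᵀ = 0, splits into per-input contributions by solving one Lyapunov equation per column of B. This is a sketch of the general idea only, not the paper's exact formulation:

```python
import numpy as np

def lyap(A, Q):
    """Solve A X + X A^T + Q = 0 via the Kronecker-product form."""
    n = A.shape[0]
    K = np.kron(np.eye(n), A) + np.kron(A, np.eye(n))
    return np.linalg.solve(K, -Q.flatten(order="F")).reshape((n, n), order="F")

def cost_contributions(A, B, C):
    """Steady-state output-cost contribution of each input channel of B."""
    contribs = []
    for i in range(B.shape[1]):
        bi = B[:, i:i + 1]                  # one noise input at a time
        Xi = lyap(A, bi @ bi.T)             # that input's state covariance
        contribs.append(float(np.trace(C @ Xi @ C.T)))
    return contribs
```

By linearity of the Lyapunov equation in Q, these contributions sum exactly to the total cost, which is what makes the decomposition well defined.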

  4. Practical Implementation of New Particle Tracking Method to the Real Field of Groundwater Flow and Transport

    PubMed Central

    Suk, Heejun

    2012-01-01

    In articles published in 2009 and 2010, Suk and Yeh reported the development of an accurate and efficient particle tracking algorithm for simulating a path line under complicated unsteady flow conditions, using a range of elements within finite elements in multidimensions. Here two examples, an aquifer storage and recovery (ASR) example and a landfill leachate migration example, are examined to enhance the practical implementation of the proposed particle tracking method, known as Suk's method, in a real field of groundwater flow and transport. Results obtained by Suk's method are compared with those obtained by Pollock's method. Suk's method produces superior tracking accuracy, which suggests that Suk's method can describe various advection-dominated transport problems in a real field more accurately than existing popular particle tracking methods, such as Pollock's method. To illustrate the wide and practical applicability of Suk's method to random-walk particle tracking (RWPT), the original RWPT has been modified to incorporate Suk's method. Performance of the modified RWPT using Suk's method is compared with the original RWPT scheme by examining the concentration distributions obtained by the modified RWPT and the original RWPT under complicated transient flow systems. PMID:22476629
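Pollock's method, used above as the baseline for comparison, exploits the closed-form path line that results from linear velocity interpolation within a grid cell. A one-dimensional sketch of the travel-time formula, assuming face velocities v0 at x0 and v1 at x1 (both positive, flow toward x1):

```python
import math

def pollock_travel_time(x, x0, x1, v0, v1):
    """Time for a particle at x to reach the downstream face x1,
    with velocity varying linearly between the cell faces."""
    g = (v1 - v0) / (x1 - x0)          # velocity gradient within the cell
    v = v0 + g * (x - x0)              # velocity at the particle position
    if abs(g) < 1e-12:                 # uniform velocity: t = distance / v
        return (x1 - x) / v
    return math.log(v1 / v) / g        # Pollock's exponential solution
```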

  5. Foundational methods for model verification and uncertainty analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Jakeman, A. J.; Croke, B. F.; Guillaume, J. H.; Jakeman, J. D.; Shin, M.

    2013-12-01

    Before embarking on formal methods of uncertainty analysis that may entail unnecessarily restrictive assumptions and sophisticated treatment, prudence dictates exploring one's data, model candidates and applicable objective functions with a mixture of methods as a first step. It seems that there are several foundational methods that warrant more attention in practice and that there is scope for the development of new ones. Ensuing results from a selection of foundational methods may well inform the choice of formal methods and assumptions, or suffice in themselves as an effective appreciation of uncertainty. Through the case of four lumped rainfall-runoff models of varying complexity from several watersheds we illustrate that there are valuable methods, many of them already in open source software, others we have recently developed, which can be invoked to yield valuable insights into model veracity and uncertainty. We show results of using methods of global sensitivity analysis that help: determine whether insensitive parameters impact on predictions and therefore cannot be fixed; and identify which combinations of objective function, dataset and model structure allow insensitive parameters to be estimated. We apply response surface and polynomial chaos methods to yield knowledge of the models' response surfaces and parameter interactions, thereby informing model redesign. A new approach to model structure discrimination is presented based on Pareto methods and cross-validation. It reveals which model structures are acceptable in the sense that they are non-dominated by other structures across calibration and validation periods and across catchments according to specified performance criteria. Finally we present and demonstrate a falsification approach that shows the value of examining scenarios of model structures and parameters to identify any change that might have a specified effect on a prediction.
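One common frugal screening approach for the global sensitivity questions raised above is the Morris elementary-effects method: perturb one parameter at a time from random base points and average the absolute effect on the model output. A minimal sketch of that general technique, not the authors' implementation:

```python
import random

def elementary_effects(model, bounds, n_trajectories=50, delta=0.1, seed=1):
    """Mean absolute elementary effect (mu*) per parameter.

    model  : callable taking a list of parameter values
    bounds : list of (low, high) tuples, one per parameter
    """
    rng = random.Random(seed)
    k = len(bounds)
    mu_star = [0.0] * k
    for _ in range(n_trajectories):
        base = [lo + rng.random() * (hi - lo) for lo, hi in bounds]
        f0 = model(base)
        for i, (lo, hi) in enumerate(bounds):
            pert = list(base)
            pert[i] = min(hi, pert[i] + delta * (hi - lo))  # one-at-a-time step
            step = pert[i] - base[i]
            if step > 0:
                mu_star[i] += abs((model(pert) - f0) / step)
    return [m / n_trajectories for m in mu_star]
```

Parameters with mu* near zero are candidates for fixing, which is exactly the kind of screening decision discussed in the abstract.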

  6. A Practical Test Method for Mode I Fracture Toughness of Adhesive Joints with Dissimilar Substrates

    SciTech Connect

    Boeman, R.G.; Erdman, D.L.; Klett, L.B.; Lomax, R.D.

    1999-09-27

    A practical test method for determining the mode I fracture toughness of adhesive joints with dissimilar substrates will be discussed. The test method is based on the familiar Double Cantilever Beam (DCB) specimen geometry, but overcomes limitations in existing techniques that preclude their use when testing joints with dissimilar substrates. The test method is applicable to adhesive joints where the two bonded substrates have different flexural rigidities due to geometric and/or material considerations. Two specific features discussed are the use of backing beams to prevent substrate damage and a compliance matching scheme to achieve symmetric loading conditions. The procedure is demonstrated on a modified DCB specimen comprised of SRIM composite and thin-section, e-coat steel substrates bonded with an epoxy adhesive. Results indicate that the test method provides a practical means of characterizing the mode I fracture toughness of joints with dissimilar substrates.

  7. Passive Sampling Methods for Contaminated Sediments: Practical Guidance for Selection, Calibration, and Implementation

    EPA Science Inventory

    This article provides practical guidance on the use of passive sampling methods(PSMs) that target the freely dissolved concentration (Cfree) for improved exposure assessment of hydrophobic organic chemicals in sediments. Primary considerations for selecting a PSM for a specific a...

  8. Examination of Quantitative Methods Used in Early Intervention Research: Linkages with Recommended Practices.

    ERIC Educational Resources Information Center

    Snyder, Patricia; Thompson, Bruce; McLean, Mary E.; Smith, Barbara J.

    2002-01-01

    Findings are reported related to the research methods and statistical techniques used in 450 group quantitative studies examined by the Council for Exceptional Children's Division for Early Childhood Recommended Practices Project. Studies were analyzed across seven dimensions including sampling procedures, variable selection, variable definition,…

  9. Econometric Methods for Causal Evaluation of Education Policies and Practices: A Non-Technical Guide

    ERIC Educational Resources Information Center

    Schlotter, Martin; Schwerdt, Guido; Woessmann, Ludger

    2011-01-01

    Education policy-makers and practitioners want to know which policies and practices can best achieve their goals. But research that can inform evidence-based policy often requires complex methods to distinguish causation from accidental association. Avoiding econometric jargon and technical detail, this paper explains the main idea and intuition…

  10. Connecting Practice, Theory and Method: Supporting Professional Doctoral Students in Developing Conceptual Frameworks

    ERIC Educational Resources Information Center

    Kumar, Swapna; Antonenko, Pavlo

    2014-01-01

    From an instrumental view, conceptual frameworks that are carefully assembled from existing literature in Educational Technology and related disciplines can help students structure all aspects of inquiry. In this article we detail how the development of a conceptual framework that connects theory, practice and method is scaffolded and facilitated…

  11. Self-Assessment Methods in Writing Instruction: A Conceptual Framework, Successful Practices and Essential Strategies

    ERIC Educational Resources Information Center

    Nielsen, Kristen

    2014-01-01

    Student writing achievement is essential to lifelong learner success, but supporting writing can be challenging for teachers. Several large-scale analyses of publications on writing have called for further study of instructional methods, as the current literature does not sufficiently address the need to support best teaching practices.…

  12. The Delphi Method: An Approach for Facilitating Evidence Based Practice in Athletic Training

    ERIC Educational Resources Information Center

    Sandrey, Michelle A.; Bulger, Sean M.

    2008-01-01

    Objective: The growing importance of evidence based practice in athletic training is necessitating academics and clinicians to be able to make judgments about the quality or lack of the body of research evidence and peer-reviewed standards pertaining to clinical questions. To assist in the judgment process, consensus methods, namely brainstorming,…

  13. What Informs Practice and What Is Valued in Corporate Instructional Design? A Mixed Methods Study

    ERIC Educational Resources Information Center

    Thompson-Sellers, Ingrid N.

    2012-01-01

    This study used a two-phased explanatory mixed-methods design to explore in-depth what factors are perceived by Instructional Design and Technology (IDT) professionals as impacting instructional design practice, how these factors are valued in the field, and what differences in perspectives exist between IDT managers and non-managers. For phase 1…

  14. Trusting the Method: An Ethnographic Search for Policy in Practice in an Australian Primary School

    ERIC Educational Resources Information Center

    Robinson, Sarah

    2008-01-01

    The apparent simplicity of ethnographic methods--studying people in their normal life setting, going beyond what might be said in surveys and interviews to observe everyday practices--is deceptive. Anthropological knowledge is gained through fieldwork and through pursuing a reflexive flexible approach. This study carried out in a non-government…

  15. Simple gas chromatographic method for furfural analysis.

    PubMed

    Gaspar, Elvira M S M; Lopes, João F

    2009-04-01

    A new, simple, gas chromatographic method was developed for the direct analysis of 5-hydroxymethylfurfural (5-HMF), 2-furfural (2-F) and 5-methylfurfural (5-MF) in liquid and water-soluble foods, using direct immersion SPME coupled to GC-FID and/or GC-TOF-MS. The fiber (DVB/CAR/PDMS) conditions were optimized: pH effect, temperature, adsorption and desorption times. The method is simple and accurate (RSD<8%), showed good recoveries (77-107%) and good limits of detection (GC-FID: 1.37 μg/L for 2-F, 8.96 μg/L for 5-MF, 6.52 μg/L for 5-HMF; GC-TOF-MS: 0.3, 1.2 and 0.9 ng/mL for 2-F, 5-MF and 5-HMF, respectively). It was applied to different commercial food matrices: honey, white, demerara, brown and yellow table sugars, and white and red balsamic vinegars. This one-step, sensitive and direct method for the analysis of furfurals will help characterise and quantify their presence in the human diet. PMID:18976770

  16. Intercomparison of two nowcasting methods: preliminary analysis

    NASA Astrophysics Data System (ADS)

    Poli, V.; Alberoni, P. P.; Cesari, D.

    2008-10-01

    The term nowcasting denotes the description of a weather situation and its extrapolation a few hours ahead into the future. This work gives a brief description of current nowcasting methods, focusing on those developed at ARPA-SIM (Emilia-Romagna region, Italy). The methodology rests on an extrapolation technique that analyses a series of radar reflectivity fields in order to identify areas of precipitation and determine the motion field, which allows coherent structures to be tracked from one image to the next. The motion of individual rainfall structures is extrapolated using two different methods: a linear translation and a semi-Lagrangian advection scheme. In particular, the semi-Lagrangian advection method is based on a multi-scale recursive cross-correlation analysis, in which different targets are tracked at the different scales examined. This means that the motion of precipitation parcels is a function of scale. A description of the selected validation tools introduces the numerical analysis of the results, pointing out the limits and shortcomings of the algorithms.
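The cross-correlation tracking step can be sketched as an exhaustive search for the displacement that best matches a reflectivity field between two successive images; operational systems such as the one described apply this recursively over several scales. A single-scale sketch under that simplification:

```python
import numpy as np

def best_shift(prev, curr, max_shift=3):
    """Return the (dy, dx) displacement of `prev` that best matches `curr`,
    scored by the sum of the element-wise product (a correlation surrogate)."""
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(prev, dy, axis=0), dx, axis=1)
            score = np.sum(shifted * curr)
            if score > best_score:
                best, best_score = (dy, dx), score
    return best
```

The recovered shift per time step gives the motion vector used to advect the precipitation field forward.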

  17. A new method for designing dual foil electron beam forming systems. II. Feasibility of practical implementation of the method

    NASA Astrophysics Data System (ADS)

    Adrich, Przemysław

    2016-05-01

    In Part I of this work a new method for designing dual foil electron beam forming systems was introduced. In this method, an optimal configuration of the dual foil system is found by means of a systematic, automated scan of system performance as a function of its parameters. At each point of the scan, the Monte Carlo method is used to calculate the off-axis dose profile in water, taking into account the detailed and complete geometry of the system. The new method, while computationally intensive, minimizes the involvement of the designer. In this Part II paper, the feasibility of practical implementation of the new method is demonstrated. To this end, prototype software tools were developed and applied to solve a real-life design problem. It is demonstrated that system optimization can be completed within a few hours using rather moderate computing resources. It is also demonstrated that, perhaps for the first time, the designer can gain deep insight into system behavior, such that the construction can be simultaneously optimized with respect to a number of functional characteristics besides the flatness of the off-axis dose profile. In the presented example, the system is optimized with respect to both the flatness of the off-axis dose profile and the beam transmission. A number of practical issues related to the application of the new method, as well as its possible extensions, are discussed.

  18. Practical method for evaluating the sound field radiated from a waveguide.

    PubMed

    Feng, Xuelei; Shen, Yong; Chen, Simiao; Zhao, Ye

    2015-01-01

    This letter presents a simple and practical method for evaluating the sound field radiated from a waveguide. By using the proposed method, detailed information about the radiated sound field can be obtained by measuring the sound field in the mouth of the baffled waveguide. To examine this method's effectiveness, the radiated sound pressure distribution in space was first evaluated by using the proposed method, and then it was measured directly for comparison. Experiments using two different waveguides showed good agreement between the evaluated and the measured radiated sound pressure distributions. PMID:25618097

  19. Test versus analysis: A discussion of methods

    NASA Technical Reports Server (NTRS)

    Butler, T. G.

    1986-01-01

    Some techniques for comparing structural vibration data determined from test and analysis are discussed. Orthogonality is a general category of one group, correlation is a second, synthesis is a third and matrix improvement is a fourth. Advantages and shortcomings of the methods are explored with suggestions as to how they can complement one another. The purpose of comparing vibration data from test and analysis for a given structure is to find out whether each is representing the dynamic properties of the structure in the same way. Specifically, whether: mode shapes are alike; the frequencies of the modes are alike; modes appear in the same frequency sequence; and if they are not alike, how to judge which to believe.
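The "correlation" category mentioned above is commonly implemented with the Modal Assurance Criterion (MAC), which scores how alike a test mode shape and an analysis mode shape are (1 for parallel shapes, 0 for orthogonal ones). A sketch for real-valued mode shapes; complex modes would need conjugation:

```python
import numpy as np

def mac(phi_test, phi_analysis):
    """Modal Assurance Criterion between two real-valued mode shape vectors."""
    num = abs(np.dot(phi_test, phi_analysis)) ** 2
    den = np.dot(phi_test, phi_test) * np.dot(phi_analysis, phi_analysis)
    return num / den
```

In practice a MAC matrix over all test/analysis mode pairs also reveals whether the modes appear in the same frequency sequence, the third question posed above.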

  20. Computational method for analysis of polyethylene biodegradation

    NASA Astrophysics Data System (ADS)

    Watanabe, Masaji; Kawai, Fusako; Shibata, Masaru; Yokoyama, Shigeo; Sudate, Yasuhiro

    2003-12-01

    In a previous study concerning the biodegradation of polyethylene, we proposed a mathematical model based on two primary factors: the direct consumption or absorption of small molecules and the successive weight loss of large molecules due to β-oxidation. Our model is an initial value problem consisting of a differential equation whose independent variable is time. Its unknown variable represents the total weight of all the polyethylene molecules that belong to a molecular-weight class specified by a parameter. In this paper, we describe a numerical technique to introduce experimental results into analysis of our model. We first establish its mathematical foundation in order to guarantee its validity, by showing that the initial value problem associated with the differential equation has a unique solution. Our computational technique is based on a linear system of differential equations derived from the original problem. We introduce some numerical results to illustrate our technique as a practical application of the linear approximation. In particular, we show how to solve the inverse problem to determine the consumption rate and the β-oxidation rate numerically, and illustrate our numerical technique by analyzing the GPC patterns of polyethylene wax obtained before and after 5 weeks cultivation of a fungus, Aspergillus sp. AK-3. A numerical simulation based on these degradation rates confirms that the primary factors of the polyethylene biodegradation posed in modeling are indeed appropriate.
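    The model described above is a linear system of ODEs over molecular-weight classes: each class loses weight to direct consumption and beta-oxidation, and gains the weight shed by the next-higher class. A minimal sketch of such a system, using a simple forward-Euler time step (the rate names `alpha`/`beta` and the bidiagonal coupling are illustrative assumptions, not the authors' exact equations):

```python
import numpy as np

def degrade(w0, alpha, beta, dt, steps):
    """Toy linear model of the weight distribution w(t) over molecular-weight
    classes: class i loses weight at rate alpha[i] + beta[i] and receives
    beta[i+1] * w[i+1] from the next-higher class via beta-oxidation.
    alpha = direct consumption rates, beta = beta-oxidation rates (assumed)."""
    w = np.array(w0, dtype=float)
    n = len(w)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = -(alpha[i] + beta[i])
        if i + 1 < n:
            A[i, i + 1] = beta[i + 1]
    for _ in range(steps):
        w = w + dt * (A @ w)   # forward Euler time step
    return w
```

The inverse problem mentioned in the abstract then amounts to choosing `alpha` and `beta` so that the simulated distribution matches the measured GPC patterns.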

  1. Optimisation of Lime-Soda process parameters for reduction of hardness in aqua-hatchery practices using Taguchi methods.

    PubMed

    Yavalkar, S P; Bhole, A G; Babu, P V Vijay; Prakash, Chandra

    2012-04-01

    This paper presents the optimisation of Lime-Soda process parameters for the reduction of hardness in aqua-hatchery practices in the context of M. rosenbergii. The fresh water used in the development of fisheries needs to be of suitable quality. Lack of desirable quality in available fresh water is generally the confronting restraint. On the Indian subcontinent, groundwater is the only source of raw water, having varying degrees of hardness, and is thus unsuitable for fresh water prawn hatchery practices (M. rosenbergii). In order to make use of hard water in the context of aqua-hatchery, the Lime-Soda process has been recommended. The efficacy of the various process parameters, such as lime, soda ash and detention time, on the reduction of hardness needs to be examined. This paper proposes to determine the parameter settings for the CIFE well water, which is quite hard, by using the Taguchi experimental design method. Taguchi orthogonal arrays, the signal-to-noise ratio, and the analysis of variance (ANOVA) have been applied to determine the optimal dosages and to analyse their effect on hardness reduction. The tests carried out with optimal levels of Lime-Soda process parameters confirmed the efficacy of the Taguchi optimisation method. Emphasis has been placed on optimisation of the chemical doses required to reduce the total hardness using the Taguchi method and ANOVA, to suit the available raw water quality for aqua-hatchery practices, especially for the fresh water prawn M. rosenbergii. PMID:24749379
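    For a response that should be minimised, such as residual hardness, Taguchi analyses use the standard "smaller-the-better" signal-to-noise ratio, SN = -10 log10(mean(y_i^2)). A minimal sketch (the function name is illustrative):

```python
import math

def sn_smaller_the_better(ys):
    """Taguchi signal-to-noise ratio for a 'smaller is better' response
    (e.g. residual hardness): SN = -10 * log10(mean(y_i^2)).
    Larger SN values indicate better (lower, more consistent) responses."""
    return -10.0 * math.log10(sum(y * y for y in ys) / len(ys))
```

The factor levels maximising this SN ratio across the orthogonal-array runs are then taken as the optimal dosage settings, with ANOVA apportioning each factor's contribution.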

  2. Thermal Analysis Methods for Aerobraking Heating

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.; Gasbarre, Joseph F.; Dec, John A.

    2005-01-01

    As NASA begins exploration of other planets, a method of non-propulsively slowing vehicles at the planet, aerobraking, may become a valuable technique for managing vehicle design mass and propellant. An example of this is Mars Reconnaissance Orbiter (MRO), which will launch in late 2005 and reach Mars in March of 2006. In order to save propellant, MRO will use aerobraking to modify the initial orbit at Mars. The spacecraft will dip into the atmosphere briefly on each orbit, and during the drag pass, the atmospheric drag on the spacecraft will slow it, thus lowering the orbit apoapsis. The largest area on the spacecraft, and the one most affected by the heat generated during the aerobraking process, is the solar arrays. A thermal analysis of the solar arrays was conducted at NASA Langley to simulate their performance throughout the entire roughly 6-month period of aerobraking. Several interesting methods were used to make this analysis more rapid and robust. Two separate models were built for this analysis, one in Thermal Desktop for radiation and orbital heating analysis, and one in MSC.Patran for thermal analysis. The results from the radiation model were mapped in an automated fashion to the Patran thermal model that was used to analyze the thermal behavior during the drag pass. A high degree of automation in file manipulation as well as other methods for reducing run time were employed, since toward the end of the aerobraking period the orbit period is short, and in order to support flight operations the runs must be computed rapidly. All heating within the Patran thermal model was combined in one section of logic, such that data mapped from the radiation model and aeroheating model, as well as skin temperature effects on the aeroheating and surface radiation, could be incorporated easily. This approach calculates the aeroheating at any given node, based on its position and temperature as well as the density and velocity at that trajectory point. Run times on

  3. Numerical analysis method for linear induction machines.

    NASA Technical Reports Server (NTRS)

    Elliott, D. G.

    1972-01-01

    A numerical analysis method has been developed for linear induction machines such as liquid metal MHD pumps and generators and linear motors. Arbitrary phase currents or voltages can be specified and the moving conductor can have arbitrary velocity and conductivity variations from point to point. The moving conductor is divided into a mesh and coefficients are calculated for the voltage induced at each mesh point by unit current at every other mesh point. Combining the coefficients with the mesh resistances yields a set of simultaneous equations which are solved for the unknown currents.
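    Combining the induced-voltage coefficients with the mesh resistances, as described above, yields a dense linear system that can be solved directly for the unknown mesh currents. A small illustrative sketch (the coefficient values are made up for demonstration):

```python
import numpy as np

# Illustrative 3-point mesh: M[i, j] is the (assumed precomputed) coefficient
# for the voltage induced at mesh point i by unit current at mesh point j, and
# R holds the mesh resistances on the diagonal. Combining them gives a linear
# system (R + M) I = V, solved for the unknown mesh currents I.
M = np.array([[0.0, 0.2, 0.1],
              [0.2, 0.0, 0.2],
              [0.1, 0.2, 0.0]])
R = np.diag([1.0, 1.2, 0.9])
V = np.array([1.0, 0.5, 0.0])   # specified phase voltages (illustrative)
I = np.linalg.solve(R + M, V)   # unknown mesh currents
```

For complex-valued phasor analysis the same solve applies with complex-typed arrays.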

  4. Method and apparatus for simultaneous spectroelectrochemical analysis

    DOEpatents

    Chatterjee, Sayandev; Bryan, Samuel A; Schroll, Cynthia A; Heineman, William R

    2013-11-19

    An apparatus and method of simultaneous spectroelectrochemical analysis is disclosed. A transparent surface is provided. An analyte solution on the transparent surface is contacted with a working electrode and at least one other electrode. Light from a light source is focused on either a surface of the working electrode or the analyte solution. The light reflected from either the surface of the working electrode or the analyte solution is detected. The potential of the working electrode is adjusted, and spectroscopic changes of the analyte solution that occur with changes in thermodynamic potentials are monitored.

  5. Apparatus and method for fluid analysis

    DOEpatents

    Wilson, Bary W.; Peters, Timothy J.; Shepard, Chester L.; Reeves, James H.

    2004-11-02

    The present invention is an apparatus and method for analyzing a fluid used in a machine or in an industrial process line. The apparatus has at least one meter placed proximate the machine or process line and in contact with the machine or process fluid for measuring at least one parameter related to the fluid. The at least one parameter is a standard laboratory analysis parameter. The at least one meter includes but is not limited to viscometer, element meter, optical meter, particulate meter, and combinations thereof.

  6. Apparatus And Method For Fluid Analysis

    DOEpatents

    Wilson, Bary W.; Peters, Timothy J.; Shepard, Chester L.; Reeves, James H.

    2003-05-13

    The present invention is an apparatus and method for analyzing a fluid used in a machine or in an industrial process line. The apparatus has at least one meter placed proximate the machine or process line and in contact with the machine or process fluid for measuring at least one parameter related to the fluid. The at least one parameter is a standard laboratory analysis parameter. The at least one meter includes but is not limited to viscometer, element meter, optical meter, particulate meter, and combinations thereof.

  7. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  8. Scanning methods applied to bitemark analysis

    NASA Astrophysics Data System (ADS)

    Bush, Peter J.; Bush, Mary A.

    2010-06-01

    The 2009 National Academy of Sciences report on forensics focused criticism on pattern evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified based on dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutia and to investigate replication of detail in human skin. Results indicated that, being a viscoelastic substrate, skin effectively reduces the resolution of measurement of dental detail. Conclusions indicate caution in individualization statements.

  9. Semiquantitative fluorescence method for bioconjugation analysis.

    PubMed

    Brasil, Aluízio G; Carvalho, Kilmara H G; Leite, Elisa S; Fontes, Adriana; Santos, Beate Saegesser

    2014-01-01

    Quantum dots (QDs) have been used as fluorescent probes in biological and medical fields such as bioimaging, bioanalytical, and immunofluorescence assays. For these applications, it is important to characterize the QD-protein bioconjugates. This chapter provides details on a versatile method to confirm quantum dot-protein conjugation including the required materials and instrumentation in order to perform the step-by-step semiquantitative analysis of the bioconjugation efficiency by using fluorescence plate readings. Although the protocols to confirm the QD-protein attachment shown here were developed for CdTe QDs coated with specific ligands and proteins, the principles are the same for other QDs-protein bioconjugates. PMID:25103803

  10. Kaizen practice in healthcare: a qualitative analysis of hospital employees' suggestions for improvement

    PubMed Central

    Mazzocato, Pamela; Stenfors-Hayes, Terese; von Thiele Schwarz, Ulrica; Hasson, Henna

    2016-01-01

    Objectives Kaizen, or continuous improvement, lies at the core of lean. Kaizen is implemented through practices that enable employees to propose ideas for improvement and solve problems. The aim of this study is to describe the types of issues and improvement suggestions that hospital employees feel empowered to address through kaizen practices in order to understand when and how kaizen is used in healthcare. Methods We analysed 186 structured kaizen documents containing improvement suggestions that were produced by 165 employees at a Swedish hospital. Directed content analysis was used to categorise the suggestions into the following categories: type of situation (proactive or reactive) triggering an action; type of process addressed (technical/administrative, support and clinical); complexity level (simple or complex); and type of outcomes aimed for (operational or sociotechnical). Compliance with the kaizen template was calculated. Results 72% of the improvement suggestions were reactions to a perceived problem. Support, technical and administrative, and primary clinical processes were involved in 47%, 38% and 16% of the suggestions, respectively. The majority of the kaizen documents addressed simple situations and focused on operational outcomes. The degree of compliance with the kaizen template was high for several items concerning the identification of problems and the proposed solutions, and low for items related to the test and implementation of solutions. Conclusions There is a need to combine kaizen practices with improvement and innovation practices that help staff and managers to address complex issues, such as the improvement of clinical care processes. The limited focus on sociotechnical aspects and the partial compliance with kaizen templates may indicate a limited understanding of the entire kaizen process and of how it relates to the overall organisational goals. This in turn can hamper the sustainability of kaizen practices and results. PMID:27473953

  11. Selective spectroscopic methods for water analysis

    SciTech Connect

    Vaidya, B.

    1997-06-24

    This dissertation explores in large part the development of a few types of spectroscopic methods in the analysis of water. Methods for the determination of some of the most important properties of water like pH, metal ion content, and chemical oxygen demand are investigated in detail. This report contains a general introduction to the subject and the conclusions. Four chapters and an appendix have been processed separately. They are: chromogenic and fluorogenic crown ether compounds for the selective extraction and determination of Hg(II); selective determination of cadmium in water using a chromogenic crown ether in a mixed micellar solution; reduction of chloride interference in chemical oxygen demand determination without using mercury salts; structural orientation patterns for a series of anthraquinone sulfonates adsorbed at an aminophenol thiolate monolayer chemisorbed at gold; and the role of chemically modified surfaces in the construction of miniaturized analytical instrumentation.

  12. International Commercial Remote Sensing Practices and Policies: A Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Stryker, Timothy

    In recent years, there has been much discussion about U.S. commercial remote sensing policies and how effectively they address U.S. national security, foreign policy, commercial, and public interests. This paper will provide an overview of U.S. commercial remote sensing laws, regulations, and policies, and describe recent NOAA initiatives. It will also address related foreign practices, and the overall legal context for trade and investment in this critical industry. The 1992 Land Remote Sensing Policy Act ("the Act") and the 1994 policy on Foreign Access to Remote Sensing Space Capabilities (known as Presidential Decision Directive-23, or PDD-23) put into place an ambitious legal and policy framework for the U.S. Government's licensing of privately-owned, high-resolution satellite systems. Under the Act, the Secretary of Commerce licenses the operations of private U.S. remote sensing satellite systems, in consultation with the Secretaries of Defense, State, and Interior. PDD-23 provided further details concerning the operation of advanced systems, as well as criteria for the export of turnkey systems and/or components. In July 2000, pursuant to the authority delegated to it by the Secretary of Commerce, NOAA issued new regulations for the industry. Among other conditions, licensees must: observe the international obligations of the United States; maintain positive control of spacecraft operations; maintain a tasking record in conjunction with other record-keeping requirements; provide U.S. Government access to and use of data when required for national security or foreign policy purposes; provide for U.S. Government review of all significant foreign agreements; obtain U.S. Government approval for any encryption devices used; make available unenhanced data to a "sensed state" as soon as such data are available and on reasonable cost terms and conditions; make available unenhanced data as requested

  13. A Simplified Mesh Deformation Method Using Commercial Structural Analysis Software

    NASA Technical Reports Server (NTRS)

    Hsu, Su-Yuen; Chang, Chau-Lyan; Samareh, Jamshid

    2004-01-01

    Mesh deformation in response to redefined or moving aerodynamic surface geometries is a frequently encountered task in many applications. Most existing methods are either mathematically too complex or computationally too expensive for usage in practical design and optimization. We propose a simplified mesh deformation method based on linear elastic finite element analyses that can be easily implemented by using commercially available structural analysis software. Using a prescribed displacement at the mesh boundaries, a simple structural analysis is constructed based on a spatially varying Young's modulus to move the entire mesh in accordance with the surface geometry redefinitions. A variety of surface movements, such as translation, rotation, or incremental surface reshaping that often takes place in an optimization procedure, may be handled by the present method. We describe the numerical formulation and implementation using the NASTRAN software in this paper. The use of commercial software bypasses tedious reimplementation and takes advantage of the computational efficiency offered by the vendor. A two-dimensional airfoil mesh and a three-dimensional aircraft mesh were used as test cases to demonstrate the effectiveness of the proposed method. Euler and Navier-Stokes calculations were performed for the deformed two-dimensional meshes.
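    The elasticity-based idea above can be illustrated in one dimension with a spring analogy: prescribe displacements at the boundary nodes, make edge stiffness spatially varying (here inversely proportional to edge length, a common stand-in for a varying Young's modulus), and solve for the interior node motion. This sketch is an assumption for illustration, not the paper's NASTRAN formulation:

```python
import numpy as np

def deform_1d(x, left_disp, right_disp):
    """Spring-analogy mesh deformation on a 1D node chain: edge stiffness
    k = 1/length (stiffer where the mesh is finer), boundary displacements
    prescribed, interior displacements solved from the stiffness system."""
    n = len(x)
    k = 1.0 / np.diff(x)                 # stiffer where the mesh is finer
    K = np.zeros((n, n))
    for i, ki in enumerate(k):           # assemble tridiagonal stiffness matrix
        K[i, i] += ki; K[i + 1, i + 1] += ki
        K[i, i + 1] -= ki; K[i + 1, i] -= ki
    d = np.zeros(n)
    d[0], d[-1] = left_disp, right_disp
    interior = np.arange(1, n - 1)
    # Solve K_ii d_i = -K_ib d_b for the interior displacements
    d[interior] = np.linalg.solve(K[np.ix_(interior, interior)],
                                  -K[np.ix_(interior, [0, n - 1])] @ d[[0, n - 1]])
    return x + d
```

Because fine near-surface cells are stiff, they translate almost rigidly with the boundary, while the large far-field cells absorb most of the distortion, which is the behavior the spatially varying modulus is chosen to produce.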

  14. A Mixed Methods Content Analysis of the Research Literature in Science Education

    ERIC Educational Resources Information Center

    Schram, Asta B.

    2014-01-01

    In recent years, more and more researchers in science education have been turning to the practice of combining qualitative and quantitative methods in the same study. This approach of using mixed methods creates possibilities to study the various issues that science educators encounter in more depth. In this content analysis, I evaluated 18…

  15. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 1 2011-04-01 2011-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in...

  16. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 1 2013-04-01 2013-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in...

  17. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 1 2014-04-01 2014-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in...

  18. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 1 2012-04-01 2012-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in...

  19. Text analysis devices, articles of manufacture, and text analysis methods

    DOEpatents

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

    2015-03-31

    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes a display configured to depict visible images, and processing circuitry coupled with the display and wherein the processing circuitry is configured to access a first vector of a text item and which comprises a plurality of components, to access a second vector of the text item and which comprises a plurality of components, to weight the components of the first vector providing a plurality of weighted values, to weight the components of the second vector providing a plurality of weighted values, and to combine the weighted values of the first vector with the weighted values of the second vector to provide a third vector.
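    The vector combination described in the claim is straightforward to sketch: weight the components of each of the two vector representations of a text item, then sum them componentwise into a third vector. The function name and weights below are illustrative:

```python
def combine(v1, v2, w1, w2):
    """Weight the components of two vector representations of a text item and
    combine the weighted values into a third vector (componentwise), as
    described in the claim; weights here are illustrative."""
    return [a * x + b * y for a, b, x, y in zip(w1, w2, v1, v2)]
```

With equal weights of 0.5 this reduces to a componentwise average of the two representations.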

  20. A rapid method for airborne tritium analysis

    SciTech Connect

    Hofstetter, K.J.; Wilson, H.T. )

    1991-11-01

    Tritium is one of the principal radionuclides released to the environment from nuclear fuel and target reprocessing, heavy-water production, and other nuclear industry operations. For example, the majority of the off-site dose to the public at the Savannah River site (SRS) in 1988 was from tritium oxide (HTO). The absorbed dose is highly dependent on chemical form; HTO is 10,000 times more hazardous than the elemental form (HT). Commercially available tritium monitors do not discriminate between chemical forms and have high detection limits. Consequently, tedious laboratory methods must be used to analyze HTO in air. Desiccants are used to remove all the water from an air sample. The tritiated water is then desorbed and analyzed by liquid scintillation spectrometry. The method is complex and takes several hours to complete. During an unplanned release, present-time atmospheric tritium concentrations are never available. To improve emergency response capabilities, a rapid sampling and analysis method was developed for measuring low-level HTO concentrations in air. The standard desiccant sampling and water desorption procedure was modified for use in the SRS mobile laboratory, which is equipped with a liquid scintillation counter. These tests indicate that an HTO concentration of 0.2% DCG (7 Bq/m³) can be detected by this method with a 10-min sample collection time and a 10-min count.

  1. Stirling Analysis Comparison of Commercial vs. High-Order Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako

    2007-01-01

    Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model, although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.

  2. Stirling Analysis Comparison of Commercial Versus High-Order Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako

    2005-01-01

    Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model, although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.

  3. The Frankfurt Patient Safety Climate Questionnaire for General Practices (FraSiK): analysis of psychometric properties.

    PubMed

    Hoffmann, Barbara; Domanska, Olga Maria; Albay, Zeycan; Mueller, Vera; Guethlin, Corina; Thomas, Eric J; Gerlach, Ferdinand M

    2011-09-01

    BACKGROUND Safety culture has been identified as having a major impact on how safety is managed in healthcare. However, it has not received much attention in general practices. Hence, no instrument yet exists to assess safety climate-the measurable artefact of safety culture-in this setting. This study aims to evaluate psychometric properties of a newly developed safety climate questionnaire for use in German general practices. METHODS The existing Safety Attitudes Questionnaire, Ambulatory Version, was considerably modified and enhanced in order to be applicable in general practice. After pilot tests and its application in a random sample of 400 German practices, a first psychometric analysis led to modifications in several items. A further psychometric analysis was conducted with an additional sample of 60 practices and a response rate of 97.08%. Exploratory factor analysis with orthogonal varimax rotation was carried out and the internal consistency of the identified factors was calculated. RESULTS Nine factors emerged, representing a wide range of dimensions associated with safety culture: teamwork climate, error management, safety of clinical processes, perception of causes of errors, job satisfaction, safety of office structure, receptiveness to healthcare assistants and patients, staff perception of management, and quality and safety of medical care. Internal consistency of factors is moderate to good. CONCLUSIONS This study demonstrates the development of a patient safety climate instrument. The questionnaire displays established features of safety climate and additionally contains features that might be specific to small-scale general practices. PMID:21571753

  4. Analysis of newly proposed setpoint methods

    SciTech Connect

    Hines, J. W.; Miller, D. W.; Arndt, S. A.

    2006-07-01

    A new methodology for evaluating the operability of safety critical instrumentation has been proposed. Common to the prior method, a limiting trip setpoint (LSP) is determined to protect the analytical limit by considering uncertainties inherent in the measurement process. Channel operability is assured by periodically performing a channel operability test (COT) which compares the as-found trip point to the previous as-left trip point and evaluates the deviation. Licensees can include an additional conservative margin which results in a nominal trip setpoint (NSP) versus the LSP. If the setting tolerance is small as compared to the deviation limit, an alternate operability test can be applied that compares the as-found trip point to the LSP (or NSP as applicable) rather than the as-left trip setpoint. This method does not provide the actual channel deviation for operability determination so a penalty term may be appropriate. This paper provides an analysis of the alternate channel operability test and provides recommendations for setting a penalty term to reduce the non-conservativeness of the alternate channel operability test to a pre-defined value so as to preserve the required confidence level of the uncertainty analysis. (authors)

  5. Transits of Extrasolar Planets and Analysis Methods

    NASA Astrophysics Data System (ADS)

    Fritchman, Joseph

    2007-10-01

    Using Wittenberg's 10-inch refracting telescope housed in Elgar Weaver Observatory, and an ST-8XE CCD camera, the egress of the transit of planet HD209458 'b' was observed on the night of December 18, 2006. This transit occurs when the planet passes directly between its host star and the telescope on Earth, and the brightness of the star decreases by about 1.5%. The brightness of the stars is measured by the number of counts in pixels in images taken as 30-second exposures over a period of 64 minutes. Data analysis techniques using Diffraction Limited's MaxIm DL yield a standard deviation of less than 0.004 magnitudes using a sliding box averaging method. This means that a change in brightness of about 0.4% can be measured, and much dimmer transits of other planets may be recorded from this telescope. Analysis methods using MathWorks' MATLAB are being developed to gain more control over how pixels are combined to determine the brightness of stars and more effective modes of combining images.
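    A sliding box (moving) average like the one mentioned above replaces each photometric point with the mean of a window of neighbouring measurements, suppressing point-to-point scatter while preserving the slow brightness dip of a transit. A minimal sketch (window handling at the series edges is one common choice among several):

```python
def sliding_box_average(mags, width):
    """Sliding-box average of a photometric time series: each point becomes
    the mean over a window of `width` samples, truncated at the edges."""
    half = width // 2
    out = []
    for i in range(len(mags)):
        lo, hi = max(0, i - half), min(len(mags), i + half + 1)
        out.append(sum(mags[lo:hi]) / (hi - lo))
    return out
```

Averaging N points reduces uncorrelated scatter by roughly a factor of sqrt(N), which is how a per-point scatter well above 0.004 mag can be beaten down to the quoted level.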

  6. PRACTICAL SENSITIVITY AND UNCERTAINTY ANALYSIS TECHNIQUES APPLIED TO AGRICULTURAL SYSTEMS MODELS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We present a practical evaluation framework for analysis of two complex, process-based agricultural system models, WEPP and RZWQM. The evaluation framework combines sensitivity analysis and the uncertainty analysis techniques of first order error analysis (FOA) and Monte Carlo simulation with Latin ...
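    First order error analysis (FOA), one of the techniques named above, propagates input variances through a model via its first-order sensitivities: Var[f] ≈ Σ_i (∂f/∂x_i)² Var[x_i]. A minimal sketch using finite-difference sensitivities (a generic illustration, not the WEPP/RZWQM implementation):

```python
def foa_variance(f, x, var_x, h=1e-6):
    """First-order error analysis: approximate the output variance of model f
    at point x from input variances var_x, using central finite differences
    for the sensitivities, Var[f] ~= sum_i (df/dx_i)^2 * Var[x_i]."""
    total = 0.0
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        deriv = (f(xp) - f(xm)) / (2 * h)   # central-difference sensitivity
        total += deriv ** 2 * var_x[i]
    return total
```

Monte Carlo simulation with Latin hypercube sampling, by contrast, samples the inputs directly and estimates the output variance empirically, at much higher computational cost but without the linearity assumption.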

  7. Statistical methods for the forensic analysis of striated tool marks

    SciTech Connect

    Hoeksema, Amy Beth

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
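    The likelihood ratio test described above compares a sample of lab-to-field comparison scores against a sample of lab-to-lab scores. As a heavily simplified stand-in for the authors' statistic, the sketch below computes a Gaussian likelihood-ratio statistic for equality of the two sample means (pooled variance); the real analysis models profilometry comparisons and tool angle, which this does not attempt:

```python
import math

def gaussian_lr_stat(a, b):
    """Simplified likelihood-ratio statistic for equal means of two Gaussian
    samples with common variance: -2 log LR = (n+m) * log(var_H0 / var_H1),
    where var_H0 pools around the common mean and var_H1 around group means."""
    n, m = len(a), len(b)
    mean = lambda s: sum(s) / len(s)
    mu_a, mu_b, mu0 = mean(a), mean(b), mean(a + b)
    ss = lambda s, mu: sum((x - mu) ** 2 for x in s)
    var_h0 = (ss(a, mu0) + ss(b, mu0)) / (n + m)   # variance under H0: one mean
    var_h1 = (ss(a, mu_a) + ss(b, mu_b)) / (n + m) # variance under H1: two means
    return (n + m) * math.log(var_h0 / var_h1)
```

A statistic near zero is consistent with the two comparison samples coming from the same population (supporting a match); large values indicate the field mark behaves differently from same-tool lab marks.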

  8. An Analysis of Inservice Education Practices for Hospital Laboratory Personnel.

    ERIC Educational Resources Information Center

    Bonke, Barbara A.; And Others

    1988-01-01

A study looked at inservice practices in clinical laboratories and at managers' perceptions of the impact and cost effectiveness of those activities. Findings indicate that most laboratories do not have an inservice budget and that new-employee orientation, policy and procedure discussion, and instrumentation instruction are the most effective activities. (JOW)

  9. An Analysis of Teacher Practices with Toddlers during Social Conflicts

    ERIC Educational Resources Information Center

    Gloeckler, Lissy R.; Cassell, Jennifer M.; Malkus, Amy J.

    2014-01-01

Employing a quasi-experimental design, this pilot study on teacher practices with toddlers during social conflicts was conducted in the southeastern USA. Four child-care classrooms, teachers (n = 8) and children (n = 51) were assessed with the Classroom Assessment Scoring System -- Toddler [CLASS-Toddler; La Paro, K., Hamre, B. K., & Pianta,…

  10. Professional Learning in Rural Practice: A Sociomaterial Analysis

    ERIC Educational Resources Information Center

    Slade, Bonnie

    2013-01-01

    Purpose: This paper aims to examine the professional learning of rural police officers. Design/methodology/approach: This qualitative case study involved interviews and focus groups with 34 police officers in Northern Scotland. The interviews and focus groups were transcribed and analysed, drawing on practice-based and sociomaterial learning…

  11. Honesty in Critically Reflective Essays: An Analysis of Student Practice

    ERIC Educational Resources Information Center

    Maloney, Stephen; Tai, Joanna Hong-Meng; Lo, Kristin; Molloy, Elizabeth; Ilic, Dragan

    2013-01-01

    In health professional education, reflective practice is seen as a potential means for self-improvement from everyday clinical encounters. This study aims to examine the level of student honesty in critical reflection, and barriers and facilitators for students engaging in honest reflection. Third year physiotherapy students, completing summative…

  12. Incorporating Computer-Aided Language Sample Analysis into Clinical Practice

    ERIC Educational Resources Information Center

    Price, Lisa Hammett; Hendricks, Sean; Cook, Colleen

    2010-01-01

    Purpose: During the evaluation of language abilities, the needs of the child are best served when multiple types and sources of data are included in the evaluation process. Current educational policies and practice guidelines further dictate the use of authentic assessment data to inform diagnosis and treatment planning. Language sampling and…

  13. Chapter 11. Community analysis-based methods

    SciTech Connect

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  14. A concise method for mine soils analysis

    SciTech Connect

    Winkler, S.; Wildeman, T.; Robinson, R.; Herron, J.

    1999-07-01

A large number of abandoned hard rock mines exist in Colorado and other mountain west states, many on public property. Public pressure and resulting policy changes have become a driving force in the reclamation of these sites. Two of the key reclamation issues for these sites are the occurrence of acid forming materials (AFMs) in mine soils, and acid mine drainage (AMD) issuing from mine adits. An AMD treatment system design project for the Forest Queen mine in Colorado's San Juan mountains raised the need for a simple, usable method for analysis of mine land soils, both for suitability as a construction material, and to determine the AFM content and potential for acid release. The authors have developed a simple, stepwise, go/no-go test for the analysis of mine soils. Samples were collected from a variety of sites in the Silverton, CO area, and subjected to three tiers of tests including: paste pH, Eh, and 10% HCl fizz test; then total digestion in HNO{sub 3}/HCl, neutralization potential, exposure to meteoric water, and the toxicity characteristic leaching procedure (TCLP). All elemental analyses were performed with an inductively-coupled plasma (ICP) spectrometer. Elimination of samples via the first two testing tiers left two remaining samples, which were subsequently subjected to column and sequential batch tests, with further elemental analysis by ICP. Based on these tests, one sample was chosen as suitable construction material for the Forest Queen treatment system basins. Further simplification, and testing on two pairs of independent soil samples, resulted in a final analytical method suitable for general use.

  15. Green analytical method development for statin analysis.

    PubMed

    Assassi, Amira Louiza; Roy, Claude-Eric; Perovitch, Philippe; Auzerie, Jack; Hamon, Tiphaine; Gaudin, Karen

    2015-02-01

A green analytical chemistry method was developed for pravastatin, fluvastatin and atorvastatin analysis. An HPLC/DAD method using an ethanol-based mobile phase with octadecyl-grafted silica with various graftings and related column parameters such as particle size, core-shell and monolith formats was studied. Retention, efficiency and detector linearity were optimized. Even for columns with particle sizes under 2 μm, the benefit of keeping efficiency within a large range of flow rates was not obtained with the ethanol-based mobile phase compared to an acetonitrile-based one. Therefore the strategy of shortening analysis by increasing the flow rate induced a decrease in efficiency with the ethanol-based mobile phase. An ODS-AQ YMC column, 50 mm × 4.6 mm, 3 μm, was selected, which showed the best compromise between analysis time, statin separation, and efficiency. HPLC conditions were 1 mL/min, ethanol/formic acid (pH 2.5, 25 mM) (50:50, v/v), thermostated at 40°C. To reduce solvent consumption for sample preparation, 0.5 mg/mL of each statin was found to be the highest concentration that respected detector linearity. These conditions were validated for each statin for content determination in highly concentrated hydro-alcoholic solutions. Solubility higher than 100 mg/mL was found for pravastatin and fluvastatin, whereas for atorvastatin calcium salt the maximum concentration was 2 mg/mL for hydro-alcoholic binary mixtures between 35% and 55% ethanol in water. Using atorvastatin instead of its calcium salt, solubility was improved. Highly concentrated solutions of statins offer a potential fluid for per Buccal Per-Mucous(®) administration, with the advantages of rapid and easy passage of drugs. PMID:25582487

  16. Analysis of methods. [information systems evolution environment

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of model development, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose of the analysis of the modeling methods included in this document is to examine these methods with the goal of including them in an integrated development support environment. To accomplish this, and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity

  17. Meaning and challenges in the practice of multiple therapeutic massage modalities: a combined methods study

    PubMed Central

    2011-01-01

    Background Therapeutic massage and bodywork (TMB) practitioners are predominantly trained in programs that are not uniformly standardized, and in variable combinations of therapies. To date no studies have explored this variability in training and how this affects clinical practice. Methods Combined methods, consisting of a quantitative, population-based survey and qualitative interviews with practitioners trained in multiple therapies, were used to explore the training and practice of TMB practitioners in Alberta, Canada. Results Of the 5242 distributed surveys, 791 were returned (15.1%). Practitioners were predominantly female (91.7%), worked in a range of environments, primarily private (44.4%) and home clinics (35.4%), and were not significantly different from other surveyed massage therapist populations. Seventy-seven distinct TMB therapies were identified. Most practitioners were trained in two or more therapies (94.4%), with a median of 8 and range of 40 therapies. Training programs varied widely in number and type of TMB components, training length, or both. Nineteen interviews were conducted. Participants described highly variable training backgrounds, resulting in practitioners learning unique combinations of therapy techniques. All practitioners reported providing individualized patient treatment based on a responsive feedback process throughout practice that they described as being critical to appropriately address the needs of patients. They also felt that research treatment protocols were different from clinical practice because researchers do not usually sufficiently acknowledge the individualized nature of TMB care provision. Conclusions The training received, the number of therapies trained in, and the practice descriptors of TMB practitioners are all highly variable. In addition, clinical experience and continuing education may further alter or enhance treatment techniques. Practitioners individualize each patient's treatment through a highly

  18. Practical guidelines to select and scale earthquake records for nonlinear response history analysis of structures

    USGS Publications Warehouse

    Kalkan, Erol; Chopra, Anil K.

    2010-01-01

Earthquake engineering practice is increasingly using nonlinear response history analysis (RHA) to demonstrate performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. Presented herein is a modal-pushover-based scaling (MPS) method to scale ground motions for use in nonlinear RHA of buildings and bridges. In the MPS method, the ground motions are scaled to match (to a specified tolerance) a target value of the inelastic deformation of the first-'mode' inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-'mode' pushover analysis. Appropriate for first-'mode' dominated structures, this approach is extended for structures with significant contributions of higher modes by considering elastic deformation of the second-'mode' SDF system in selecting a subset of the scaled ground motions. Based on results presented for two bridges, covering single- and multi-span 'ordinary standard' bridge types, and six buildings, covering low-, mid-, and tall building types in California, the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.

  19. Practical hyperdynamics method for systems with large changes in potential energy.

    PubMed

    Hirai, Hirotoshi

    2014-12-21

A practical hyperdynamics method is proposed to accelerate systems with highly endothermic and exothermic reactions such as hydrocarbon pyrolysis and oxidation reactions. In this method, referred to as the "adaptive hyperdynamics (AHD) method," the bias potential parameters are adaptively updated according to the change in potential energy. The approach is intensively examined for JP-10 (exo-tetrahydrodicyclopentadiene) pyrolysis simulations using the ReaxFF reactive force field. Valid boost parameter ranges are clarified as a result. It is shown that AHD can be used to model pyrolysis at temperatures as low as 1000 K while achieving a boost factor of around 10^5. PMID:25527921

  20. Practical hyperdynamics method for systems with large changes in potential energy

    NASA Astrophysics Data System (ADS)

    Hirai, Hirotoshi

    2014-12-01

A practical hyperdynamics method is proposed to accelerate systems with highly endothermic and exothermic reactions such as hydrocarbon pyrolysis and oxidation reactions. In this method, referred to as the "adaptive hyperdynamics (AHD) method," the bias potential parameters are adaptively updated according to the change in potential energy. The approach is intensively examined for JP-10 (exo-tetrahydrodicyclopentadiene) pyrolysis simulations using the ReaxFF reactive force field. Valid boost parameter ranges are clarified as a result. It is shown that AHD can be used to model pyrolysis at temperatures as low as 1000 K while achieving a boost factor of around 10^5.
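In standard hyperdynamics (the general framework the AHD method builds on), the wall-clock boost is the trajectory average of exp(ΔV/kBT) over the bias potential. A toy numpy check of why a roughly 1 eV bias at 1000 K yields a boost near 10^5 (the bias samples below are illustrative values, not data from the paper):

```python
import numpy as np

K_B = 8.617e-5          # Boltzmann constant, eV/K
T = 1000.0              # temperature from the pyrolysis example, K
beta = 1.0 / (K_B * T)

# Hypothetical bias-potential samples along the biased trajectory (eV)
dV = np.array([0.9, 1.0, 1.1, 1.0, 0.95])

# Hyperdynamics time boost: average of exp(beta * dV) over the trajectory
boost = np.exp(beta * dV).mean()
# A ~1 eV bias at 1000 K gives a boost on the order of 10^5
```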

  1. Entropy Methods For Univariate Distributions in Decision Analysis

    NASA Astrophysics Data System (ADS)

    Abbas, Ali E.

    2003-03-01

One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process is a task that involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice, however, decision-makers cannot readily provide moments for the distribution, and are much more comfortable providing information about the fractiles of the distribution of interest or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds and provide an interpretation for the shape of the maximum entropy distribution subject to fractile constraints (FMED). We also discuss the limitations of the FMED, namely that it is discontinuous and flat over each fractile interval. We present a heuristic approximation for the case where, in addition to its fractiles, the distribution is also known to be continuous, and work through full examples to illustrate the approach.
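The flat, discontinuous shape of the maximum entropy distribution under fractile constraints follows directly: with only fractile information, maximum entropy spreads each probability mass uniformly over its interval. A small numpy illustration with made-up fractiles (the numbers are assumptions, not taken from the paper):

```python
import numpy as np

# Hypothetical elicited fractiles: P(X <= x[k]) = p[k]
x = np.array([0.0, 2.0, 3.0, 6.0, 10.0])    # fractile values (assumed)
p = np.array([0.0, 0.25, 0.50, 0.75, 1.0])  # cumulative probabilities

# Maximum entropy subject only to these constraints is piecewise uniform:
# the density on interval k is f_k = dp_k / dx_k
density = np.diff(p) / np.diff(x)
# density is flat within each interval and jumps at the fractiles,
# which is exactly the discontinuity the paper addresses heuristically
```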

  2. Influence of analysis methods on interpretation of hazard maps.

    PubMed

    Koehler, Kirsten A; Peters, Thomas M

    2013-06-01

Exposure or hazard mapping is becoming increasingly popular among industrial hygienists. Direct-reading instruments used for hazard-mapping data collection are steadily increasing in reliability and portability while decreasing in cost. Exposure measurements made with these instruments generally require no laboratory analysis, although hazard mapping can be a time-consuming process. To inform decision making by industrial hygienists and management, it is crucial that the maps generated from mapping data are as accurate and representative as possible. Currently, it is unclear how many sampling locations are necessary to produce a representative hazard map. As such, researchers typically collect as many points as can be sampled in several hours, and interpolation methods are used to produce higher resolution maps. We have reanalyzed hazard-mapping data sets from three industrial settings to determine which interpolation methods yield the most accurate results. The goal is to provide practicing industrial hygienists with some practical guidelines to generate accurate hazard maps with 'off-the-shelf' mapping software. Visually verifying the fit of the variogram model is crucial for accurate interpolation. Exponential and spherical variogram models performed better than Gaussian models. It was also necessary to diverge from some of the default interpolation parameters, such as the number of bins used for the experimental variogram and whether or not to allow for a nugget effect, to achieve reasonable accuracy of the interpolation for some data sets. PMID:23258453
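The exponential, spherical, and Gaussian variogram models the authors compare have standard closed forms. A minimal numpy sketch using the common textbook parameterisation with nugget, sill, and an effective range (generic definitions, not tied to the paper's software):

```python
import numpy as np

def exponential(h, nugget, sill, range_):
    """Approaches the sill asymptotically; h is the lag distance."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / range_))

def spherical(h, nugget, sill, range_):
    """Reaches the sill exactly at h = range_."""
    hr = np.minimum(h, range_) / range_
    return nugget + (sill - nugget) * (1.5 * hr - 0.5 * hr ** 3)

def gaussian(h, nugget, sill, range_):
    """Parabolic near the origin; often the hardest of the three to fit stably."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * (h / range_) ** 2))
```

Plotting each model against the experimental variogram is one way to do the visual fit check the abstract recommends.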

  3. A Comparative Analysis of Ethnomedicinal Practices for Treating Gastrointestinal Disorders Used by Communities Living in Three National Parks (Korea)

    PubMed Central

    Kim, Hyun; Song, Mi-Jang; Brian, Heldenbrand; Choi, Kyoungho

    2014-01-01

The purpose of this study is to comparatively analyze the ethnomedicinal practices for gastrointestinal disorders within communities in Jirisan National Park, Gayasan National Park, and Hallasan National Park of Korea. Data was collected through participant observations and in-depth interviews with semistructured questionnaires. Comparative analysis was accomplished using the informant consensus factor, fidelity level, and internetwork analysis. A total of 490 ethnomedicinal practices recorded from the communities were classified into 110 families, 176 genera, and 220 species that included plants, animals, fungi, and algae. The informant consensus factor values were highest in the disorder categories of enteritis and gastralgia (1.0), followed by indigestion (0.94), constipation (0.93), and abdominal pain and gastroenteric trouble (0.92). In terms of fidelity levels, 71 plant species showed fidelity levels of 100%. The internetwork analysis between disorders and all medicinal species is grouped in the center by the four categories of indigestion, diarrhea, abdominal pain, and gastroenteric trouble, respectively. Regarding the research methods of this study, the comparative analysis methods will contribute to the availability of orally transmitted ethnomedicinal knowledge. Among the methods of analysis, the use of internetwork analysis as a tool in this study provides informative internetwork maps between gastrointestinal disorders and medicinal species. PMID:25202330
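The informant consensus factor and fidelity level reported above have simple standard definitions in ethnobotany. A sketch of those formulas (the standard definitions, computed on placeholder numbers rather than this study's data):

```python
def informant_consensus_factor(n_use_reports, n_taxa):
    """ICF = (N_ur - N_t) / (N_ur - 1). Equals 1.0 when many informants
    agree on very few species for a disorder category."""
    return (n_use_reports - n_taxa) / (n_use_reports - 1)

def fidelity_level(n_reports_for_purpose, n_reports_total):
    """FL(%) = 100 * N_p / N: share of a species' reports citing one purpose."""
    return 100.0 * n_reports_for_purpose / n_reports_total

# e.g. 50 use-reports all naming a single species gives ICF = 1.0,
# the maximum consensus seen for enteritis and gastralgia above
```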

  4. A cross-sectional mixed methods study protocol to generate learning from patient safety incidents reported from general practice

    PubMed Central

    Carson-Stevens, Andrew; Hibbert, Peter; Avery, Anthony; Butlin, Amy; Carter, Ben; Cooper, Alison; Evans, Huw Prosser; Gibson, Russell; Luff, Donna; Makeham, Meredith; McEnhill, Paul; Panesar, Sukhmeet S; Parry, Gareth; Rees, Philippa; Shiels, Emma; Sheikh, Aziz; Ward, Hope Olivia; Williams, Huw; Wood, Fiona; Donaldson, Liam; Edwards, Adrian

    2015-01-01

Introduction Incident reports contain descriptions of errors and harms that occurred during clinical care delivery. Few observational studies have characterised incidents from general practice, and none of these have been from the England and Wales National Reporting and Learning System. This study aims to describe incidents reported from a general practice care setting. Methods and analysis A general practice patient safety incident classification will be developed to characterise patient safety incidents. A weighted-random sample of 12 500 incidents describing no harm, low harm and moderate harm of patients, and all incidents describing severe harm and death of patients will be classified. Insights from exploratory descriptive statistics and thematic analysis will be combined to identify priority areas for future interventions. Ethics and dissemination The need for ethical approval was waived by the Aneurin Bevan University Health Board research risk review committee given the anonymised nature of the data (ABHB R&D Ref number: SA/410/13). The authors will submit the results of the study to relevant journals and undertake national and international oral presentations to researchers, clinicians and policymakers. PMID:26628526

  5. Analysis Method for Quantifying Vehicle Design Goals

    NASA Technical Reports Server (NTRS)

    Fimognari, Peter; Eskridge, Richard; Martin, Adam; Lee, Michael

    2007-01-01

A document discusses a method for using Design Structure Matrices (DSM), coupled with high-level tools representing important life-cycle parameters, to comprehensively conceptualize a flight/ground space transportation system design by dealing with such variables as performance, up-front costs, downstream operations costs, and reliability. This approach also weighs operational approaches based on their effect on upstream design variables so that it is possible to readily, yet defensibly, establish linkages between operations and these upstream variables. To avoid the large range of problems that have defeated previous methods of dealing with the complex problems of transportation design, and to cut down on the inefficient use of resources, the method described in the document identifies those areas that are of sufficient promise to merit a higher grade of analysis, as well as the linkages at issue between operations and other factors. Ultimately, the system is designed to save resources and time, and allows for the evolution of operable space transportation system technology and of design and conceptual system approach targets.

  6. Multi-Spacecraft Turbulence Analysis Methods

    NASA Astrophysics Data System (ADS)

    Horbury, Tim S.; Osman, Kareem T.

Turbulence is ubiquitous in space plasmas, from the solar wind to supernova remnants, and on scales from the electron gyroradius to interstellar separations. Turbulence is responsible for transporting energy across space and between scales and plays a key role in plasma heating, particle acceleration and thermalisation downstream of shocks. Just as with other plasma processes such as shocks or reconnection, turbulence results in complex, structured and time-varying behaviour which is hard to measure with a single spacecraft. However, turbulence is a particularly hard phenomenon to study because it is usually broadband in nature: it covers many scales simultaneously. One must therefore use techniques to extract information on multiple scales in order to quantify plasma turbulence and its effects. The Cluster orbit takes the spacecraft through turbulent regions with a range of characteristics: the solar wind, magnetosheath, cusp and magnetosphere. In each, the nature of the turbulence (strongly driven or fully evolved; dominated by kinetic effects or largely on fluid scales), as well as the characteristics of the medium (thermalised or not; high or low plasma beta; sub- or super-Alfvénic flow), mean that particular techniques are better suited to the analysis of Cluster data in different locations. In this chapter, we consider a range of methods and how they are best applied to these different regions. Perhaps the most studied turbulent space plasma environment is the solar wind; see Bruno and Carbone [2005] and Goldstein et al. [2005] for recent reviews. This is the case for a number of reasons: it is scientifically important for cosmic ray and solar energetic particle scattering and propagation, for example. However, perhaps the most significant motivations for studying solar wind turbulence are pragmatic: large volumes of high quality measurements are available; the stability of the solar wind on the scales of hours makes it possible to identify statistically stationary intervals to

  7. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  8. Comparing the Effect of Concept Mapping and Conventional Methods on Nursing Students’ Practical Skill Score

    PubMed Central

    Rasoul Zadeh, Nasrin; Sadeghi Gandomani, Hamidreza; Delaram, Masoumeh; Parsa Yekta, Zohre

    2015-01-01

Background: Development of practical skills has remained a serious and considerable challenge in nursing education. Moreover, newly graduated nurses may have weak practical skills, which can be a threat to patients’ safety. Objectives: The present study was conducted to compare the effect of concept mapping and conventional methods on nursing students’ practical skills. Patients and Methods: This quasi-experimental study was conducted on 70 nursing students randomly assigned into two groups of 35 people. The intervention group was taught through the concept mapping method, while the control group was taught using the conventional method. A two-part instrument was used, including a demographic information form and a checklist for direct observation of procedural skills. Descriptive statistics, chi-square, independent samples t-tests and paired t-tests were used to analyze the data. Results: Before education, no significant differences were observed between the two groups in the three skills of cleaning (P = 0.251), injection (P = 0.185) and sterilizing (P = 0.568). The students’ mean scores increased significantly after the education, and the differences between pre- and post-intervention mean scores were significant in both groups (P < 0.001). Moreover, after education, the mean scores of the intervention group were significantly higher than those of the control group in all three skills (P < 0.001). Conclusions: Concept mapping was superior to conventional skill-teaching methods. It is suggested to use concept mapping in teaching practical courses such as fundamentals of nursing. PMID:26576441
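The paired t-test used for the pre/post comparison reduces to the mean of the score differences divided by its standard error. A minimal numpy sketch with hypothetical checklist scores (illustrative numbers, not the study's data):

```python
import numpy as np

def paired_t(pre, post):
    """Paired t statistic: mean difference over its standard error."""
    d = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
    return d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))

# Hypothetical skill-checklist scores for eight students, before and after
pre = np.array([10, 12, 9, 11, 13, 10, 12, 11])
post = np.array([15, 16, 13, 15, 17, 14, 16, 15])
t_stat = paired_t(pre, post)   # compare against a t distribution with n-1 = 7 df
```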

  9. Developing a preliminary ‘never event’ list for general practice using consensus-building methods

    PubMed Central

    de Wet, Carl; O’Donnell, Catherine; Bowie, Paul

    2014-01-01

Background The ‘never event’ concept has been implemented in many acute hospital settings to help prevent serious patient safety incidents. Benefits include increasing awareness of highly important patient safety risks among the healthcare workforce, promoting proactive implementation of preventive measures, and facilitating incident reporting. Aim To develop a preliminary list of never events for general practice. Design and setting Application of a range of consensus-building methods in Scottish and UK general practices. Method A total of 345 general practice team members suggested potential never events. Next, ‘informed’ staff (n = 15) developed criteria for defining never events and applied the criteria to create a list of candidate never events. Finally, UK primary care patient safety ‘experts’ (n = 17) reviewed, refined, and validated a preliminary list via a modified Delphi group and by completing a content validity index exercise. Results There were 721 written suggestions received as potential never events. Thematic categorisation reduced this to 38. Five criteria specific to general practice were developed and applied to produce 11 candidate never events. The expert group endorsed a preliminary list of 10 items with a content validity index (CVI) score of >80%. Conclusion A preliminary list of never events was developed for general practice through practitioner experience and consensus-building methods. This is an important first step to determine the potential value of the never event concept in this setting. It is now intended to undertake further testing of this preliminary list to assess its acceptability, feasibility, and potential usefulness as a safety improvement intervention. PMID:24567655

  10. Searching Usenet for Virtual Communities of Practice: Using Mixed Methods to Identify the Constructs of Wenger's Theory

    ERIC Educational Resources Information Center

    Murillo, Enrique

    2008-01-01

    Introduction: This research set out to determine whether communities of practice can be entirely Internet-based by formally applying Wenger's theoretical framework to Internet collectives. Method: A model of a virtual community of practice was developed which included the constructs Wenger identified in co-located communities of practice: mutual…

  11. A Method of Streamflow Drought Analysis

    NASA Astrophysics Data System (ADS)

    Zelenhasić, Emir; Salvai, Atila

    1987-01-01

A method for completely describing and analyzing the stochastic process of streamflow droughts is recommended. All important components of streamflow droughts, such as deficit, duration, time of occurrence, number of streamflow droughts in a given time interval [0, t], the largest streamflow drought deficit, and the largest streamflow drought duration in a given time interval [0, t], are taken into consideration. A streamflow drought is related here to streamflow deficit. Following the theory of the supremum of a random number of random variables, a stochastic model is presented for the interpretation and analysis of the largest streamflow drought deficit below a given reference discharge and the largest streamflow drought duration in a time interval [0, t], at a given location of a river. The method is based on the assumption that streamflow droughts are independent, identically distributed random variables and that their occurrence is subject to the Poisson probability law. This paper continues the earlier work of E. Zelenhasić (1970, 1979, 1983) and P. Todorović (1970) on extremes in hydrology. The method is applied to the 58-year record of the Sava River at Sr. Mitrovica and the 52-year record of the Tisa River at Senta, Yugoslavia, and good agreement is found between the theoretical and empirical distribution functions for all analyzed drought components for both rivers. Only one complete example, the Sava River at Sr. Mitrovica, is given in the paper. The proposed method deals with hydrograph recessions of daily or instantaneous discharges in the region of low flows, not with the mean annual flows used by other investigators.
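Under the Poisson-occurrence assumption described above, the largest deficit in [0, t] has the distribution F(x; t) = exp{-Λt[1 - H(x)]}, where Λ is the drought occurrence rate and H the deficit distribution. A Monte Carlo sanity check of that closed form (parameter values are illustrative assumptions, not from the Sava or Tisa records):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, scale = 2.0, 5.0, 1.0   # droughts/year, horizon (yr), mean deficit (assumed)

def largest_deficits(n_trials):
    """Largest deficit per trial: a Poisson count of iid exponential deficits
    (zero when no drought occurs in the interval)."""
    counts = rng.poisson(lam * t, n_trials)
    return np.array([rng.exponential(scale, c).max() if c else 0.0 for c in counts])

x = 3.0
empirical = (largest_deficits(100_000) <= x).mean()
# With H exponential: 1 - H(x) = exp(-x / scale)
theoretical = np.exp(-lam * t * np.exp(-x / scale))
```

The empirical exceedance proportion agrees with the closed form to within sampling error, which is the kind of theoretical-vs-empirical agreement the paper reports for the two rivers.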

  12. Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis

    PubMed Central

    Critchfield, Thomas S

    2011-01-01

    Neither practitioners nor scientists appear to be fully satisfied with the world's largest behavior-analytic membership organization. Each community appears to believe that initiatives that serve the other will undermine the association's capacity to serve their own needs. Historical examples suggest that such discord is predicted when practitioners and scientists cohabit the same association. This is true because all professional associations exist to address guild interests, and practice and science are different professions with different guild interests. No association, therefore, can succeed in being all things to all people. The solution is to assure that practice and science communities are well served by separate professional associations. I comment briefly on how this outcome might be promoted. PMID:22532750

  13. Infant-feeding practices among African American women: social-ecological analysis and implications for practice.

    PubMed

    Reeves, Elizabeth A; Woods-Giscombé, Cheryl L

    2015-05-01

    Despite extensive evidence supporting the health benefits of breastfeeding, significant disparities exist between rates of breastfeeding among African American women and women of other races. Increasing rates of breastfeeding among African American women can contribute to the improved health of the African American population by decreasing rates of infant mortality and disease and by enhancing cognitive development. Additionally, higher rates of breastfeeding among African American women could foster maternal-child bonding and could contribute to stronger families, healthier relationships, and emotionally healthier adults. The purpose of this article is twofold: (a) to use the social-ecological model to explore the personal, socioeconomic, psychosocial, and cultural factors that affect the infant feeding decision-making processes of African American women and (b) to discuss the implications of these findings for clinical practice and research to eliminate current disparities in rates of breastfeeding. PMID:24810518

  14. Practice size and quality attainment under the new GMS contract: a cross-sectional analysis

    PubMed Central

    Wang, Yingying; O'Donnell, Catherine A; Mackay, Daniel F; Watt, Graham CM

    2006-01-01

    Background The Quality and Outcomes Framework (QOF) of the new General Medical Services contract, for the first time, incentivises certain areas of general practice workload over others. The ability of practices to deliver high quality care may be related to the size of the practice itself. Aim To explore the relationship between practice size and points attained in the QOF. Design of study Cross-sectional analyses of routinely available data. Setting Urban general practice in mainland Scotland. Method QOF points and disease prevalence were obtained for all urban general practices in Scotland (n = 638) and linked to data on the practice, GP and patient population. The relationship between QOF point attainment, disease prevalence and practice size was examined using univariate statistical analyses. Results Smaller practices were more likely to be located in areas of socioeconomic deprivation; had patients with poorer health; and were less likely to participate in voluntary practice-based quality schemes. Overall, smaller practices received fewer QOF points compared to larger practices (P = 0.003), due to lower point attainment in the organisational domain (P = 0.002). There were no differences across practice size in the other domains of the QOF, including clinical care. Smaller practices reported higher levels of chronic obstructive pulmonary disease (COPD) and mental health conditions and lower levels of asthma, epilepsy and hypothyroidism. There was no difference in the reported prevalence of hypertension or coronary heart disease (CHD) across practices, in contrast to CHD mortality for patients aged under 70 years, where the mortality rate was 40% greater for single-handed practices compared with large practices. Conclusions Although smaller practices obtained fewer points than larger practices under the QOF, this was due to lower scores in the organisational domain of the contract rather than to lower scores for clinical care. Single-handed practices, in common

  15. Accuracy Analysis of the PIC Method

    NASA Astrophysics Data System (ADS)

    Verboncoeur, J. P.; Cartwright, K. L.

    2000-10-01

    The discretization errors for many steps of the classical Particle-in-Cell (PIC) model have been well-studied (C. K. Birdsall and A. B. Langdon, Plasma Physics via Computer Simulation, McGraw-Hill, New York, NY (1985).) (R. W. Hockney and J. W. Eastwood, Computer Simulation Using Particles, McGraw-Hill, New York, NY (1981).). In this work, the errors in the interpolation algorithms, which provide the connection between continuum particles and discrete fields, are described in greater detail. In addition, the coupling of errors between steps in the method is derived. The analysis is carried out for both electrostatic and electromagnetic PIC models, and the results are demonstrated using a bounded one-dimensional electrostatic PIC code (J. P. Verboncoeur et al., J. Comput. Phys. 104, 321-328 (1993).), as well as a bounded two-dimensional electromagnetic PIC code (J. P. Verboncoeur et al., Comp. Phys. Comm. 87, 199-211 (1995).).
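    The particle-grid interpolation step whose errors are analyzed can be illustrated with first-order (cloud-in-cell) weighting, the standard linear scheme described in the PIC textbooks cited above; the grid and particle values are illustrative:

```python
def deposit_linear(positions, charges, n_cells, dx):
    """First-order (cloud-in-cell) charge assignment: each particle's
    charge is split linearly between its two nearest grid points.
    Applying the same weights in reverse to interpolate grid fields back
    to the particle keeps the force interpolation consistent with the
    charge deposit, a key point in PIC error analysis."""
    rho = [0.0] * (n_cells + 1)
    for x, q in zip(positions, charges):
        j = int(x / dx)          # left grid index
        w = x / dx - j           # fractional distance past the left point
        rho[j] += q * (1.0 - w) / dx
        rho[j + 1] += q * w / dx
    return rho

# Two illustrative particles on a 4-cell grid with unit spacing.
rho = deposit_linear([0.25, 1.6], [1.0, -2.0], n_cells=4, dx=1.0)
print(rho)
```

    Linear weighting conserves charge exactly (the grid integral of rho equals the total particle charge) while introducing the interpolation error that the paper quantifies.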

  16. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
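    The search for high-risk, least-effort paths in such a weighted attack graph can be sketched with a plain shortest-path computation; the attack states and effort weights below are hypothetical, not taken from the patent:

```python
import heapq

def cheapest_attack_path(graph, start, goal):
    """Dijkstra search over an attack graph: nodes are attack states,
    edge weights model attacker effort. The cheapest path is the
    highest-risk path, i.e. the one to prioritise for countermeasures."""
    pq = [(0.0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

# Hypothetical attack states; weights are attacker effort units.
graph = {
    "outside": [("dmz_host", 2.0), ("phished_user", 1.0)],
    "dmz_host": [("db_server", 5.0)],
    "phished_user": [("db_server", 8.0)],
}
cost, path = cheapest_attack_path(graph, "outside", "db_server")
print(cost, path)
```

    In the patented tool the graph is generated by matching attack templates against the configuration file; here the graph is written out by hand only to show the path-ranking step.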

  17. Radioisotope method of compound flow analysis

    NASA Astrophysics Data System (ADS)

    Petryka, Leszek; Zych, Marcin; Hanus, Robert; Sobota, Jerzy; Vlasak, Pavel; Malczewska, Beata

    2015-05-01

    The paper presents the application of gamma radiation to the analysis of a multicomponent or multiphase flow. Information such as the content of a selected component in a mixture transported through a pipe is crucial in many industrial and laboratory installations. A properly selected sealed radioactive source and collimators deliver a photon beam that penetrates the cross section of the flow. Detectors mounted on the side of the pipe opposite the source record digital signals representing the composition of the stream. Given the present state of electronics, detectors, and computer software, significant progress in this field may be observed. The paper describes the application of this method to the optimization and control of the hydrotransport of solid particles and proposes monitoring that helps prevent pipe clogging and dangerous oscillations.
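    The measurement principle behind such gamma-beam composition analysis is Beer-Lambert attenuation: for a two-component mixture along a beam path of length L, the transmitted intensity is I = I0 * exp(-L * (alpha*mu1 + (1 - alpha)*mu2)), which can be inverted for the volume fraction alpha of one component. A minimal sketch with illustrative attenuation coefficients, not values from the paper:

```python
import math

def component_fraction(I, I0, L, mu1, mu2):
    """Invert Beer-Lambert attenuation for a two-component mixture:
    I = I0 * exp(-L * (alpha*mu1 + (1 - alpha)*mu2)), solved for the
    volume fraction alpha of component 1."""
    mu_eff = math.log(I0 / I) / L    # effective attenuation coefficient
    return (mu_eff - mu2) / (mu1 - mu2)

# Illustrative values: mu in 1/cm (solids attenuate more than water),
# beam path L in cm, unattenuated count rate I0.
mu1, mu2, L, I0 = 0.50, 0.085, 20.0, 1.0e4
alpha_true = 0.3
I = I0 * math.exp(-L * (alpha_true * mu1 + (1 - alpha_true) * mu2))
alpha = component_fraction(I, I0, L, mu1, mu2)
print(alpha)
```

    In practice count statistics and scattered photons perturb I, so the recovered fraction carries measurement noise; the forward-then-invert round trip here only checks the algebra.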

  18. Introduction of the carbon dioxide absorption method with closed circle breathing into anesthesia practice.

    PubMed

    Foregger, R

    2000-07-01

    The circle breathing CO2 absorption system for use during acetylene anesthesia was described by Carl Gauss in 1924/1925. The apparatus was manufactured by Drägerwerk of Lübeck. A considerable number of publications on the apparatus employing the closed circle method of CO2 absorption appeared in the medical press soon thereafter. Later apparatus models, also built by Drägerwerk, were adapted for nitrous oxide-oxygen-ether anesthesia and introduced into practice by Paul Sudeck and Helmut Schmidt. Information about all this was transmitted to America through the German medical press, including the Draeger-Hefte. American anesthesia machine manufacturers began to develop closed circle CO2 absorbers several years later. Claims that the circle breathing CO2 absorption method was introduced into anesthesia practice by Brian Sword are not valid. PMID:10969391

  19. Gap analysis: Concepts, methods, and recent results

    USGS Publications Warehouse

    Jennings, M.D.

    2000-01-01

    Rapid progress is being made in the conceptual, technical, and organizational requirements for generating synoptic multi-scale views of the earth's surface and its biological content. Using the spatially comprehensive data that are now available, researchers, land managers, and land-use planners can, for the first time, quantitatively place landscape units - from general categories such as 'Forests' or 'Cold-Deciduous Shrubland Formation' to more specific categories such as 'Picea glauca-Abies balsamea-Populus spp. Forest Alliance' - in their large-area contexts. The National Gap Analysis Program (GAP) has developed the technical and organizational capabilities necessary for the regular production and analysis of such information. This paper provides a brief overview of concepts and methods as well as some recent results from the GAP projects. Clearly, new frameworks for biogeographic information and organizational cooperation are needed if we are to have any hope of documenting the full range of species occurrences and ecological processes in ways meaningful to their management. The GAP experience provides one model for achieving these new frameworks.

  20. Comparison and cost analysis of drinking water quality monitoring requirements versus practice in seven developing countries.

    PubMed

    Crocker, Jonny; Bartram, Jamie

    2014-07-01

    Drinking water quality monitoring programs aim to support provision of safe drinking water by informing water quality management. Little evidence or guidance exists on best monitoring practices for low resource settings. Lack of financial, human, and technological resources reduces a country's ability to monitor water supply. Monitoring activities were characterized in Cambodia, Colombia, India (three states), Jordan, Peru, South Africa, and Uganda according to water sector responsibilities, monitoring approaches, and marginal cost. The seven study countries were selected to represent a range of low resource settings. The focus was on monitoring of microbiological parameters, such as E. coli, coliforms, and H2S-producing microorganisms. Data collection involved qualitative and quantitative methods. Across seven study countries, few distinct approaches to monitoring were observed, and in all but one country all monitoring relied on fixed laboratories for sample analysis. Compliance with monitoring requirements was highest for operational monitoring of large water supplies in urban areas. Sample transport and labor for sample collection and analysis together constitute approximately 75% of marginal costs, which exclude capital costs. There is potential for substantive optimization of monitoring programs by considering field-based testing and by fundamentally reconsidering monitoring approaches for non-piped supplies. This is the first study to look quantitatively at water quality monitoring practices in multiple developing countries. PMID:25046632

  1. Health Education Specialist Practice Analysis 2015 (HESPA 2015): Process and Outcomes.

    PubMed

    McKenzie, James F; Dennis, Dixie; Auld, M Elaine; Lysoby, Linda; Doyle, Eva; Muenzen, Patricia M; Caro, Carla M; Kusorgbor-Narh, Cynthia S

    2016-06-01

    The Health Education Specialist Practice Analysis 2015 (HESPA 2015) was conducted to update and validate the Areas of Responsibilities, Competencies, and Sub-competencies for Entry- and Advanced-Level Health Education Specialists. Two data collection instruments were developed-one was focused on Sub-competencies and the other on knowledge items related to the practice of health education. Instruments were administered to health education specialists (N = 3,152) using online survey methods. A total of 2,508 survey participants used 4-point ordinal scales to rank Sub-competencies by frequency of use and importance. The other 644 participants used the same 4-point frequency scale to rank related knowledge items. Composite scores for Sub-competencies were calculated and subgroup comparisons were conducted that resulted in the validation of 7 Areas of Responsibilities, 36 Competencies, and 258 Sub-competencies. Of the Sub-competencies, 141 were identified as Entry-level, 76 Advanced 1-level, and 41 Advanced 2-level. In addition, 131 knowledge items were verified. The HESPA 2015 findings are compared with the results of the Health Education Job Analysis 2010 and will be useful to those involved in professional preparation, continuing education, and employment of health education specialists. PMID:27107427

  2. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    NASA Technical Reports Server (NTRS)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.

  3. Integrated force method versus displacement method for finite element analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, S. N.; Berke, L.; Gallagher, R. H.

    1991-01-01

    A novel formulation termed the integrated force method (IFM) has been developed in recent years for analyzing structures. In this method all the internal forces are taken as independent variables, and the system equilibrium equations (EEs) are integrated with the global compatibility conditions (CCs) to form the governing set of equations. In IFM the CCs are obtained from the strain formulation of St. Venant, and no choices of redundant load systems have to be made, in contrast to the standard force method (SFM). This property of IFM allows the generation of the governing equation to be automated straightforwardly, as it is in the popular stiffness method (SM). In this report IFM and SM are compared relative to the structure of their respective equations, their conditioning, required solution methods, overall computational requirements, and convergence properties as these factors influence the accuracy of the results. Overall, this new version of the force method produces more accurate results than the stiffness method for comparable computational cost.

  4. Integrated force method versus displacement method for finite element analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Berke, Laszlo; Gallagher, Richard H.

    1990-01-01

    A novel formulation termed the integrated force method (IFM) has been developed in recent years for analyzing structures. In this method all the internal forces are taken as independent variables, and the system equilibrium equations (EE's) are integrated with the global compatibility conditions (CC's) to form the governing set of equations. In IFM the CC's are obtained from the strain formulation of St. Venant, and no choices of redundant load systems have to be made, in contrast to the standard force method (SFM). This property of IFM allows the generation of the governing equation to be automated straightforwardly, as it is in the popular stiffness method (SM). In this report IFM and SM are compared relative to the structure of their respective equations, their conditioning, required solution methods, overall computational requirements, and convergence properties as these factors influence the accuracy of the results. Overall, this new version of the force method produces more accurate results than the stiffness method for comparable computational cost.

  5. Reform-based science teaching: A mixed-methods approach to explaining variation in secondary science teacher practice

    NASA Astrophysics Data System (ADS)

    Jetty, Lauren E.

    The purpose of this two-phase, sequential explanatory mixed-methods study was to understand and explain the variation seen in secondary science teachers' enactment of reform-based instructional practices. Utilizing teacher socialization theory, this mixed-methods analysis was conducted to determine the relative influence of secondary science teachers' characteristics, backgrounds and experiences across their teacher development to explain the range of teaching practices exhibited by graduates from three reform-oriented teacher preparation programs. Data for this study were obtained from the Investigating the Meaningfulness of Preservice Programs Across the Continuum of Teaching (IMPPACT) Project, a multi-university, longitudinal study funded by NSF. In the first quantitative phase of the study, data for the sample (N=120) were collected from three surveys from the IMPPACT Project database. Hierarchical multiple regression analysis was used to examine the separate as well as the combined influence of factors such as teachers' personal and professional background characteristics, beliefs about reform-based science teaching, feelings of preparedness to teach science, school context, school culture and climate of professional learning, and influences of the policy environment on the teachers' use of reform-based instructional practices. Findings indicate three blocks of variables (professional background, beliefs/efficacy, and local school context) contributed significantly to explaining nearly 38% of the variation in secondary science teachers' use of reform-based instructional practices. The five variables that significantly contributed to explaining variation in teachers' use of reform-based instructional practices in the full model were university of teacher preparation, sense of preparation for teaching science, the quality of professional development, science-content-focused professional development, and the perceived level of professional autonomy. Using the results
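    The block-wise logic of hierarchical multiple regression can be sketched as nested least-squares fits, with each block's contribution read off as the increment in R-squared; the data below are synthetic, and the block names merely echo the study's factor groupings:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(0)
n = 200
block1 = rng.normal(size=(n, 2))   # e.g. background characteristics
block2 = rng.normal(size=(n, 2))   # e.g. beliefs/efficacy
y = block1 @ [0.5, 0.3] + block2 @ [0.4, 0.0] + rng.normal(scale=1.0, size=n)

r2_b1 = r_squared(block1, y)                               # block 1 alone
r2_b12 = r_squared(np.column_stack([block1, block2]), y)   # blocks 1 + 2
print(r2_b1, r2_b12 - r2_b1)   # R^2 of block 1, increment from block 2
```

    The increment in R-squared at each step is what studies like this report as a block's "significant contribution" to explained variation.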

  6. Method and apparatus for frequency spectrum analysis

    NASA Technical Reports Server (NTRS)

    Cole, Steven W. (Inventor)

    1992-01-01

    A method for real-time frequency spectrum analysis of an unknown signal is discussed. The method is based upon integration of 1-bit samples of signal voltage amplitude corresponding to sine or cosine phases of a controlled center-frequency clock, which is changed after each integration interval to sweep the frequency range of interest in steps. Integration of samples during each interval is carried out over a number of cycles of the center-frequency clock spanning a number of cycles of the input signal to be analyzed. The invention may be used to detect the frequencies of at least two signals simultaneously. A reference signal of known frequency and voltage amplitude is added to the two signals and processed in parallel in the same way, but in a separate channel sampled at the known frequency and phases of the reference signal. The sine and cosine integrals of each channel are squared and summed to obtain relative power measurements in all three channels; an absolute voltage measurement for each of the other two signals is then obtained by multiplying the known voltage of the reference signal by the ratio of that signal's relative power to the relative power of the reference signal.
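    The integration scheme can be sketched as follows: quantize the input to 1-bit samples, correlate them with the signs of sine and cosine of the stepped center-frequency clock, and take the sum of the squared integrals as a relative power measure; the step with the largest power marks a signal frequency. A minimal sketch with illustrative parameters:

```python
import math

def one_bit_power(samples, fs, f_center):
    """Relative power at f_center from 1-bit samples: integrate the
    sample signs against the signs of sine/cosine of the center clock,
    then square and sum the two integrals."""
    s = c = 0
    for n, v in enumerate(samples):
        phase = 2 * math.pi * f_center * n / fs
        bit = 1 if v >= 0 else -1
        s += bit * (1 if math.sin(phase) >= 0 else -1)
        c += bit * (1 if math.cos(phase) >= 0 else -1)
    return s * s + c * c

# Illustrative: a 130 Hz tone sampled at 1 kHz, swept in 10 Hz steps.
fs, f_true = 1000.0, 130.0
samples = [math.sin(2 * math.pi * f_true * n / fs) for n in range(2000)]
powers = {f: one_bit_power(samples, fs, f) for f in range(50, 300, 10)}
best = max(powers, key=powers.get)
print(best)
```

    Because only signs are integrated, the per-sample work is a handful of additions, which is what makes the approach attractive for real-time hardware.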

  7. Interpolation methods for shaped reflector analysis

    NASA Technical Reports Server (NTRS)

    Galindo-Israel, Victor; Imbriale, William A.; Rahmat-Samii, Yahya; Veruttipong, Thavath

    1988-01-01

    The diffraction analysis of reflector surfaces which are described only at a discrete set of locations usually leads to the requirement of an interpolation to determine the surface characteristics over a continuum of locations. Two methods of interpolation, the global and the local methods, are presented. The global interpolation representation is a closed-form or series expression valid over the entire surface. The coefficients of a series expression are found by an integration of all of the raw data. Since the number of coefficients used to describe the surface is much smaller than the number of raw data points, the integration effectively provides a smoothing of the raw data. The local interpolation provides a closed-form expression for only a small area of the reflector surface. The subreflector is divided into sectors each of which has constant discretized data. Each area segment is then locally described by a two-dimensional quadratic surface. The second derivative data give the desired smoothed values.
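    The smoothing behavior of the global method, where far fewer series coefficients than raw data points act as a least-squares filter on measurement noise, can be sketched in one dimension; the cubic profile and noise level are synthetic, purely illustrative:

```python
import numpy as np

# 200 noisy samples of a smooth profile, fitted with only 4 series
# coefficients: the least-squares projection averages out the noise.
rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 200)
true = 0.5 * x**3 - 0.2 * x        # stand-in for one cut of the surface
noisy = true + rng.normal(scale=0.05, size=x.size)

coeffs = np.polynomial.polynomial.polyfit(x, noisy, deg=3)   # 4 coefficients
smooth = np.polynomial.polynomial.polyval(x, coeffs)

rms_noise = np.sqrt(np.mean((noisy - true) ** 2))
rms_fit = np.sqrt(np.mean((smooth - true) ** 2))
print(rms_fit < rms_noise)   # the fitted profile is closer to the truth
```

    A local interpolation would instead fit a low-order surface over each small sector, trading this global smoothing for a closed form valid only in the sector's neighborhood.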

  8. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Service, Dairy Programs, or Official Methods of Analysis of the Association of Analytical Chemists... 7 Agriculture 3, 2011-01-01. § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods...

  9. Ad hoc supervision of general practice registrars as a 'community of practice': analysis, interpretation and re-presentation.

    PubMed

    Clement, T; Brown, J; Morrison, J; Nestel, D

    2016-05-01

    General practice registrars in Australia undertake most of their vocational training in accredited general practices. They typically see patients alone from the start of their community-based training and are expected to seek timely ad hoc support from their supervisor. Such ad hoc encounters are a mechanism for ensuring patient safety, but also provide an opportunity for learning and teaching. Wenger's (Communities of practice: learning, meaning, and identity. Cambridge University Press, New York, 1998) social theory of learning ('communities of practice') guided a secondary analysis of audio-recordings of ad hoc encounters. Data from one encounter is re-presented as an extended sequence to maintain congruence with the theoretical perspective and enhance vicariousness. An interpretive commentary communicates key features of Wenger's theory and highlights the researchers' interpretations. We argue that one encounter can reveal universal understandings of clinical supervision and that the process of naturalistic generalisation allows readers to transfer others' experiences to their own contexts. The paper raises significant analytic, interpretive, and representational issues. We highlight that report writing is an important, but infrequently discussed, part of research design. We discuss the challenges of supporting the learning and teaching that arises from adopting a socio-cultural lens and argue that such a perspective importantly captures the complex range of issues that work-based practitioners have to grapple with. This offers a challenge to how we research and seek to influence work-based learning and teaching in health care settings. PMID:26384813

  10. Empirical Analysis of Green Supply Chain Management Practices in Indian Automobile Industry

    NASA Astrophysics Data System (ADS)

    Luthra, S.; Garg, D.; Haleem, A.

    2014-04-01

    Environmental sustainability and green environmental issues enjoy increasing popularity among researchers and supply chain practitioners. An attempt has been made to identify and empirically analyze green supply chain management (GSCM) practices in the Indian automobile industry. Six main GSCM practices (comprising 37 sub-practices) and four expected performance outcomes (comprising 16 performance measures) of implementing GSCM practices were identified from a literature review. A questionnaire-based survey was conducted to validate these practices and performance outcomes. A total of 123 completed questionnaires were collected from Indian automobile organizations and used for the empirical analysis of GSCM practices in the Indian automobile industry. Descriptive statistics were used to establish the current implementation status of GSCM practices in the Indian automobile industry, and multiple regression analysis was carried out to determine the impact of the currently adopted GSCM practices on the expected organizational performance outcomes. The results of the study suggest that environmental, economic, social and operational performances improve with the implementation of GSCM practices. This paper may play an important role in understanding various GSCM implementation issues and may help practicing managers to improve their performance in the supply chain.

  11. Introducing and Integrating Gifted Education into an Existing Independent School: An Analysis of Practice

    ERIC Educational Resources Information Center

    McKibben, Stephen

    2013-01-01

    In this analysis of practice, I conduct a combination formative and summative program evaluation of an initiative introduced to serve gifted learners at The Ocean School (TOS), an independent, Pre-K-grade 8 day school located in a rural area of the West Coast. Using the best practices as articulated by the National Association of Gifted Children…

  12. Nursing Faculty Decision Making about Best Practices in Test Construction, Item Analysis, and Revision

    ERIC Educational Resources Information Center

    Killingsworth, Erin Elizabeth

    2013-01-01

    With the widespread use of classroom exams in nursing education there is a great need for research on current practices in nursing education regarding this form of assessment. The purpose of this study was to explore how nursing faculty members make decisions about using best practices in classroom test construction, item analysis, and revision in…

  13. Common Goals for the Science and Practice of Behavior Analysis: A Response to Critchfield

    ERIC Educational Resources Information Center

    Schneider, Susan M.

    2012-01-01

    In his scholarly and thoughtful article, "Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis," Critchfield (2011) discussed the science-practice frictions to be expected in any professional organization that attempts to combine these interests. He suggested that the Association for Behavior Analysis…

  14. Reporting Practices in Confirmatory Factor Analysis: An Overview and Some Recommendations

    ERIC Educational Resources Information Center

    Jackson, Dennis L.; Gillaspy, J. Arthur, Jr.; Purc-Stephenson, Rebecca

    2009-01-01

    Reporting practices in 194 confirmatory factor analysis studies (1,409 factor models) published in American Psychological Association journals from 1998 to 2006 were reviewed and compared with established reporting guidelines. Three research questions were addressed: (a) how do actual reporting practices compare with published guidelines? (b) how…

  15. A practical field extraction method for non-invasive monitoring of hormone activity in the black rhinoceros.

    PubMed

    Edwards, Katie L; McArthur, Hannah M; Liddicoat, Tim; Walker, Susan L

    2014-01-01

    Non-invasive hormone analysis is a vital tool in assessing an animal's adrenal and reproductive status, which can be beneficial to in situ and ex situ conservation. However, it can be difficult to employ these techniques when monitoring in situ populations away from controlled laboratory conditions, when electricity is not readily available. A practical method for processing faecal samples in the field, which enables samples to be extracted soon after defaecation and stored in field conditions for prolonged periods prior to hormone analysis, is therefore warranted. This study describes the development of an optimal field extraction method, which includes hand-shaking faecal material in 90% methanol, before loading this extract in a 40% solvent onto HyperSep™ C8 solid-phase extraction cartridges, stored at ambient temperatures. This method was successfully validated for measurement of adrenal and reproductive hormone metabolites in faeces of male and female black rhinoceros (Diceros bicornis) and was rigorously tested in controlled laboratory and simulated field conditions. All the hormones tested demonstrated between 83 and 94% and between 42 and 89% recovery of synthetic and endogenous hormone metabolites, respectively, with high precision of replication. Furthermore, results obtained following the developed optimal field extraction method were highly correlated with the control laboratory method. Cartridges can be stored at ambient (cool, dry or warm, humid) conditions for periods of up to 6 months without degradation, before re-extraction of hormone metabolites for analysis by enzyme immunoassay. The described method has great potential to be applied to monitor faecal reproductive and adrenal hormone metabolites in a wide variety of species and allows samples to be stored in the field for up to 6 months prior to analysis. This provides the opportunity to investigate hormone relationships within in situ populations, where equipment and facilities may

  16. Practical use of three-dimensional inverse method for compressor blade design

    SciTech Connect

    Damle, S.; Dang, T.; Stringham, J.; Razinsky, E.

    1999-04-01

    The practical utility of a three-dimensional inverse viscous method is demonstrated by carrying out a design modification of a first-stage rotor in an industrial compressor. In this design modification study, the goal is to improve the efficiency of the original blade while retaining its overall aerodynamic, structural, and manufacturing characteristics. By employing a simple modification to the blade pressure loading distribution (which is the prescribed flow quantity in this inverse method), the modified blade geometry is predicted to perform better than the original design over a wide range of operating points, including an improvement in choke margin.

  17. Instructional methods used by health sciences librarians to teach evidence-based practice (EBP): a systematic review*†‡

    PubMed Central

    Swanberg, Stephanie M.; Dennison, Carolyn Ching; Farrell, Alison; Machel, Viola; Marton, Christine; O'Brien, Kelly K.; Pannabecker, Virginia; Thuna, Mindy; Holyoke, Assako Nitta

    2016-01-01

    Background Librarians often teach evidence-based practice (EBP) within health sciences curricula. It is not known what teaching methods are most effective. Methods A systematic review of the literature was conducted searching CINAHL, EMBASE, ERIC, LISTA, PubMed, Scopus, and others. Searches were completed through December 2014. No limits were applied. Hand searching of Medical Library Association annual meeting abstracts from 2009–2014 was also completed. Studies must be about EBP instruction by a librarian within undergraduate or graduate health sciences curricula and include skills assessment. Studies with no assessment, letters and comments, and veterinary education studies were excluded. Data extraction and critical appraisal were performed to determine the risk of bias of each study. Results Twenty-seven studies were included for analysis. Studies occurred in the United States (20), Canada (3), the United Kingdom (1), and Italy (1), with 22 in medicine and 5 in allied health. Teaching methods included lecture (20), small group or one-on-one instruction (16), computer lab practice (15), and online learning (6). Assessments were quizzes or tests, pretests and posttests, peer-review, search strategy evaluations, clinical scenario assignments, or a hybrid. Due to large variability across studies, meta-analysis was not conducted. Discussion Findings were weakly significant for positive change in search performance for most studies. Only one study compared teaching methods, and no one teaching method proved more effective. Future studies could conduct multisite interventions using randomized or quasi-randomized controlled trial study design and standardized assessment tools to measure outcomes. PMID:27366120

  18. A practical material decomposition method for x-ray dual spectral computed tomography.

    PubMed

    Hu, Jingjing; Zhao, Xing

    2016-03-17

    X-ray dual spectral CT (DSCT) scans the measured object with two different x-ray spectra, and the acquired rawdata can be used to perform the material decomposition of the object. Direct calibration methods allow a faster material decomposition for DSCT and can be separated into two groups: image-based and rawdata-based. The image-based method is an approximative method, and beam hardening artifacts remain in the resulting material-selective images. The rawdata-based method generally obtains better image quality than the image-based method, but it requires geometrically consistent rawdata. However, today's clinical dual energy CT scanners usually measure different rays for different energy spectra and acquire geometrically inconsistent rawdata sets, and thus cannot meet this requirement. This paper proposes a practical material decomposition method to perform rawdata-based material decomposition in the case of inconsistent measurement. This method first yields the desired consistent rawdata sets from the measured inconsistent rawdata sets, and then employs the rawdata-based technique to perform material decomposition and reconstruct material-selective images. The proposed method was evaluated using simulated FORBILD thorax phantom rawdata and dental CT rawdata, and simulation results indicate that this method can produce highly quantitative DSCT images in the case of inconsistent DSCT measurements. PMID:27257878

  19. Language Ideology or Language Practice? An Analysis of Language Policy Documents at Swedish Universities

    ERIC Educational Resources Information Center

    Björkman, Beyza

    2014-01-01

    This article presents an analysis and interpretation of language policy documents from eight Swedish universities with regard to intertextuality, authorship and content analysis of the notions of language practices and English as a lingua franca (ELF). The analysis is then linked to Spolsky's framework of language policy, namely language…

  20. Computing the period of light variability in blazar objects using the periodogram spectral analysis method

    NASA Astrophysics Data System (ADS)

    Tang, J.; Zhang, X.; Wu, L.

    2007-10-01

    The periodogram spectral analysis method for equally spaced data is discussed, and the method is tested with model signals. The effectiveness of the periodogram spectral analysis is confirmed by application to noise series. The method has been applied to analyze the periods of the blazars 3C 279 and 3C 345 and the BL Lac objects OJ 287 and ON 231. Their periods are 7.14 yr, 10.00 yr, 11.76 yr, and 6.80 yr, which are consistent with values reported elsewhere using the Jurkevich method. The results are satisfactory, and the obtained periods are helpful for understanding the physical mechanisms of blazars. The paper also analyzes the influence of the window function, and the advantages and disadvantages of the methods are discussed for practical applications. The application results indicate that, in comparison with traditional prediction methods, the prediction method used in this paper has higher accuracy; thus it has theoretical meaning and practical value for predicting periods of light variation.
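    The classical FFT periodogram for equally spaced data that the abstract describes can be sketched as below; the synthetic light curve, sampling step, and injected 10-unit period are illustrative assumptions, not data from the paper.

```python
import numpy as np

def periodogram_period(signal, dt):
    """Return the period of the dominant spectral peak of an equally
    spaced series, using the classical FFT periodogram."""
    n = len(signal)
    # Power at each non-negative frequency; subtract the mean so the
    # DC term does not dominate.
    power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    freqs = np.fft.rfftfreq(n, d=dt)
    k = np.argmax(power[1:]) + 1  # skip the f = 0 bin
    return 1.0 / freqs[k]

# Model signal: a sinusoid with a 10-unit period plus noise, as a
# sanity check of the recovery.
rng = np.random.default_rng(0)
t = np.arange(0, 200, 0.5)
flux = np.sin(2 * np.pi * t / 10.0) + 0.3 * rng.standard_normal(t.size)
print(round(periodogram_period(flux, dt=0.5), 2))  # → 10.0
```

    A real light curve would first need detrending and, if unevenly sampled, a Lomb-Scargle periodogram instead of the plain FFT.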

  1. Trends in vasectomy. Analysis of one teaching practice.

    PubMed Central

    Reynolds, J. L.

    1998-01-01

    PROBLEM BEING ADDRESSED: How can a teaching practice develop a referral service and incorporate educational opportunities for family medicine residents, clinical clerks, and community family physicians? OBJECTIVE OF PROGRAM: To develop a high-quality vasectomy service within a teaching practice to change the surgical procedure to the no-scalpel vasectomy (NSV) technique; to educate family medicine residents, clinical clerks, and community family physicians about vasectomy and the NSV technique; and to monitor outcomes and compare them with published results. MAIN COMPONENTS OF PROGRAM: The program took place in an urban family medicine residency program. Data on number of procedures, types of patients choosing vasectomy, and outcomes are presented, along with information on number of learners who viewed, assisted with, or became competent to perform NSV. CONCLUSIONS: A few family medicine residents and some interested community physicians could be trained to perform NSV competently. Involving learners in the procedure does not seem to change the rate of complications. PMID:9559195

  2. Student Teachers' Outlook on Teaching: A Content Analysis of Their Reflective Reports on Experiences in Practice.

    ERIC Educational Resources Information Center

    Beijaard, D.; And Others

    1997-01-01

    Content analysis of 34 student teachers' reflections on practice found that most were descriptive, making it difficult to discern their learning processes. Field-based teacher education should promote more structured ways of thinking and reflecting. (SK)

  3. Grid Analysis and Display System (GrADS): A practical tool for earth science visualization

    NASA Technical Reports Server (NTRS)

    Kinter, James L., III; Doty, Brian E.

    1991-01-01

    Viewgraphs on grid analysis and display system (GrADS): a practical tool for earth science visualization are presented. Topics covered include: GrADS design goals; data sets; and temperature profiles.

  4. Flutter and Divergence Analysis using the Generalized Aeroelastic Analysis Method

    NASA Technical Reports Server (NTRS)

    Edwards, John W.; Wieseman, Carol D.

    2003-01-01

    The Generalized Aeroelastic Analysis Method (GAAM) is applied to the analysis of three well-studied checkcases: restrained and unrestrained airfoil models, and a wing model. An eigenvalue iteration procedure is used for converging upon roots of the complex stability matrix. For the airfoil models, exact root loci are given which clearly illustrate the nature of the flutter and divergence instabilities. The singularities involved are enumerated, including an additional pole at the origin for the unrestrained airfoil case and the emergence of an additional pole on the positive real axis at the divergence speed for the restrained airfoil case. Inconsistencies and differences among published aeroelastic root loci and the new, exact results are discussed and resolved. The generalization of a Doublet Lattice Method computer code is described and the code is applied to the calculation of root loci for the wing model for incompressible and for subsonic flow conditions. The error introduced in the reduction of the singular integral equation underlying the unsteady lifting surface theory to a linear algebraic equation is discussed. Acknowledging this inherent error, the solutions of the algebraic equation by GAAM are termed 'exact.' The singularities of the problem are discussed and exponential series approximations used in the evaluation of the kernel function shown to introduce a dense collection of poles and zeroes on the negative real axis. Again, inconsistencies and differences among published aeroelastic root loci and the new 'exact' results are discussed and resolved. In all cases, aeroelastic flutter and divergence speeds and frequencies are in good agreement with published results. The GAAM solution procedure allows complete control over Mach number, velocity, density, and complex frequency. Thus all points on the computed root loci can be matched-point, consistent solutions without recourse to complex mode tracking logic or dataset interpolation, as in the k and p

  5. EMQN Best Practice Guidelines for molecular and haematology methods for carrier identification and prenatal diagnosis of the haemoglobinopathies

    PubMed Central

    Traeger-Synodinos, Joanne; Harteveld, Cornelis L; Old, John M; Petrou, Mary; Galanello, Renzo; Giordano, Piero; Angastioniotis, Michael; De la Salle, Barbara; Henderson, Shirley; May, Alison

    2015-01-01

    Haemoglobinopathies constitute the commonest recessive monogenic disorders worldwide, and the treatment of affected individuals presents a substantial global disease burden. Carrier identification and prenatal diagnosis represent valuable procedures that identify couples at risk for having affected children, so that they can be offered options to have healthy offspring. Molecular diagnosis facilitates prenatal diagnosis and definitive diagnosis of carriers and patients (especially ‘atypical' cases who often have complex genotype interactions). However, the haemoglobin disorders are unique among all genetic diseases in that identification of carriers is preferable by haematological (biochemical) tests rather than DNA analysis. These Best Practice guidelines offer an overview of recommended strategies and methods for carrier identification and prenatal diagnosis of haemoglobinopathies, and emphasize the importance of appropriately applying and interpreting haematological tests in supporting the optimum application and evaluation of globin gene DNA analysis. PMID:25052315

  6. EMQN Best Practice Guidelines for molecular and haematology methods for carrier identification and prenatal diagnosis of the haemoglobinopathies.

    PubMed

    Traeger-Synodinos, Joanne; Harteveld, Cornelis L; Old, John M; Petrou, Mary; Galanello, Renzo; Giordano, Piero; Angastioniotis, Michael; De la Salle, Barbara; Henderson, Shirley; May, Alison

    2015-04-01

    Haemoglobinopathies constitute the commonest recessive monogenic disorders worldwide, and the treatment of affected individuals presents a substantial global disease burden. Carrier identification and prenatal diagnosis represent valuable procedures that identify couples at risk for having affected children, so that they can be offered options to have healthy offspring. Molecular diagnosis facilitates prenatal diagnosis and definitive diagnosis of carriers and patients (especially 'atypical' cases who often have complex genotype interactions). However, the haemoglobin disorders are unique among all genetic diseases in that identification of carriers is preferable by haematological (biochemical) tests rather than DNA analysis. These Best Practice guidelines offer an overview of recommended strategies and methods for carrier identification and prenatal diagnosis of haemoglobinopathies, and emphasize the importance of appropriately applying and interpreting haematological tests in supporting the optimum application and evaluation of globin gene DNA analysis. PMID:25052315

  7. Using task analysis to generate evidence for strengthening midwifery education, practice, and regulation in Ethiopia

    PubMed Central

    Yigzaw, Tegbar; Carr, Catherine; Stekelenburg, Jelle; van Roosmalen, Jos; Gibson, Hannah; Gelagay, Mintwab; Admassu, Azeb

    2016-01-01

    Purpose Realizing aspirations for meeting the global reproductive, maternal, newborn, and child health goals depends not only on increasing the numbers but also on improving the capability of the midwifery workforce. We conducted a task analysis study to identify the needs for strengthening the midwifery workforce in Ethiopia. Methods We conducted a cross-sectional study of recently qualified midwives in Ethiopia. Purposively selected participants from representative geographic and practice settings completed a self-administered questionnaire, making judgments about the frequency of performance, criticality, competence, and location of training for a list of validated midwifery tasks. Using Statistical Package for the Social Sciences, Version 20, we computed the percentages and averages to describe participant and practice characteristics. We identified priority preservice education gaps by considering the tasks least frequently learned in preservice education, most frequently mentioned as not trained, and with the highest "not capable" responses. Identification of top priorities for in-service training considered tasks with the highest "not capable" and "never done" responses. We determined the licensing exam blueprint by weighing the composite mean scores for frequency and criticality variables and expert rating across practice categories. Results One hundred and thirty-eight midwives participated in the study. The majority of respondents recognized the importance of midwifery tasks (89%), felt they were capable (91.8%), reported doing them frequently (63.9%), and learned them during preservice education (56.3%). We identified competence gaps in tasks related to obstetric complications, gynecology, public health, professional duties, and prevention of mother to child transmission of HIV. Moreover, our study helped to determine the composition of the licensing exam for university graduates. Conclusion The task analysis indicates that midwives provide critical reproductive

  8. Practical method using superposition of individual magnetic fields for initial arrangement of undulator magnets

    SciTech Connect

    Tsuchiya, K.; Shioya, T.

    2015-04-15

    We have developed a practical method for determining an excellent initial arrangement of magnetic arrays for a pure-magnet Halbach-type undulator. In this method, the longitudinal magnetic field distribution of each magnet is measured using a moving Hall probe system along the beam axis with a high positional resolution. The initial arrangement of magnetic arrays is optimized and selected by analyzing the superposition of all distribution data in order to achieve adequate spectral quality for the undulator. We applied this method to two elliptically polarizing undulators (EPUs), called U#16-2 and U#02-2, at the Photon Factory storage ring (PF ring) in the High Energy Accelerator Research Organization (KEK). The measured field distribution of the undulator was demonstrated to be excellent for the initial arrangement of the magnet array, and this method saved a great deal of effort in adjusting the magnetic fields of EPUs.

  9. Thermophysical Analysis of Rocks by Transient Methods

    NASA Astrophysics Data System (ADS)

    Kubicar, Ludovit; Vretenar, Viliam; Bohac, Vlastimil; Stofanik, Vladimir; Dieska, Peter

    2010-05-01

    Rocks are porous materials whose pore content significantly influences their thermophysical properties. The porous structure plays an important role in heat and fluid transport, and a range of effects can occur in such structures, such as freezing, thawing, and evaporation. A highly innovative testing technique based on transient methods has been used for thermophysical analysis, in which the specific heat, thermal diffusivity, and thermal conductivity are determined; anomalies of the thermophysical parameters connected with freezing and thawing of water in pores are measured; the propagation of freezing and thawing fronts is detected; and the diffusion of moisture in the pore structure is monitored on sandstone and Gioia marble. The pulse transient method is based on generating a heat disturbance with a plane heat source fixed in the sample. Specific heat, thermal diffusivity, and thermal conductivity are determined from the parameters of the temperature response to this heat disturbance. A heat source in the step-wise heating regime was applied to the frozen sample to study the basic characteristics of the propagation of the thawing front. In both experiments, the sample consists of three parts of size 50x50x10 mm assembled into a 50x50x30 mm rectangular block, with a plane heat source fixed at the first contact between the specimen parts and a thermometer at the second. The sample has to be conditioned prior to the measurement to obtain the required thermodynamic state, i.e., the initial temperature and moisture state. An appropriate heating and cooling regime allows measuring the anomalies of specific heat, thermal conductivity, and thermal diffusivity connected with freezing and thawing. The hot ball transient method for measuring thermal conductivity is used for monitoring the moisture diffusion. The principle of the hot ball method is based on generating heat in a step-wise regime with a sensor in the form of a small ball, 2 mm in diameter, that also monitors its temperature. A

  10. Exploring the Current Landscape of Intravenous Infusion Practices and Errors (ECLIPSE): protocol for a mixed-methods observational study

    PubMed Central

    Blandford, Ann; Furniss, Dominic; Chumbley, Gill; Iacovides, Ioanna; Wei, Li; Cox, Anna; Mayer, Astrid; Schnock, Kumiko; Bates, David Westfall; Dykes, Patricia C; Bell, Helen; Dean Franklin, Bryony

    2016-01-01

    Introduction Intravenous medication is essential for many hospital inpatients. However, providing intravenous therapy is complex and errors are common. ‘Smart pumps’ incorporating dose error reduction software have been widely advocated to reduce error. However, little is known about their effect on patient safety, how they are used or their likely impact. This study will explore the landscape of intravenous medication infusion practices and errors in English hospitals and how smart pumps may relate to the prevalence of medication administration errors. Methods and analysis This is a mixed-methods study involving an observational quantitative point prevalence study to determine the frequency and types of errors that occur in the infusion of intravenous medication, and qualitative interviews with hospital staff to better understand infusion practices and the contexts in which errors occur. The study will involve 5 clinical areas (critical care, general medicine, general surgery, paediatrics and oncology), across 14 purposively sampled acute hospitals and 2 paediatric hospitals to cover a range of intravenous infusion practices. Data collectors will compare each infusion running at the time of data collection against the patient's medication orders to identify any discrepancies. The potential clinical importance of errors will be assessed. Quantitative data will be analysed descriptively; interviews will be analysed using thematic analysis. Ethics and dissemination Ethical approval has been obtained from an NHS Research Ethics Committee (14/SC/0290); local approvals will be sought from each participating organisation. Findings will be published in peer-reviewed journals and presented at conferences for academic and health professional audiences. Results will also be fed back to participating organisations to inform local policy, training and procurement. Aggregated findings will inform the debate on costs and benefits of the NHS investing in smart pump technology

  11. Stability analysis of lattice Boltzmann methods

    SciTech Connect

    Sterling, J.D.; Chen, Shiyi

    1996-01-01

    The lattice Boltzmann equation describes the evolution of the velocity distribution function on a lattice in a manner that macroscopic fluid dynamical behavior is recovered. Although the equation is a derivative of lattice gas automata, it may be interpreted as a Lagrangian finite-difference method for the numerical simulation of the discrete-velocity Boltzmann equation that makes use of a BGK collision operator. As a result, it is not surprising that numerical instability of lattice Boltzmann methods has frequently been encountered by researchers. We present an analysis of the stability of perturbations of the particle populations linearized about equilibrium values corresponding to a constant-density uniform mean flow. The linear stability depends on the following parameters: the distribution of the mass at a site between the different discrete speeds, the BGK relaxation time, the mean velocity, and the wave-number of the perturbations. This parameter space is too large to compute the complete stability characteristics. We report some stability results for a subset of the parameter space for a 7-velocity hexagonal lattice, a 9-velocity square lattice, and a 15-velocity cubic lattice. Results common to all three lattices are (1) the BGK relaxation time {tau} must be greater than 1/2 corresponding to positive shear viscosity, (2) there exists a maximum stable mean velocity for fixed values of the other parameters, and (3) as {tau} is increased from 1/2 the maximum stable velocity increases monotonically until some fixed velocity is reached which does not change for larger {tau}.
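    The abstract's requirement that the BGK relaxation time τ exceed 1/2 corresponds to the standard lattice-units viscosity relation ν = c_s²(τ − 1/2); a minimal sketch, assuming the common value c_s² = 1/3 (e.g., for the 9-velocity square lattice):

```python
def bgk_shear_viscosity(tau, cs2=1.0 / 3.0):
    """Kinematic shear viscosity of the lattice BGK model in lattice
    units: nu = cs^2 * (tau - 1/2).  It is positive only for
    tau > 1/2, matching the stability requirement in the abstract."""
    return cs2 * (tau - 0.5)

for tau in (0.4, 0.5, 0.6, 1.0):
    nu = bgk_shear_viscosity(tau)
    print(f"tau={tau}: nu={nu:.4f} ({'stable' if nu > 0 else 'unstable'})")
```

    This captures only result (1); the velocity- and wavenumber-dependent bounds in results (2) and (3) require the full linearized-perturbation analysis.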

  12. A Novel Method for Dissolved Phosphorus Analysis

    NASA Astrophysics Data System (ADS)

    Berry, J. M.; Spiese, C. E.

    2012-12-01

    High phosphorus loading is a major problem in the Great Lakes watershed. Phosphate enters waterways via both point and non-point sources (e.g., runoff, tile drainage, etc.), promoting eutrophication, and ultimately leading to algal blooms, hypoxia and loss of aquatic life. Quantification of phosphorus loading is typically done using the molybdenum blue method, which is known to have significant drawbacks. The molybdenum blue method requires strict control on time, involves toxic reagents that have limited shelf-life, and is generally unable to accurately measure sub-micromolar concentrations. This study aims to develop a novel reagent that will overcome many of these problems. Ethanolic europium(III) chloride and 8-hydroxyquinoline-5-sulfonic acid (hqs) were combined to form the bis-hqs complex (Eu-hqs). Eu-hqs was synthesized as the dipotassium salt via a simple one-pot procedure. This complex was found to be highly fluorescent (λex = 360 nm, λem = 510 nm) and exhibited a linear response upon addition of monohydrogen phosphate. The linear response ranged from 0.5 - 25 μM HPO42- (15.5 - 775 μg P L-1). It was also determined that Eu-hqs formed a 1:1 complex with phosphate. Maximum fluorescence was found at a pH of 8.50, and few interferences from other ions were found. Shelf-life of the reagent was at least one month, twice as long as most of the molybdenum blue reagent formulations. In the future, field tests will be undertaken in local rivers, lakes, and wetlands to determine the applicability of the complex to real-world analysis.
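    A linear fluorescence response over a 0.5–25 μM range invites an ordinary least-squares calibration line; the sketch below uses hypothetical fluorescence counts, not the study's data.

```python
import numpy as np

# Hypothetical calibration points spanning the reported linear range
# (0.5-25 uM HPO4^2-); the fluorescence counts are illustrative only.
conc_uM = np.array([0.5, 5.0, 10.0, 15.0, 20.0, 25.0])
signal = np.array([120.0, 1050.0, 2080.0, 3110.0, 4140.0, 5170.0])

# Fit signal = slope * conc + intercept by least squares.
slope, intercept = np.polyfit(conc_uM, signal, 1)

def phosphate_uM(fluorescence):
    """Invert the calibration line to estimate [HPO4^2-] in uM."""
    return (fluorescence - intercept) / slope

print(round(phosphate_uM(2080.0), 2))  # → 10.0
```

    In practice a blank correction and replicate standards would precede the fit, and samples outside the linear range would be diluted.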

  13. Unsaturated Shear Strength and Numerical Analysis Methods for Unsaturated Soils

    NASA Astrophysics Data System (ADS)

    Kim, D.; Kim, G.; Kim, D.; Baek, H.; Kang, S.

    2011-12-01

    The angles of shearing resistance (φb) and internal friction (φ') appear to be identical in the low suction range, but the angle of shearing resistance shows non-linearity as suction increases. In most numerical analyses, however, a fixed value for the angle of shearing resistance is applied even in the low suction range for practical reasons, often leading to a false conclusion. In this study, a numerical analysis has been undertaken employing the shear strength curve of unsaturated soils estimated from the residual water content of the SWCC as proposed by Vanapalli et al. (1996). The result was also compared with that from a fixed value of φb. It is suggested that, in case it is difficult to measure the unsaturated shear strength curve through triaxial soil tests, the shear strength curve estimated using the residual water content can be a useful alternative. This result was applied to analyzing the slope stability of unsaturated soils. The effects of a continuous rainfall on slope stability were analyzed using the commercial program "SLOPE/W", with the coupled infiltration analysis program "SEEP/W", from GEO-SLOPE International Ltd. The results show that, prior to the infiltration by the intensive rainfall, the safety factors using the estimated shear strength curve were substantially higher than those from the fixed value of φb at all time points. After the intensive infiltration, both methods showed similar behavior.
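    A minimal sketch of the unsaturated shear strength form commonly attributed to Vanapalli et al. (1996), in which the suction contribution is scaled by the normalized volumetric water content from the SWCC; the exact form should be checked against the original paper, and all parameter values here are illustrative assumptions.

```python
import math

def vanapalli_shear_strength(c_prime, sigma_net, suction,
                             theta, theta_s, theta_r, phi_deg):
    """Estimated unsaturated shear strength (kPa), assuming the form
    tau = c' + (sigma - ua) tan(phi') + (ua - uw) * Theta_n * tan(phi'),
    where Theta_n = (theta - theta_r) / (theta_s - theta_r) is the
    normalized volumetric water content from the SWCC."""
    tan_phi = math.tan(math.radians(phi_deg))
    theta_n = (theta - theta_r) / (theta_s - theta_r)
    return c_prime + sigma_net * tan_phi + suction * theta_n * tan_phi

# At saturation (theta == theta_s) suction contributes fully through
# tan(phi'); near the residual water content the suction term vanishes,
# reproducing the non-linearity with suction noted in the abstract.
print(round(vanapalli_shear_strength(5.0, 100.0, 50.0,
                                     0.40, 0.40, 0.05, 30.0), 1))  # → 91.6
```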

  14. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-05-25

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.
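    The factor-analysis step can be illustrated with a plain SVD on a synthetic data cube built from two spatially discrete chemical phases (the "simple" case the patent describes); the cube dimensions, spectra, noise level, and rank-selection threshold are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "spectral image": 20x20 pixels x 50 channels, built from
# two spatially discrete phases with well-separated Gaussian spectra,
# plus a little noise.
channels = np.arange(50)
spec_a = np.exp(-0.5 * ((channels - 15) / 3.0) ** 2)
spec_b = np.exp(-0.5 * ((channels - 35) / 3.0) ** 2)
mask = np.zeros((20, 20))
mask[:, :10] = 1.0  # phase A occupies the left half
cube = (mask[..., None] * spec_a + (1 - mask)[..., None] * spec_b
        + 0.01 * rng.standard_normal((20, 20, 50)))

# Unfold to (pixels x channels) and factor with an SVD; the number of
# singular values well above the noise floor estimates the number of
# chemical components present.
X = cube.reshape(-1, 50)
s = np.linalg.svd(X, compute_uv=False)
n_components = int(np.sum(s > 10 * s[-1]))
print(n_components)
```

    A spatial-simplicity constraint, as in the patent, would go further than this sketch by rotating the factors so each pixel loads on only one or a few components.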

  15. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-11-23

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.

  16. Effect of practice management softwares among physicians of developing countries with special reference to Indian scenario by Mixed Method Technique

    PubMed Central

    Davey, Sanjeev; Davey, Anuradha

    2015-01-01

    Introduction: Currently, many inexpensive "practice management software" (PMS) packages are available in developing countries including India; despite their availability and benefits, their penetration and usage vary from low to moderate level, justifying the importance of this study area. Materials and Methods: First, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (2009) guidelines were considered, followed by an extensive systematic review of studies in the literature related to developing countries, on key search terms from the main abstracting databases: PubMed, EMBASE, EBSCO, BIO-MED Central, Cochrane Library, and the world CAT-library, up to 15 June 2014; any kind of article, whether published or unpublished, in any form or language, indicating software usage was included. Thereafter, a meta-analysis of the Indian studies, revealing the magnitude of usage in the Indian scenario, was done with Open Meta-(analyst) software using a binary random-effects (RE) model. Studies from developed countries were excluded. Results: Of 57 studies included in the systematic review from developing countries, only 4 Indian studies were found eligible for meta-analysis. The RE model revealed nonsignificant results (total participants = 243,526; range: 100–226,228, overall odds ratio = 2.85, 95% confidence interval = P < 0.05 and tests for heterogeneity: Q [df = 3] = 0.8 Het. P = 0.85). The overall magnitude of usage of PMS in Indian physicians' practice was, however, found to be between 10% and 45%. Conclusion: Although the effect of PMS usage on physicians' practice in developing countries such as India was found to be variable and nonsignificant, there is a need to recognize the hidden potential of this system. Hence, more in-depth research needs to be done in future, in order to find the real impact of this system. PMID:25949969

  17. Knowledge, attitude and practice of Malay folk methods in family planning.

    PubMed

    Ab Razak, R

    1985-01-01

    This paper presents findings from a follow-up survey to the 1982 Malaysian Health and Family Planning Survey in Johore and Perak states. The survey aimed to provide more information on traditional methods of contraception and their practice by specific socioeconomic groups, to assess the use of folk methods, and to gauge the perception of effectiveness. The sample includes 1616 women. Findings indicate that people were more familiar with modern methods, particularly the pill. 33.4% of respondents had ever heard of a folk method, such as incantations, exercise, "majun," the Indonesian pill, applications of heat to the abdomen, and herbal preparations. Among respondents who knew of folk methods, 16.6% had ever heard of herbs, 14.7% knew about jamu, 11.2% knew about majun, 6.3% knew about exercise, 3.3% knew about Indonesian pills, and 1.2% knew about heat applications. 19.8% knew about a variety of other folk methods that were not classified by kind. 62% knew about the traditional method of rhythm; 41.4% knew about withdrawal; and 34.2% knew about abstinence. Knowledge of these three traditional methods was highest among the Chinese. Knowledge of folk methods was highest among the Malays (79.2%). Only 3.5% of Chinese and 2.9% of Indians knew about folk methods. 64% of respondents had ever used modern methods, and about 49% had ever used traditional methods or folk methods. The most popular method of current use was the pill (13.7%), followed by the condom (11.7%). Rhythm was the most popular traditional method (7.1%) among current users. 6.2% currently used folk methods. 46.0% currently used some form of contraception. Modern method use was higher among the Chinese, and sterilization was higher among Indians. Knowledge of folk methods increased with an increase in level of education and age. Folk use was higher in urban areas. 46.8% of Malay ever users of folk methods perceived it was very effective, and 45.0% considered it somewhat effective. 70.8% of Malay

  18. A Comparison of Low and High Structure Practice for Learning Interactional Analysis Skills

    ERIC Educational Resources Information Center

    Davis, Matthew James

    2011-01-01

    Innovative training approaches in work domains such as professional athletics, aviation, and the military have shown that specific types of practice can reliably lead to higher levels of performance for the average professional. This study describes the development of an initial effort toward creating a similar practice method for psychotherapy…

  19. Visual cluster analysis and pattern recognition methods

    DOEpatents

    Osbourn, Gordon Cecil; Martinez, Rubel Francisco

    2001-01-01

    A method of clustering using a novel template to define a region of influence. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable and improve pattern recognition techniques.

  20. Evidence-based vaccination strategies in obstetrics and gynecology settings: Current practices and methods for assessment.

    PubMed

    O'Leary, Sean T; Pyrzanowski, Jennifer; Brewer, Sarah E; Dickinson, L Miriam; Dempsey, Amanda F

    2016-04-01

Obstetrician-gynecologists have the potential to play an important role in the delivery of immunizations to women. However, despite national recommendations, immunization rates among pregnant women and adults in general remain low. Pragmatic immunization delivery trials are needed to demonstrate how best to deliver vaccines in such settings. We report the development and implementation of 2 novel methodologies for immunization delivery research and quality improvement in such settings. The first was the development and application of a 47-point Immunization Delivery Scale that formally assessed variability among practices in their engagement in a variety of evidence-based practices for improving immunization rates. The second was a covariate-constrained randomization technique, a method for achieving balance between study arms in cluster-randomized trials that is especially applicable to pragmatic trials. To best achieve meaningful and interpretable findings, we recommend use of these or similar techniques in future immunization research and quality improvement projects in OB/GYN settings. PMID:26829978
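
    The covariate-constrained randomization approach mentioned above can be sketched in a few lines: enumerate candidate allocations of practices to study arms, score each by covariate imbalance, and randomize only within the best-balanced subset. The balance metric, the retention fraction, and the practice names and scores below are illustrative assumptions, not details taken from the paper:

    ```python
    import itertools
    import random
    import statistics

    def constrained_randomization(covariates, keep_fraction=0.1, seed=0):
        """Allocate clusters to 2 arms via covariate-constrained randomization.

        covariates: {cluster_id: numeric baseline score}. All equal-split
        allocations are scored by the absolute difference in arm means; the
        final allocation is drawn at random from the best-balanced subset.
        """
        ids = sorted(covariates)
        half = len(ids) // 2
        allocations = []
        for arm_a in itertools.combinations(ids, half):
            arm_b = [i for i in ids if i not in arm_a]
            imbalance = abs(
                statistics.mean(covariates[i] for i in arm_a)
                - statistics.mean(covariates[i] for i in arm_b)
            )
            allocations.append((imbalance, set(arm_a), set(arm_b)))
        allocations.sort(key=lambda t: t[0])          # best-balanced first
        k = max(1, int(len(allocations) * keep_fraction))
        rng = random.Random(seed)
        _, arm_a, arm_b = rng.choice(allocations[:k])  # randomize within subset
        return arm_a, arm_b

    # Eight hypothetical practices with baseline immunization-delivery scores.
    scores = {"p1": 22, "p2": 35, "p3": 18, "p4": 41,
              "p5": 30, "p6": 27, "p7": 39, "p8": 25}
    arm_a, arm_b = constrained_randomization(scores)
    ```

    Exhaustive enumeration is feasible only for small numbers of clusters, which is exactly the setting where chance imbalance is most likely and constrained randomization is most useful; for larger trials, a random sample of candidate allocations is typically scored instead.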

  1. Echinacea purpurea: Pharmacology, phytochemistry and analysis methods.

    PubMed

    Manayi, Azadeh; Vazirian, Mahdi; Saeidnia, Soodabeh

    2015-01-01

    Echinacea purpurea (Asteraceae) is a perennial medicinal herb with important immunostimulatory and anti-inflammatory properties, especially the alleviation of cold symptoms. The plant also attracted scientists' attention to assess other aspects of its beneficial effects. For instance, antianxiety, antidepression, cytotoxicity, and antimutagenicity as induced by the plant have been revealed in various studies. The findings of the clinical trials are controversial in terms of side effects. While some studies revealed the beneficial effects of the plant on the patients and no severe adverse effects, some others have reported serious side effects including abdominal pain, angioedema, dyspnea, nausea, pruritus, rash, erythema, and urticaria. Other biological activities of the plant such as antioxidant, antibacterial, antiviral, and larvicidal activities have been reported in previous experimental studies. Different classes of secondary metabolites of the plant such as alkamides, caffeic acid derivatives, polysaccharides, and glycoproteins are believed to be biologically and pharmacologically active. Actually, concurrent determination and single analysis of cichoric acid and alkamides have been successfully developed mainly by using high-performance liquid chromatography (HPLC) coupled with different detectors including UV spectrophotometric, coulometric electrochemical, and electrospray ionization mass spectrometric detectors. The results of the studies which were controversial revealed that in spite of major experiments successfully accomplished using E. purpurea, many questions remain unanswered and future investigations may aim for complete recognition of the plant's mechanism of action using new, complementary methods. PMID:26009695

  2. Echinacea purpurea: Pharmacology, phytochemistry and analysis methods

    PubMed Central

    Manayi, Azadeh; Vazirian, Mahdi; Saeidnia, Soodabeh

    2015-01-01

    Echinacea purpurea (Asteraceae) is a perennial medicinal herb with important immunostimulatory and anti-inflammatory properties, especially the alleviation of cold symptoms. The plant also attracted scientists’ attention to assess other aspects of its beneficial effects. For instance, antianxiety, antidepression, cytotoxicity, and antimutagenicity as induced by the plant have been revealed in various studies. The findings of the clinical trials are controversial in terms of side effects. While some studies revealed the beneficial effects of the plant on the patients and no severe adverse effects, some others have reported serious side effects including abdominal pain, angioedema, dyspnea, nausea, pruritus, rash, erythema, and urticaria. Other biological activities of the plant such as antioxidant, antibacterial, antiviral, and larvicidal activities have been reported in previous experimental studies. Different classes of secondary metabolites of the plant such as alkamides, caffeic acid derivatives, polysaccharides, and glycoproteins are believed to be biologically and pharmacologically active. Actually, concurrent determination and single analysis of cichoric acid and alkamides have been successfully developed mainly by using high-performance liquid chromatography (HPLC) coupled with different detectors including UV spectrophotometric, coulometric electrochemical, and electrospray ionization mass spectrometric detectors. The results of the studies which were controversial revealed that in spite of major experiments successfully accomplished using E. purpurea, many questions remain unanswered and future investigations may aim for complete recognition of the plant's mechanism of action using new, complementary methods. PMID:26009695

  3. Dynamic mechanical analysis: A practical introduction to techniques and applications

    SciTech Connect

    Menard, K.

    1999-03-01

This introduction provides the chemist, chemical engineer, or materials scientist with a starting point for understanding the applications of dynamic mechanical analysis, its workings, and its advantages and limitations. The book serves as a systematic guide for those manufacturing polymeric materials and components, as well as for those developing new materials. Contents include: introduction to dynamic mechanical analysis; basic rheological concepts: stress, strain, and flow; rheology basics: creep-recovery and stress relaxation; dynamic testing; time-temperature scans part 1: transitions in polymers; time and temperature studies part 2: thermosets; frequency scans; DMA applications to real problems: guidelines; and appendix: sample experiments for the DMA.

  4. Benthic macroinvertebrates in lake ecological assessment: A review of methods, intercalibration and practical recommendations.

    PubMed

    Poikane, Sandra; Johnson, Richard K; Sandin, Leonard; Schartau, Ann Kristin; Solimini, Angelo G; Urbanič, Gorazd; Arbačiauskas, Kęstutis; Aroviita, Jukka; Gabriels, Wim; Miler, Oliver; Pusch, Martin T; Timm, Henn; Böhmer, Jürgen

    2016-02-01

    Legislation in Europe has been adopted to determine and improve the ecological integrity of inland and coastal waters. Assessment is based on four biotic groups, including benthic macroinvertebrate communities. For lakes, benthic invertebrates have been recognized as one of the most difficult organism groups to use in ecological assessment, and hitherto their use in ecological assessment has been limited. In this study, we review and intercalibrate 13 benthic invertebrate-based tools across Europe. These assessment tools address different human impacts: acidification (3 methods), eutrophication (3 methods), morphological alterations (2 methods), and a combination of the last two (5 methods). For intercalibration, the methods were grouped into four intercalibration groups, according to the habitat sampled and putative pressure. Boundaries of the 'good ecological status' were compared and harmonized using direct or indirect comparison approaches. To enable indirect comparison of the methods, three common pressure indices and two common biological multimetric indices were developed for larger geographical areas. Additionally, we identified the best-performing methods based on their responsiveness to different human impacts. Based on these experiences, we provide practical recommendations for the development and harmonization of benthic invertebrate assessment methods in lakes and similar habitats. PMID:26580734

  5. A sensitive transcriptome analysis method that can detect unknown transcripts

    PubMed Central

    Fukumura, Ryutaro; Takahashi, Hirokazu; Saito, Toshiyuki; Tsutsumi, Yoko; Fujimori, Akira; Sato, Shinji; Tatsumi, Kouichi; Araki, Ryoko; Abe, Masumi

    2003-01-01

    We have developed an AFLP-based gene expression profiling method called ‘high coverage expression profiling’ (HiCEP) analysis. By making improvements to the selective PCR technique we have reduced the rate of false positive peaks to ∼4% and consequently the number of peaks, including overlapping peaks, has been markedly decreased. As a result we can determine the relationship between peaks and original transcripts unequivocally. This will make it practical to prepare a database of all peaks, allowing gene assignment without having to isolate individual peaks. This precise selection also enables us to easily clone peaks of interest and predict the corresponding gene for each peak in some species. The procedure is highly reproducible and sensitive enough to detect even a 1.2-fold difference in gene expression. Most importantly, the low false positive rate enables us to analyze gene expression with wide coverage by means of four instead of six nucleotide recognition site restriction enzymes for fingerprinting mRNAs. Therefore, the method detects 70–80% of all transcripts, including non-coding transcripts, unknown and known genes. Moreover, the method requires no sequence information and so is applicable even to eukaryotes for which there is no genome information available. PMID:12907746

  6. Concurrent implementation of the Crank-Nicolson method for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Ransom, J. B.; Fulton, R. E.

    1985-01-01

    To exploit the significant gains in computing speed provided by Multiple Instruction Multiple Data (MIMD) computers, concurrent methods for practical problems need to be investigated and test problems implemented on actual hardware. One such problem class is heat transfer analysis which is important in many aerospace applications. This paper compares the efficiency of two alternate implementations of heat transfer analysis on an experimental MIMD computer called the Finite Element Machine (FEM). The implicit Crank-Nicolson method is used to solve concurrently the heat transfer equations by both iterative and direct methods. Comparison of actual timing results achieved for the two methods and their significance relative to more complex problems are discussed.
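
    As background for the abstract above, the Crank-Nicolson scheme averages the explicit and implicit discretizations of the heat equation, producing a linear system at each time step. Below is a minimal serial sketch for the 1D heat equation with Dirichlet boundaries (grid size, step ratio, and initial profile are illustrative choices; the paper's concurrent iterative and direct implementations on the Finite Element Machine are not reproduced here):

    ```python
    import numpy as np

    def crank_nicolson_step(u, r):
        """Advance u_t = alpha * u_xx one step; r = alpha*dt/dx^2.

        Solves (I + r/2 A) u_new = (I - r/2 A) u_old, where A is the
        tridiagonal second-difference matrix tridiag(-1, 2, -1).
        Boundary rows are left as identity, so u[0] and u[-1] are fixed.
        """
        n = len(u)
        A = np.zeros((n, n))
        for i in range(1, n - 1):
            A[i, i - 1], A[i, i], A[i, i + 1] = -1.0, 2.0, -1.0
        lhs = np.eye(n) + 0.5 * r * A
        rhs = (np.eye(n) - 0.5 * r * A) @ u
        return np.linalg.solve(lhs, rhs)   # direct solve of the implicit system

    x = np.linspace(0.0, 1.0, 51)
    u = np.sin(np.pi * x)   # initial temperature profile, zero at both ends
    r = 0.5                 # alpha*dt/dx^2; the scheme is stable for any r
    for _ in range(100):
        u = crank_nicolson_step(u, r)
    ```

    The direct dense solve here stands in for the paper's comparison point; an iterative method (e.g., Jacobi or Gauss-Seidel sweeps on the same system) parallelizes more naturally across processors, which is the trade-off the abstract examines.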

  7. Method Development for Analysis of Aspirin Tablets.

    ERIC Educational Resources Information Center

    Street, Kenneth W., Jr.

    1988-01-01

    Develops a lab experiment for introductory instrumental analysis that requires interference studies and optimizing of conditions. Notes the analysis of the aspirin is by visible spectrophotometric assay. Gives experimental details and discussion. (MVL)

  8. Analysis and Practices of Teaching: Description of a Course.

    ERIC Educational Resources Information Center

    Etheridge, Carol Plata; And Others

    An introductory teacher preparation course based on Adler's Paideia concepts was examined for documentation of course content, purposes, and student reactions. Data were collected through ethnographic observations of course classes, interviews with students and professors, and examination of readings for the course. The course, "Analysis and…

  9. Suspension, Race, and Disability: Analysis of Statewide Practices and Reporting

    ERIC Educational Resources Information Center

    Krezmien, Michael P.; Leone, Peter E.; Achilles, Georgianna M.

    2006-01-01

    This analysis of statewide suspension data from 1995 to 2003 in Maryland investigated disproportionate suspensions of minority students and students with disabilities. We found substantial increases in over-all rates of suspensions from 1995 to 2003, as well as disproportionate rates of suspensions for African American students, American Indian…

  10. Digital Data Collection and Analysis: Application for Clinical Practice

    ERIC Educational Resources Information Center

    Ingram, Kelly; Bunta, Ferenc; Ingram, David

    2004-01-01

    Technology for digital speech recording and speech analysis is now readily available for all clinicians who use a computer. This article discusses some advantages of moving from analog to digital recordings and outlines basic recording procedures. The purpose of this article is to familiarize speech-language pathologists with computerized audio…

  11. Strategic planning for public health practice using macroenvironmental analysis.

    PubMed Central

    Ginter, P M; Duncan, W J; Capper, S A

    1991-01-01

    Macroenvironmental analysis is the initial stage in comprehensive strategic planning. The authors examine the benefits of this type of analysis when applied to public health organizations and present a series of questions that should be answered prior to committing resources to scanning, monitoring, forecasting, and assessing components of the macroenvironment. Using illustrations from the public and private sectors, each question is examined with reference to specific challenges facing public health. Benefits are derived both from the process and the outcome of macroenvironmental analysis. Not only are data acquired that assist public health professionals to make decisions, but the analytical process required assures a better understanding of potential external threats and opportunities as well as an organization's strengths and weaknesses. Although differences exist among private and public as well as profit and not-for-profit organizations, macroenvironmental analysis is seen as more essential to the public and not-for-profit sectors than the private and profit sectors. This conclusion results from the extreme dependency of those areas on external environmental forces that cannot be significantly influenced or controlled by public health decision makers. PMID:1902305

  12. Newborn Hearing Screening: An Analysis of Current Practices

    ERIC Educational Resources Information Center

    Houston, K. Todd; Bradham, Tamala S.; Munoz, Karen F.; Guignard, Gayla Hutsell

    2011-01-01

    State coordinators of early hearing detection and intervention (EHDI) programs completed a strengths, weaknesses, opportunities, and threats, or SWOT, analysis that consisted of 12 evaluative areas of EHDI programs. For the newborn hearing screening area, a total of 293 items were listed by 49 EHDI coordinators, and themes were identified within…

  13. The Analysis of Athletic Performance: Some Practical and Philosophical Considerations

    ERIC Educational Resources Information Center

    Nelson, Lee J.; Groom, Ryan

    2012-01-01

    This article presents a hypothetical dialogue between a notational analyst (NA) recently schooled in the positivistic assessment of athletic performance, an "old-school" traditional coach (TC) who favours subjective analysis, and a pragmatic educator (PE). The conversation opens with NA and TC debating the respective value of quantitative and…

  14. An Analysis of Ethical Considerations in Programme Design Practice

    ERIC Educational Resources Information Center

    Govers, Elly

    2014-01-01

    Ethical considerations are inherent to programme design decision-making, but not normally explicit. Nonetheless, they influence whose interests are served in a programme and who benefits from it. This paper presents an analysis of ethical considerations made by programme design practitioners in the context of a polytechnic in Aotearoa/New Zealand.…

  15. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    … Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA, …

  16. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    … CHEESES AND RELATED CHEESE PRODUCTS, General Provisions, § 133.5 Methods of analysis. Moisture, milkfat, and phosphatase levels in cheeses will be determined by the following methods of analysis …

  17. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    … Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural …

  18. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    … CHEESES AND RELATED CHEESE PRODUCTS, General Provisions, § 133.5 Methods of analysis. Moisture, milkfat, and phosphatase levels in cheeses will be determined by the following methods of analysis …

  19. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    … Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA, Agricultural Marketing Service, Dairy Programs, or the Official Methods of Analysis of the Association of Official …

  20. Adaptation of Cost Analysis Studies in Practice Guidelines.

    PubMed

    Zervou, Fainareti N; Zacharioudakis, Ioannis M; Pliakos, Elina Eleftheria; Grigoras, Christos A; Ziakas, Panayiotis D; Mylonakis, Eleftherios

    2015-12-01

Clinical guidelines play a central role in day-to-day practice. We assessed the degree of incorporation of cost analyses into guidelines and identified modifiable characteristics that could affect the level of incorporation. We selected the 100 most cited guidelines listed on the National Guideline Clearinghouse (http://www.guideline.gov) and determined the number of guidelines that used cost analyses in their reasoning and the overall percentage of incorporation of relevant cost analyses available in PubMed. Differences between medical specialties were also studied. Then, we performed a case-control study using incorporated and not incorporated cost analyses after 1:1 matching by study subject and compared them by the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement requirements and other criteria. We found that 57% of guidelines do not use any cost justification. Guidelines incorporate a weighted average of 6.0% (95% confidence interval [CI] 4.3-7.9) among 3396 available cost analyses, with cardiology and infectious diseases guidelines incorporating 10.8% (95% CI 5.3-18.1) and 9.9% (95% CI 3.9-18.2), respectively, and hematology/oncology and urology guidelines incorporating 4.5% (95% CI 1.6-8.6) and 1.6% (95% CI 0.4-3.5), respectively. Based on the CHEERS requirements, the mean number of items reported by the 148 incorporated cost analyses was 18.6 (SD = 3.7), a small but significant difference over controls (17.8 items; P = 0.02). Included analyses were also more likely to directly relate cost reductions to healthcare outcomes (92.6% vs 81.1%, P = 0.004) and declare the funding source (72.3% vs 53.4%, P < 0.001), while similar numbers of cases and controls reported a noncommercial funding source (71% vs 72.7%; P = 0.8). Guidelines remain an underused mechanism for the cost-effective allocation of available resources, and a minority of practice guidelines incorporates cost analyses utilizing only 6% of the available

  1. Thermal Analysis Methods For Earth Entry Vehicle

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.; Dec, John A.; Lindell, Michael C.

    2000-01-01

Thermal analysis of a vehicle designed to return samples from another planet, such as the Earth Entry Vehicle for the Mars Sample Return mission, presents several unique challenges. The Earth Entry Vehicle (EEV) must contain Martian material samples after they have been collected and protect them from the high heating rates of entry into the Earth's atmosphere. This requirement necessitates inclusion of detailed thermal analysis early in the design of the vehicle. This paper will describe the challenges and solutions for a preliminary thermal analysis of an Earth Entry Vehicle. The aeroheating on the vehicle during entry would be the main driver for the thermal behavior, and is a complex function of time, spatial position on the vehicle, vehicle temperature, and trajectory parameters. Thus, the thermal analysis must be closely tied to the aeroheating analysis in order to make accurate predictions. Also, the thermal analysis must account for the material response of the ablative thermal protection system (TPS). For the exo-atmospheric portion of the mission, the thermal analysis must include the orbital radiation fluxes on the surfaces. The thermal behavior must also be used to predict the structural response of the vehicle (the thermal stresses and strains) and whether they remain within the capability of the materials. Thus, the thermal analysis requires ties to the three-dimensional geometry, the aeroheating analysis, the material response analysis, the orbital analysis, and the structural analysis. The goal of this paper is to describe to what degree that has been achieved.

  2. The uniform asymptotic swallowtail approximation - Practical methods for oscillating integrals with four coalescing saddle points

    NASA Technical Reports Server (NTRS)

    Connor, J. N. L.; Curtis, P. R.; Farrelly, D.

    1984-01-01

    Methods that can be used in the numerical implementation of the uniform swallowtail approximation are described. An explicit expression for that approximation is presented to the lowest order, showing that there are three problems which must be overcome in practice before the approximation can be applied to any given problem. It is shown that a recently developed quadrature method can be used for the accurate numerical evaluation of the swallowtail canonical integral and its partial derivatives. Isometric plots of these are presented to illustrate some of their properties. The problem of obtaining the arguments of the swallowtail integral from an analytical function of its argument is considered, describing two methods of solving this problem. The asymptotic evaluation of the butterfly canonical integral is addressed.
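
    For context, the swallowtail canonical integral that the abstract refers to is conventionally written as an oscillatory integral over a quintic phase (notation varies by author; the form below follows common catastrophe-theory usage rather than any specific equation in the paper):

    ```latex
    S(x, y, z) = \int_{-\infty}^{\infty}
        \exp\!\left[ i \left( t^{5} + x\,t^{3} + y\,t^{2} + z\,t \right) \right] dt
    ```

    The derivative of the quintic phase is a quartic in t, so up to four stationary points can coalesce as the control parameters (x, y, z) vary, which is why this canonical form governs the uniform asymptotics of integrals with four coalescing saddle points.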

  3. Degradation of learned skills. Effectiveness of practice methods on simulated space flight skill retention

    NASA Technical Reports Server (NTRS)

    Sitterley, T. E.; Berge, W. A.

    1972-01-01

    Manual flight control and emergency procedure task skill degradation was evaluated after time intervals of from 1 to 6 months. The tasks were associated with a simulated launch through the orbit insertion flight phase of a space vehicle. The results showed that acceptable flight control performance was retained for 2 months, rapidly deteriorating thereafter by a factor of 1.7 to 3.1 depending on the performance measure used. Procedural task performance showed unacceptable degradation after only 1 month, and exceeded an order of magnitude after 4 months. The effectiveness of static rehearsal (checklists and briefings) and dynamic warmup (simulator practice) retraining methods were compared for the two tasks. Static rehearsal effectively countered procedural skill degradation, while some combination of dynamic warmup appeared necessary for flight control skill retention. It was apparent that these differences between methods were not solely a function of task type or retraining method, but were a function of the performance measures used for each task.

  4. Screening Workers: An Examination and Analysis of Practice and Public Policy.

    ERIC Educational Resources Information Center

    Greenfield, Patricia A.; And Others

    1989-01-01

Discusses methods of screening job applicants and issues raised by screening procedures. Includes legal ramifications, current practices in Britain and the United States, future directions, and the employment interview. (JOW)

  5. Integration of Formal Job Hazard Analysis & ALARA Work Practice

    SciTech Connect

    NELSEN, D.P.

    2002-09-01

    ALARA work practices have traditionally centered on reducing radiological exposure and controlling contamination. As such, ALARA policies and procedures are not well suited to a wide range of chemical and human health issues. Assessing relative risk, identifying appropriate engineering/administrative controls and selecting proper Personal Protective Equipment (PPE) for non nuclear work activities extends beyond the limitations of traditional ALARA programs. Forging a comprehensive safety management program in today's (2002) work environment requires a disciplined dialog between health and safety professionals (e.g. safety, engineering, environmental, quality assurance, industrial hygiene, ALARA, etc.) and personnel working in the field. Integrating organizational priorities, maintaining effective pre-planning of work and supporting a team-based approach to safety management represents today's hallmark of safety excellence. Relying on the mandates of any single safety program does not provide industrial hygiene with the tools necessary to implement an integrated safety program. The establishment of tools and processes capable of sustaining a comprehensive safety program represents a key responsibility of industrial hygiene. Fluor Hanford has built integrated safety management around three programmatic attributes: (1) Integration of radiological, chemical and ergonomic issues under a single program. (2) Continuous improvement in routine communications among work planning/scheduling, job execution and management. (3) Rapid response to changing work conditions, formalized work planning and integrated worker involvement.

  6. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.

  7. Spelling Practice Intervention: A Comparison of Tablet PC and Picture Cards as Spelling Practice Methods for Students with Developmental Disabilities

    ERIC Educational Resources Information Center

    Seok, Soonhwa; DaCosta, Boaventura; Yu, Byeong Min

    2015-01-01

    The present study compared a spelling practice intervention using a tablet personal computer (PC) and picture cards with three students diagnosed with developmental disabilities. An alternating-treatments design with a non-concurrent multiple-baseline across participants was used. The aims of the present study were: (a) to determine if…

  8. Meta-research: Evaluation and Improvement of Research Methods and Practices

    PubMed Central

    Ioannidis, John P. A.; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N.

    2015-01-01

    As the scientific enterprise has grown in size and diversity, we need empirical evidence on the research process to test and apply interventions that make it more efficient and its results more reliable. Meta-research is an evolving scientific discipline that aims to evaluate and improve research practices. It includes thematic areas of methods, reporting, reproducibility, evaluation, and incentives (how to do, report, verify, correct, and reward science). Much work is already done in this growing field, but efforts to-date are fragmented. We provide a map of ongoing efforts and discuss plans for connecting the multiple meta-research efforts across science worldwide. PMID:26431313

  9. Meta-research: Evaluation and Improvement of Research Methods and Practices.

    PubMed

    Ioannidis, John P A; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N

    2015-10-01

    As the scientific enterprise has grown in size and diversity, we need empirical evidence on the research process to test and apply interventions that make it more efficient and its results more reliable. Meta-research is an evolving scientific discipline that aims to evaluate and improve research practices. It includes thematic areas of methods, reporting, reproducibility, evaluation, and incentives (how to do, report, verify, correct, and reward science). Much work is already done in this growing field, but efforts to-date are fragmented. We provide a map of ongoing efforts and discuss plans for connecting the multiple meta-research efforts across science worldwide. PMID:26431313

  10. Three-dimensional Stress Analysis Using the Boundary Element Method

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Banerjee, P. K.

    1984-01-01

    The boundary element method is to be extended (as part of the NASA Inelastic Analysis Methods program) to the three-dimensional stress analysis of gas turbine engine hot section components. The analytical basis of the method (as developed in elasticity) is outlined, its numerical implementation is summarized, and the approaches to be followed in extending the method to include inelastic material response indicated.

  11. Are larger dental practices more efficient? An analysis of dental services production.

    PubMed Central

    Lipscomb, J; Douglass, C W

    1986-01-01

    Whether cost-efficiency in dental services production increases with firm size is investigated through application of an activity analysis production function methodology to data from a national survey of dental practices. Under this approach, service delivery in a dental practice is modeled as a linear programming problem that acknowledges distinct input-output relationships for each service. These service-specific relationships are then combined to yield projections of overall dental practice productivity, subject to technical and organizational constraints. The activity analysis reported here represents arguably the most detailed evaluation yet of the relationship between dental practice size and cost-efficiency, controlling for such confounding factors as fee and service-mix differences across firms. We conclude that cost-efficiency does increase with practice size, over the range from solo to four-dentist practices. Largely because of data limitations, we were unable to test satisfactorily for scale economies in practices with five or more dentists. Within their limits, our findings are generally consistent with results from the neoclassical production function literature. From the standpoint of consumer welfare, the critical question raised (but not resolved) here is whether these apparent production efficiencies of group practice are ultimately translated by the market into lower fees, shorter queues, or other nonprice benefits. PMID:3102404
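
    The activity-analysis methodology described above models service delivery as a linear program: each service is an activity with fixed input requirements, and the practice maximizes output subject to resource capacities. A minimal sketch of such a model, with entirely hypothetical services, input coefficients, and capacities (none taken from the survey data), might look like:

    ```python
    from scipy.optimize import linprog

    # Decision variables: weekly counts of three services [exam, filling, crown].
    # linprog minimizes, so revenues per service are negated.
    revenue = [-40.0, -90.0, -400.0]

    # Input-output coefficients: resource hours consumed per unit of each service.
    A_ub = [
        [0.25, 0.50, 1.50],   # dentist-hours per service
        [0.50, 0.25, 0.50],   # hygienist-hours per service
        [0.75, 1.00, 2.00],   # chair-hours per service
    ]
    b_ub = [40.0, 40.0, 80.0]  # weekly capacities of each resource

    res = linprog(revenue, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
    mix = res.x                 # optimal weekly service mix
    weekly_revenue = -res.fun   # maximized weekly revenue
    ```

    Solving the same program under capacity vectors scaled to different practice sizes, while holding fees and service mix fixed, is one way to compare projected productivity across firm sizes in the spirit of the study.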

  12. Visceral fat estimation method by bioelectrical impedance analysis and causal analysis

    NASA Astrophysics Data System (ADS)

    Nakajima, Hiroshi; Tasaki, Hiroshi; Tsuchiya, Naoki; Hamaguchi, Takehiro; Shiga, Toshikazu

    2011-06-01

Abdominal visceral fat accumulation has been shown to be closely associated with lifestyle diseases and metabolic syndrome. The gold standard in medical fields is visceral fat area measured by an X-ray computed tomography (CT) scan or magnetic resonance imaging. However, these measurements are highly invasive and costly; in particular, a CT scan entails X-ray exposure. These are the reasons why medical fields need an instrument for visceral fat measurement that is minimally invasive, easy to use, and inexpensive. The article proposes a simple and practical method of visceral fat estimation employing bioelectrical impedance analysis and causal analysis. In the method, abdominal shape and the dual impedances of the abdominal surface and the body as a whole are measured to estimate visceral fat area based on a cause-effect structure. The structure is designed according to the nature of abdominal body composition and fine-tuned by statistical analysis. Experiments were conducted to validate the proposed model: 180 subjects were recruited and measured by both a CT scan and the proposed method. The resulting model explained the measurement principle well, and the correlation coefficient with the CT scan measurements is 0.88.

  13. Methods for analysis of fluoroquinolones in biological fluids

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Methods for analysis of 10 selected fluoroquinolone antibiotics in biological fluids are reviewed. Approaches for sample preparation, detection methods, limits of detection and quantitation and recovery information are provided for both single analyte and multi-analyte fluoroquinolone methods....

  14. Knowledge-attitude-practice survey among Portuguese gynaecologists regarding combined hormonal contraceptives methods.

    PubMed

    Bombas, Teresa; Costa, Ana Rosa; Palma, Fátima; Vicente, Lisa; Sá, José Luís; Nogueira, Ana Maria; Andrade, Sofia

    2012-04-01

Objectives To evaluate knowledge, attitude and practices of Portuguese gynaecologists regarding combined hormonal contraceptives. Methods A cross-sectional survey was conducted among 303 gynaecologists. Results Ninety percent of the gynaecologists considered that deciding on contraceptive methods is a process wherein the woman has her say. Efficacy, safety and the woman's preference were the major factors influencing gynaecologists, while efficacy, tolerability and ease of use were the major factors perceived by the specialists to influence the women's choice. Gynaecologists believed that only 2% of women taking the pill were 100% compliant compared to 48% of those using the patch and 75% of those using the ring. The lower risk of omission was the strong point for the latter methods. Side effects were the main reason to change to another method. Vaginal manipulation was the most difficult topic to discuss. Conclusions Most gynaecologists decided with the woman on the contraceptive method. The main reasons for the gynaecologist's recommendation of a given contraceptive method and the women's choice were different. Counselling implies an open discussion and topics related to sexuality were considered difficult to discuss. Improving communication skills and understanding women's requirements are critical for contraceptive counselling. PMID:22200109

  15. Practical Application of Parallel Coordinates for Climate Model Analysis

    SciTech Connect

    Steed, Chad A; Shipman, Galen M; Thornton, Peter E; Ricciuto, Daniel M; Erickson III, David J; Branstetter, Marcia L

    2012-01-01

The determination of relationships between climate variables and the identification of the most significant associations between them in various geographic regions is an important aspect of climate model evaluation. The EDEN visual analytics toolkit has been developed to aid such analysis by facilitating the assessment of multiple variables with respect to the amount of variability that can be attributed to specific other variables. EDEN harnesses the parallel coordinates visualization technique, augmented with graphical indicators of key descriptive statistics. A case study is presented that focuses on the Harvard Forest site (42.5378N Lat, 72.1715W Lon) and evaluates the Community Land Model Version 4 (CLM4). It is shown that model variables such as land water runoff are more sensitive to a particular set of environmental variables than to a suite of other inputs in the 88-variable analysis conducted. The approach presented here allows climate-domain scientists to focus on the most important variables in model evaluations.

  16. Why and How Do Nursing Homes Implement Culture Change Practices? Insights from Qualitative Interviews in a Mixed Methods Study

    PubMed Central

    Shield, Renée R.; Looze, Jessica; Tyler, Denise; Lepore, Michael; Miller, Susan C.

    2015-01-01

    Objective To understand the process of instituting culture change (CC) practices in nursing homes (NHs). Methods NH Directors of Nursing (DONs) and Administrators (NHAs) at 4,149 United States NHs were surveyed about CC practices. Follow-up interviews with 64 NHAs were conducted and analyzed by a multidisciplinary team which reconciled interpretations recorded in an audit trail. Results The themes include: 1) Reasons for implementing CC practices vary; 2) NH approaches to implementing CC practices are diverse; 3) NHs consider resident mix in deciding to implement practices; 4) NHAs note benefits and few implementation costs of implementing CC practices; 5) Implementation of changes is challenging and strategies for change are tailored to the challenges encountered; 6) Education and communication efforts are vital ways to institute change; and 7) NHA and other staff leadership is key to implementing changes. Discussion Diverse strategies and leadership skills appear to help NHs implement reform practices, including CC innovations. PMID:24652888

  17. A Situated Practice of Ethics for Participatory Visual and Digital Methods in Public Health Research and Practice: A Focus on Digital Storytelling

    PubMed Central

    Hill, Amy L.; Flicker, Sarah

    2014-01-01

    This article explores ethical considerations related to participatory visual and digital methods for public health research and practice, through the lens of an approach known as “digital storytelling.” We begin by briefly describing the digital storytelling process and its applications to public health research and practice. Next, we explore 6 common challenges: fuzzy boundaries, recruitment and consent to participate, power of shaping, representation and harm, confidentiality, and release of materials. We discuss their complexities and offer some considerations for ethical practice. We hope this article serves as a catalyst for expanded dialogue about the need for high standards of integrity and a situated practice of ethics wherein researchers and practitioners reflexively consider ethical decision-making as part of the ongoing work of public health. PMID:23948015

  18. Methods for Mediation Analysis with Missing Data

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Wang, Lijuan

    2013-01-01

    Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis including list wise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…

  19. EIA practice in India and its evaluation using SWOT analysis

    SciTech Connect

    Paliwal, Ritu . E-mail: ritup@terischool.ac.in

    2006-07-15

In India, Environmental Impact Assessment (EIA) was formally introduced in 1994. It relies on an institutional framework with strong supporting legislative, administrative and procedural set-ups, with central and state authorities sharing responsibility for its development and management. The Strength, Weakness, Opportunity and Threat (SWOT) analysis undertaken in this article suggests that several issues need to be addressed. It highlights constraints ranging from improper screening and scoping guidelines to ineffective monitoring and post-project evaluation. The opportunities identified include increasing public awareness, initiatives by environmental groups and the business community, and forward thinking on integrating environmental considerations into plans and policies. Poor governance, rapid economic reforms, and favours to small-scale units are some of the foreseen threats to the system. The article concludes with suggestions to improve the EIA process in India.

  20. Testing for Questionable Research Practices in a Meta-Analysis: An Example from Experimental Parapsychology.

    PubMed

    Bierman, Dick J; Spottiswoode, James P; Bijl, Aron

    2016-01-01

    We describe a method of quantifying the effect of Questionable Research Practices (QRPs) on the results of meta-analyses. As an example we simulated a meta-analysis of a controversial telepathy protocol to assess the extent to which these experimental results could be explained by QRPs. Our simulations used the same numbers of studies and trials as the original meta-analysis and the frequencies with which various QRPs were applied in the simulated experiments were based on surveys of experimental psychologists. Results of both the meta-analysis and simulations were characterized by 4 metrics, two describing the trial and mean experiment hit rates (HR) of around 31%, where 25% is expected by chance, one the correlation between sample-size and hit-rate, and one the complete P-value distribution of the database. A genetic algorithm optimized the parameters describing the QRPs, and the fitness of the simulated meta-analysis was defined as the sum of the squares of Z-scores for the 4 metrics. Assuming no anomalous effect a good fit to the empirical meta-analysis was found only by using QRPs with unrealistic parameter-values. Restricting the parameter space to ranges observed in studies of QRP occurrence, under the untested assumption that parapsychologists use comparable QRPs, the fit to the published Ganzfeld meta-analysis with no anomalous effect was poor. We allowed for a real anomalous effect, be it unidentified QRPs or a paranormal effect, where the HR ranged from 25% (chance) to 31%. With an anomalous HR of 27% the fitness became F = 1.8 (p = 0.47 where F = 0 is a perfect fit). We conclude that the very significant probability cited by the Ganzfeld meta-analysis is likely inflated by QRPs, though results are still significant (p = 0.003) with QRPs. Our study demonstrates that quantitative simulations of QRPs can assess their impact. Since meta-analyses in general might be polluted by QRPs, this method has wide applicability outside the domain of experimental
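The fitness measure described above (the sum of squared Z-scores over the four metrics, with F = 0 a perfect fit) can be sketched in a few lines; the metric values and standard errors below are made-up illustrations, not figures from the paper:

```python
# Hypothetical sketch: fitness of a simulated meta-analysis is the sum
# of squared Z-scores comparing its summary metrics to empirical ones.
def fitness(simulated, empirical, std_errors):
    """Sum of squared Z-scores across metrics (0 = perfect fit)."""
    return sum(
        ((s - e) / se) ** 2
        for s, e, se in zip(simulated, empirical, std_errors)
    )

# Toy numbers (illustrative only): trial HR, mean experiment HR,
# correlation of sample size with hit rate, and a p-value-distribution metric.
sim = [0.31, 0.31, -0.10, 0.05]
emp = [0.315, 0.312, -0.12, 0.04]
se = [0.01, 0.01, 0.05, 0.02]
F = fitness(sim, emp, se)
```

In the paper's setup a genetic algorithm would vary the QRP parameters generating `sim` so as to minimize this F.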

  1. Testing for Questionable Research Practices in a Meta-Analysis: An Example from Experimental Parapsychology

    PubMed Central

    Bierman, Dick J.; Spottiswoode, James P.; Bijl, Aron

    2016-01-01

    We describe a method of quantifying the effect of Questionable Research Practices (QRPs) on the results of meta-analyses. As an example we simulated a meta-analysis of a controversial telepathy protocol to assess the extent to which these experimental results could be explained by QRPs. Our simulations used the same numbers of studies and trials as the original meta-analysis and the frequencies with which various QRPs were applied in the simulated experiments were based on surveys of experimental psychologists. Results of both the meta-analysis and simulations were characterized by 4 metrics, two describing the trial and mean experiment hit rates (HR) of around 31%, where 25% is expected by chance, one the correlation between sample-size and hit-rate, and one the complete P-value distribution of the database. A genetic algorithm optimized the parameters describing the QRPs, and the fitness of the simulated meta-analysis was defined as the sum of the squares of Z-scores for the 4 metrics. Assuming no anomalous effect a good fit to the empirical meta-analysis was found only by using QRPs with unrealistic parameter-values. Restricting the parameter space to ranges observed in studies of QRP occurrence, under the untested assumption that parapsychologists use comparable QRPs, the fit to the published Ganzfeld meta-analysis with no anomalous effect was poor. We allowed for a real anomalous effect, be it unidentified QRPs or a paranormal effect, where the HR ranged from 25% (chance) to 31%. With an anomalous HR of 27% the fitness became F = 1.8 (p = 0.47 where F = 0 is a perfect fit). We conclude that the very significant probability cited by the Ganzfeld meta-analysis is likely inflated by QRPs, though results are still significant (p = 0.003) with QRPs. Our study demonstrates that quantitative simulations of QRPs can assess their impact. Since meta-analyses in general might be polluted by QRPs, this method has wide applicability outside the domain of experimental

  2. Multiscale Methods for Nuclear Reactor Analysis

    NASA Astrophysics Data System (ADS)

    Collins, Benjamin S.

The ability to accurately predict local pin powers in nuclear reactors is necessary to understand the mechanisms that cause fuel pin failure during steady state and transient operation. In the research presented here, methods are developed to improve the local solution using high order methods with boundary conditions from a low order global solution. Several different core configurations were tested to determine the improvement in the local pin powers compared to the standard techniques that use diffusion theory and pin power reconstruction (PPR). Two different multiscale methods were developed and analyzed: the post-refinement multiscale method and the embedded multiscale method. The post-refinement multiscale methods use the global solution to determine boundary conditions for the local solution. The local solution is solved using either a fixed boundary source or an albedo boundary condition; this solution is "post-refinement" and thus has no impact on the global solution. The embedded multiscale method allows the local solver to change the global solution to provide an improved global and local solution. The post-refinement multiscale method is assessed using three core designs. When the local solution has more energy groups, the fixed source method has some difficulties near the interface; however, the albedo method works well for all cases. In order to remedy the issue with boundary condition errors for the fixed source method, a buffer region is used to act as a filter, which decreases the sensitivity of the solution to the boundary condition. Both the albedo and fixed source methods benefit from the use of a buffer region. Unlike the post-refinement method, the embedded multiscale method alters the global solution. The ability to change the global solution allows for refinement in areas where the errors in the few group nodal diffusion are typically large. The embedded method is shown to improve the global solution when it is applied to a MOX/LEU assembly.

  3. A simple and practical method that prepares high molecular weight DNA ladders.

    PubMed

    Zhang, Jun-He; Yang, Rui; Wang, Tian-Yun; Dong, Wei-Hua; Wang, Fang; Wang, Li

    2012-11-01

    The purpose of the current study was to report a simple and practical method to prepare high molecular weight (mw) DNA ladders. The method involves 1,000-4,000-base pairs (bp) DNA fragments being amplified by polymerase chain reaction (PCR), using λ DNA as a template. The constructed plasmids are digested by restriction endonucleases to produce 5-, 6-, 8- and 10-kb DNA fragments, followed by purification and precipitation with ethanol, and mixed proportionally. The 1,000-4,000-bp DNA fragments were successfully generated by PCR and 5-, 6-, 8- and 10-kb DNA fragments were obtained through the digestion of the plasmids. The bands of the prepared high mw DNA ladder were clear and may aid future molecular biology studies. PMID:22948498

  4. Exploration of Methods Used by Pharmacy Professional Programs to Contract with Experiential Practice Sites

    PubMed Central

    Garavalia, Linda; Gubbins, Paul O.; Ruehter, Valerie

    2016-01-01

    Objective. To explore methods used by pharmacy programs to attract and sustain relationships with preceptors and experiential practice sites. Methods. Interviews with eight focus groups of pharmacy experiential education experts (n=35) were conducted at two national pharmacy meetings. A semi-structured interview guide was used. Focus group interviews were recorded, transcribed verbatim, and categorically coded independently by two researchers. Codes were compared, consensus was reached through discussion, and two experiential education experts assisted with interpretation of the coded data. Results. Six themes emerged consistently across focus groups: a perceived increase in preceptor compensation, intended vs actual use of payments by sites, concern over renegotiation of established compensation, costs and benefits of experiential students, territorialism, and motives. Conclusion. Fostering a culture of collaboration may counteract potentially competitive strategies to gain sites. Participants shared a common interest in providing high-quality experiential learning where sites and preceptors participated for altruistic reasons, rather than compensation. PMID:27073279

  5. A method for obtaining practical flutter-suppression control laws using results of optimal control theory

    NASA Technical Reports Server (NTRS)

    Newson, J. R.

    1979-01-01

    The results of optimal control theory are used to synthesize a feedback filter. The feedback filter is used to force the output of the filtered frequency response to match that of a desired optimal frequency response over a finite frequency range. This matching is accomplished by employing a nonlinear programing algorithm to search for the coefficients of the feedback filter that minimize the error between the optimal frequency response and the filtered frequency response. The method is applied to the synthesis of an active flutter-suppression control law for an aeroelastic wind-tunnel model. It is shown that the resulting control law suppresses flutter over a wide range of subsonic Mach numbers. This is a promising method for synthesizing practical control laws using the results of optimal control theory.
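The matching step described above, searching for filter coefficients that minimize the error between the filtered frequency response and a target optimal response over a finite frequency range, can be illustrated with a toy first-order filter and a coarse grid search standing in for the nonlinear programming algorithm; the filter form, frequency grid, and target values are all assumptions for illustration:

```python
# Illustrative sketch (not the paper's synthesis): fit the gain k and
# pole a of a first-order lag H(jw) = k / (jw + a) so its frequency
# response matches a target response over a finite frequency grid.
def response(k, a, w):
    return k / complex(a, w)  # H(jw) = k / (jw + a)

freqs = [0.1 * i for i in range(1, 50)]
# Pretend this came from the optimal control law's frequency response.
target = [response(2.0, 0.5, w) for w in freqs]

best = None
for ki in range(30):
    for ai in range(30):
        k = 1.0 + 0.1 * ki
        a = 0.1 + 0.05 * ai
        # Squared-error objective over the frequency range.
        err = sum(abs(response(k, a, w) - t) ** 2
                  for w, t in zip(freqs, target))
        if best is None or err < best[0]:
            best = (err, k, a)

err, k_fit, a_fit = best
```

A practical implementation would replace the grid search with a gradient-based nonlinear programming routine and a higher-order filter structure.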

  6. A Practical Method for Multi-Objective Scheduling through Soft Computing Approach

    NASA Astrophysics Data System (ADS)

    Shimizu, Yoshiaki; Tanaka, Yasutsugu

    Due to diversified customer demands and global competition, scheduling has been increasingly notified as an important problem-solving in manufacturing. Since the scheduling is considered at stage close to the practical operation in production planning, flexibility and agility in decision making should be most important in real world applications. In addition, since the final goal of such scheduling has many attributes, and their relative importance is likely changed depending on the decision environment, it is of great significance to derive a flexible scheduling through plain multi-objective optimization method. To derive such a rational scheduling, in this paper, we have applied a novel multi-objective optimization named MOON2R (MOON2 of radial basis function) by incorporating with simulated annealing as a solution algorithm. Finally, illustrative examples are provided to outline and verify the effectiveness of the proposed method.
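As a rough illustration of the idea (not the MOON2R method itself), the following sketch scalarizes two scheduling objectives into a single cost and applies simulated annealing to a single-machine job sequence; all job data and weights are invented:

```python
# Toy multi-objective scheduling via simulated annealing: minimize a
# weighted sum of total completion time and total tardiness for one machine.
import math
import random

jobs = [  # (processing_time, due_date) -- invented data
    (4, 10), (2, 6), (6, 14), (3, 7), (5, 20),
]

def cost(seq, w_flow=0.5, w_tardy=0.5):
    t = flow = tardy = 0.0
    for i in seq:
        p, d = jobs[i]
        t += p
        flow += t                     # accumulate completion times
        tardy += max(0.0, t - d)      # accumulate tardiness
    return w_flow * flow + w_tardy * tardy

random.seed(0)
seq = list(range(len(jobs)))
best_seq, best_cost = seq[:], cost(seq)
temp = 10.0
for _ in range(2000):
    i, j = random.sample(range(len(seq)), 2)
    cand = seq[:]
    cand[i], cand[j] = cand[j], cand[i]          # swap two jobs
    delta = cost(cand) - cost(seq)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        seq = cand
        if cost(seq) < best_cost:
            best_seq, best_cost = seq[:], cost(seq)
    temp *= 0.995                                 # geometric cooling
```

Changing the weights re-prioritizes the objectives, which is the kind of flexibility the abstract emphasizes for shifting decision environments.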

  7. Exploration of Methods Used by Pharmacy Professional Programs to Contract with Experiential Practice Sites.

    PubMed

    Brownfield, Angela; Garavalia, Linda; Gubbins, Paul O; Ruehter, Valerie

    2016-03-25

    Objective. To explore methods used by pharmacy programs to attract and sustain relationships with preceptors and experiential practice sites. Methods. Interviews with eight focus groups of pharmacy experiential education experts (n=35) were conducted at two national pharmacy meetings. A semi-structured interview guide was used. Focus group interviews were recorded, transcribed verbatim, and categorically coded independently by two researchers. Codes were compared, consensus was reached through discussion, and two experiential education experts assisted with interpretation of the coded data. Results. Six themes emerged consistently across focus groups: a perceived increase in preceptor compensation, intended vs actual use of payments by sites, concern over renegotiation of established compensation, costs and benefits of experiential students, territorialism, and motives. Conclusion. Fostering a culture of collaboration may counteract potentially competitive strategies to gain sites. Participants shared a common interest in providing high-quality experiential learning where sites and preceptors participated for altruistic reasons, rather than compensation. PMID:27073279

  8. [The Application of the Fault Tree Analysis Method in Medical Equipment Maintenance].

    PubMed

    Liu, Hongbin

    2015-11-01

In this paper, the traditional fault tree analysis method is presented and its application to medical equipment maintenance is described in detail. Significant changes are made when the traditional fault tree analysis method is introduced into medical equipment maintenance: the logic symbols, the logic analysis and calculations, and the complicated procedures are abandoned, and only the intuitive and practical fault tree diagram is kept. The fault tree diagram itself also differs: it is no longer a logic tree but a thinking tree for troubleshooting, the definition of its nodes is different, and the composition of its branches is different. PMID:27066693

  9. International Commercial Remote Sensing Practices and Policies: A Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Stryker, Timothy

In recent years, there has been much discussion about U.S. commercial remote sensing policies and how effectively they address U.S. national security, foreign policy, commercial, and public interests. This paper will provide an overview of U.S. commercial remote sensing laws, regulations, and policies, and describe recent NOAA initiatives. It will also address related foreign practices, and the overall legal context for trade and investment in this critical industry. Licensing and Regulation: The 1992 Land Remote Sensing Policy Act ("the Act"), and the 1994 policy on Foreign Access to Remote Sensing Space Capabilities (known as Presidential Decision Directive-23, or PDD-23) put into place an ambitious legal and policy framework for the U.S. Government's licensing of privately-owned, high-resolution satellite systems. Under the Act, the Secretary of Commerce licenses the operations of private U.S. remote sensing satellite systems, in consultation with the Secretaries of Defense, State, and Interior. PDD-23 provided further details concerning the operation of advanced systems, as well as criteria for the export of turnkey systems and/or components. In July 2000, pursuant to the authority delegated to it by the Secretary of Commerce, NOAA issued new regulations for the industry. Licensees must operate their systems in a manner that preserves national security and observes the international obligations of the United States; maintain positive control of spacecraft operations; maintain a tasking record in conjunction with other record-keeping requirements; provide U.S. Government access to and use of data when required for national security or foreign policy purposes; provide for U.S. Government review of all significant foreign agreements; obtain U.S. Government approval for any encryption devices used; make available unenhanced data to a "sensed state" as soon as such data are available and on reasonable cost terms and conditions; make available unenhanced data as requested

  10. Cumulative radiation effect. Part VI: simple nomographic and tabular methods for the solution of practical problems.

    PubMed

    Kirk, J; Gray, W M; Watson, E R

    1977-01-01

In five previous papers, the concept of the Cumulative Radiation Effect (CRE) has been presented as a scale of accumulative sub-tolerance radiation damage. The biological effect generated in normal connective tissue by fractionated or continuous radiation therapy given in any temporal arrangement is described by the CRE on a unified scale of assessment, so that a unique value of the CRE describes a specific level of radiation effect. The basic methods of evaluating CREs were shown in these papers to facilitate a full understanding of the fundamental aspects of the CRE-system, but these methods can be time-consuming and tedious for complex situations. In this paper, simple nomographic and tabular methods for the solution of practical problems are presented. An essential feature of solving a CRE problem is firstly to present it in a concise and readily appreciated form, and, to do this, nomenclature is introduced to describe schedules and regimes as compactly as possible. Simple algebraic equations are derived to describe the CRE achieved by multi-schedule regimes. In these equations, the equivalence conditions existing at the junctions between schedules are not explicit, and the equations are based on the CREs of the constituent schedules assessed individually without reference to their context in the regime as a whole. This independent evaluation of CREs for each schedule results in a considerable simplification in the calculation of complex problems. The calculations are further simplified by the use of suitable tables and nomograms, so that the mathematics involved is reduced to simple arithmetical operations which require at most the use of a slide rule but can be done by hand. The order of procedure in the presentation and calculation of CRE problems can be summarised in an evaluation procedure sheet. The resulting simple methods for solving practical problems of any complexity on the CRE-system are demonstrated by a number of examples. PMID:856533

  11. A practical method for depth of interaction determination in monolithic scintillator PET detectors.

    PubMed

    van Dam, Herman T; Seifert, Stefan; Vinke, Ruud; Dendooven, Peter; Löhner, Herbert; Beekman, Freek J; Schaart, Dennis R

    2011-07-01

    Several new methods for determining the depth of interaction (DOI) of annihilation photons in monolithic scintillator detectors with single-sided, multi-pixel readout are investigated. The aim is to develop a DOI decoding method that allows for practical implementation in a positron emission tomography system. Specifically, calibration data, obtained with perpendicularly incident gamma photons only, are being used. Furthermore, neither detector modifications nor a priori knowledge of the light transport and/or signal variances is required. For this purpose, a clustering approach is utilized in combination with different parameters correlated with the DOI, such as the degree of similarity to a set of reference light distributions, the measured intensity on the sensor pixel(s) closest to the interaction position and the peak intensity of the measured light distribution. The proposed methods were tested experimentally on a detector comprised of a 20 mm × 20 mm × 12 mm polished LYSO:Ce crystal coupled to a 4 × 4 multi-anode photomultiplier. The method based on the linearly interpolated measured intensities on the sensor pixels closest to the estimated (x, y)-coordinate outperformed the other methods, yielding DOI resolutions between ∼1 and ∼4.5 mm FWHM depending on the DOI, the (x, y) resolution and the amount of reference data used. PMID:21693789

  12. Transfusion monitoring: care practice analysis in a public teaching hospital

    PubMed Central

    dos Reis, Valesca Nunes; Paixão, Isabella Bertolin; Perrone, Ana Carolina Amaral de São José; Monteiro, Maria Inês; dos Santos, Kelli Borges

    2016-01-01

Objective To analyze the process of recording transfusion monitoring at a public teaching hospital. Methods A descriptive and retrospective study with a quantitative approach, analyzing the instruments to record transfusion monitoring at a public hospital in a city in the State of Minas Gerais (MG). Data were collected on the correct completion of the instrument, time elapsed from transfusions, records of vital signs, type of blood component more frequently transfused, and hospital unit where transfusion was performed. Results A total of 1,012 records were analyzed, and 53.4% of them had errors in filling in the instruments, 6% of transfusions started after the recommended time, and 9.3% of patients had no vital signs registered. Conclusion Failures were identified in the process of recording transfusion monitoring, and they could result in more adverse events related to the administration of blood components. Planning and implementing strategies to enhance recording and to improve care delivered are challenging. PMID:27074233

  13. Numerical Analysis of the Symmetric Methods

    NASA Astrophysics Data System (ADS)

    Xu, Ji-Hong; Zhang, A.-Li

    1995-03-01

For the initial value problem of the special second-order ordinary differential equation y″ = f(x, y), the symmetric methods (Quinlan and Tremaine, 1990) and our methods (Xu and Zhang, 1994) are compared in detail in this paper by integrating artificial Earth satellite orbits. We show that the accuracy of numerical integration of satellite orbits with our methods is clearly higher than that obtained with same-order formulas of the symmetric methods when the integration time-interval is no greater than 12000 periods.
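As a generic illustration of a symmetric (time-reversible) integrator for this class of problems, y″ = f(x, y), here is the classic Störmer-Verlet scheme applied to the test equation y″ = -y; this is a stand-in example, not the specific formulas of either paper:

```python
# Stormer-Verlet: y_{k+1} = 2 y_k - y_{k-1} + h^2 f(x_k, y_k),
# a symmetric two-step scheme for y'' = f(x, y).
import math

def verlet(f, x0, y0, v0, h, n):
    """Integrate y'' = f(x, y) for n steps of size h from (x0, y0, y'(x0)=v0)."""
    x, y = x0, y0
    # Taylor start-up step to get the fictitious previous value y(x0 - h).
    y_prev = y0 - h * v0 + 0.5 * h * h * f(x0, y0)
    for _ in range(n):
        y_next = 2.0 * y - y_prev + h * h * f(x, y)
        y_prev, y = y, y_next
        x += h
    return y

# Harmonic oscillator y'' = -y, y(0) = 1, y'(0) = 0: exact solution cos(x),
# so after one full period the solution should return to 1.
n = 10000
h = 2.0 * math.pi / n
y_end = verlet(lambda x, y: -y, 0.0, 1.0, 0.0, h, n)
```

The symmetry of the scheme gives it good long-term behavior on oscillatory problems, which is why methods of this family are popular for orbit integration.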

  14. [Place of reflexotherapy and some other methods of alternative medicine in modern medical practice].

    PubMed

    Boĭchak, M P; Sobetskiĭ, V V

    2010-01-01

The article assesses the role and place of nontraditional methods of treatment, including reflexotherapy, which are widely applied in hospitals. It also raises concern about the dismissive attitude of some scientists and health service managers toward reflexotherapy as a whole and toward one of its methods, acupuncture. The legislative situation concerning the training of reflexotherapy specialists over the last 15-20 years, not only in Ukraine but also abroad, is analyzed. The article draws a historical parallel between the use of medicamentous and nonmedicamentous methods of treatment. PMID:20608024

  15. Reconciling Data from Different Sources: Practical Realities of Using Mixed Methods to Identify Effective High School Practices

    ERIC Educational Resources Information Center

    Smith, Thomas M.; Cannata, Marisa; Haynes, Katherine Taylor

    2016-01-01

    Background/Context: Mixed methods research conveys multiple advantages to the study of complex phenomena and large organizations or systems. The benefits are derived from drawing on the strengths of qualitative methods to answer questions about how and why a phenomenon occurs and those of quantitative methods to examine how often a phenomenon…

  16. Practice Makes Perfect: Improving Students' Skills in Understanding and Avoiding Plagiarism with a Themed Methods Course

    ERIC Educational Resources Information Center

    Estow, Sarah; Lawrence, Eva K.; Adams, Kathrynn A.

    2011-01-01

    To address the issue of plagiarism, students in two undergraduate Research Methods and Analysis courses conducted, analyzed, and wrote up original research on the topic of plagiarism. Students in an otherwise identical course completed the same assignments but examined a different research topic. At the start and end of the semester, all students…

  17. A practical approach to object based requirements analysis

    NASA Technical Reports Server (NTRS)

    Drew, Daniel W.; Bishop, Michael

    1988-01-01

    Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain oriented analysis and development concept is based on entity relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object oriented design is also promoted by having requirements described in terms of objects. Presented is a five step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a base line requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model and a brief discussion on how this approach might be used in a large scale development effort.

  18. Practical estimates of field-saturated hydraulic conductivity of bedrock outcrops using a modified bottomless bucket method

    USGS Publications Warehouse

    Mirus, Benjamin B.; Perkins, Kim S.

    2012-01-01

    The bottomless bucket (BB) approach (Nimmo et al., 2009a) is a cost-effective method for rapidly characterizing field-saturated hydraulic conductivity Kfs of soils and alluvial deposits. This practical approach is of particular value for quantifying infiltration rates in remote areas with limited accessibility. A similar approach for bedrock outcrops is also of great value for improving quantitative understanding of infiltration and recharge in rugged terrain. We develop a simple modification to the BB method for application to bedrock outcrops, which uses a non-toxic, quick-drying silicone gel to seal the BB to the bedrock. These modifications to the field method require only minor changes to the analytical solution for calculating Kfs on soils. We investigate the reproducibility of the method with laboratory experiments on a previously studied calcarenite rock and conduct a sensitivity analysis to quantify uncertainty in our predictions. We apply the BB method on both bedrock and soil for sites on Pahute Mesa, which is located in a remote area of the Nevada National Security Site. The bedrock BB tests may require monitoring over several hours to days, depending on infiltration rates, which necessitates a cover to prevent evaporative losses. Our field and laboratory results compare well to Kfs values inferred from independent reports, which suggests the modified BB method can provide useful estimates and facilitate simple hypothesis testing. The ease with which the bedrock BB method can be deployed should facilitate more rapid in-situ data collection than is possible with alternative methods for quantitative characterization of infiltration into bedrock.
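    As a rough illustration of the falling-head style of analysis a BB test relies on, the sketch below estimates Kfs from initial and final ponded heads over an elapsed time. It uses the classic Darcy falling-head form, not the exact analytical solution of Nimmo et al. (2009a), and the parameter names are assumptions:

```python
import math

def kfs_falling_head(h0, hf, wetting_depth, elapsed_s):
    """Rough field-saturated hydraulic conductivity (m/s) from a
    falling-head ponded-infiltration test.

    h0, hf        : initial and final ponded head (m), h0 > hf
    wetting_depth : assumed depth L of the wetted zone (m)
    elapsed_s     : time for the head to fall from h0 to hf (s)

    Uses the simplified Darcy falling-head form
        Kfs = (L / t) * ln((h0 + L) / (hf + L)).
    The published BB analysis adds capillarity terms omitted here.
    """
    if not (h0 > hf >= 0 and wetting_depth > 0 and elapsed_s > 0):
        raise ValueError("need h0 > hf >= 0 and positive depth and time")
    L = wetting_depth
    return (L / elapsed_s) * math.log((h0 + L) / (hf + L))
```

    Because Kfs scales as 1/t for a given head drop, the slow infiltration rates on bedrock noted above translate directly into the hours-to-days monitoring times (and hence the evaporation cover) the authors describe.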

  19. A catalog of automated analysis methods for enterprise models.

    PubMed

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created to document and communicate the structure and state of the Business and Information Technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is typically based on human skills; because of the size and complexity of the models, the process can be complicated, and omissions or miscalculations are likely. This situation has fostered research on automated analysis methods to support analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels, so some analysis methods might not be applicable to all enterprise models. This paper presents the compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool. PMID:27047732
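    A minimal example of the kind of automated check such a catalog collects, using a hypothetical dictionary-based model rather than the paper's metamodel or tooling:

```python
def orphan_elements(model):
    """Simple automated analysis: flag model elements that are not
    referenced by any relationship (a common consistency check that
    is tedious and error-prone to do by hand on large models)."""
    elements = set(model["elements"])
    connected = set()
    for src, dst in model["relations"]:
        connected.update((src, dst))
    return sorted(elements - connected)

# Hypothetical enterprise model: elements plus directed relations.
model = {
    "elements": ["App", "Server", "Database", "LegacyQueue"],
    "relations": [("App", "Server"), ("Server", "Database")],
}
```

    Running the check over `model` flags `LegacyQueue` as unconnected. Mechanical checks like this scale to model sizes where human review would miss omissions, which is the motivation the abstract gives for automating analysis.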

  20. Reforming High School Science for Low-Performing Students Using Inquiry Methods and Communities of Practice

    NASA Astrophysics Data System (ADS)

    Bolden, Marsha Gail

    Some schools fall short of the high demand to increase science scores on state exams because low-performing students enter high school unprepared for high school science. Low-performing students are not successful in high school for many reasons; however, using inquiry methods has improved students' understanding of science concepts. The purpose of this qualitative research study was to investigate teachers' lived experiences with using inquiry methods to motivate low-performing high school science students in an inquiry-based program called Xtreem Science. Fifteen teachers were selected from the Xtreem Science program, a program designed to assist teachers in motivating struggling science students. The research questions involved understanding (a) teachers' experiences in using inquiry methods, (b) challenges teachers face in using inquiry methods, and (c) how teachers describe students' responses to inquiry methods. The data collection and analysis strategy centered on capturing and understanding the teachers' feelings, perceptions, and attitudes in their lived experience of teaching with inquiry methods and motivating struggling students. Analysis of interview responses revealed that teachers had positive experiences with inquiry, reported that it shaped their teaching style and approach to topics, and felt that inquiry methods improved student learning. Inquiry gave low-performing students opportunities to catch up and learn information that moved them to the next level of science courses. Implications for positive social change include providing teachers and school district leaders with information to help improve the performance of low-performing science students.