Science.gov

Sample records for practical analysis method

  1. A Practical Method of Policy Analysis by Simulating Policy Options

    ERIC Educational Resources Information Center

    Phelps, James L.

    2011-01-01

    This article focuses on a method of policy analysis that has evolved from the previous articles in this issue. The first section, "Toward a Theory of Educational Production," identifies concepts from science and achievement production to be incorporated into this policy analysis method. Building on Kuhn's (1970) discussion regarding paradigms, the…

  2. Practical Use of Computationally Frugal Model Analysis Methods

    SciTech Connect

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts

  3. Practical Use of Computationally Frugal Model Analysis Methods

    DOE PAGES Beta

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.

  4. Practical Use of Computationally Frugal Model Analysis Methods.

    PubMed

    Hill, Mary C; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2016-03-01

    Three challenges compromise the utility of mathematical models of groundwater and other environmental systems: (1) a dizzying array of model analysis methods and metrics make it difficult to compare evaluations of model adequacy, sensitivity, and uncertainty; (2) the high computational demands of many popular model analysis methods (requiring 1000s, 10,000s, or more model runs) make them difficult to apply to complex models; and (3) many models are plagued by unrealistic nonlinearities arising from the numerical model formulation and implementation. This study proposes a strategy to address these challenges through a careful combination of model analysis and implementation methods. In this strategy, computationally frugal model analysis methods (often requiring a few dozen parallelizable model runs) play a major role, and computationally demanding methods are used for problems where (relatively) inexpensive diagnostics suggest the frugal methods are unreliable. We also argue in favor of detecting and, where possible, eliminating unrealistic model nonlinearities; this increases the realism of the model itself and facilitates the application of frugal methods. Literature examples are used to demonstrate the use of frugal methods and associated diagnostics. We suggest that the strategy proposed in this paper would allow the environmental sciences community to achieve greater transparency and falsifiability of environmental models, and obtain greater scientific insight from ongoing and future modeling efforts. PMID:25810333
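
    A minimal sketch, under illustrative assumptions, of the kind of frugal analysis described above: one-at-a-time finite-difference perturbations yield dimensionless scaled sensitivities from roughly one extra model run per parameter, and the perturbed runs are independent and therefore parallelizable. The toy run_model function and its two parameters are hypothetical stand-ins for an environmental model; the composite scaled sensitivity printed at the end is one example of an inexpensive diagnostic.

```python
# Frugal local sensitivity analysis sketch: one-at-a-time finite differences.
# `run_model` is a hypothetical stand-in for an environmental model.
import numpy as np

def run_model(params):
    # Toy two-parameter exponential decay standing in for a real simulator.
    a, b = params
    t = np.linspace(0.0, 10.0, 25)
    return a * np.exp(-b * t)

def scaled_sensitivities(params, rel_step=0.01):
    params = np.asarray(params, dtype=float)
    base = run_model(params)
    sens = np.empty((base.size, params.size))
    for j, p in enumerate(params):
        dp = rel_step * p if p != 0 else rel_step
        perturbed = params.copy()
        perturbed[j] += dp
        # Dimensionless scaled sensitivity: (dy/dp) * p
        sens[:, j] = (run_model(perturbed) - base) / dp * p
    return sens

if __name__ == "__main__":
    S = scaled_sensitivities([2.0, 0.3])
    # Composite scaled sensitivity per parameter, a common frugal diagnostic.
    css = np.sqrt((S ** 2).mean(axis=0))
    print("composite scaled sensitivities:", css)
```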

  5. A Practical Method of Policy Analysis by Estimating Effect Size

    ERIC Educational Resources Information Center

    Phelps, James L.

    2011-01-01

    The previous articles on class size and other productivity research paint a complex and confusing picture of the relationship between policy variables and student achievement. Missing is a conceptual scheme capable of combining the seemingly unrelated research and dissimilar estimates of effect size into a unified structure for policy analysis and…

  6. A Topography Analysis Incorporated Optimization Method for the Selection and Placement of Best Management Practices

    PubMed Central

    Shen, Zhenyao; Chen, Lei; Xu, Liang

    2013-01-01

    Best Management Practices (BMPs) are one of the most effective methods to control nonpoint source (NPS) pollution at a watershed scale. In this paper, the use of a topography analysis incorporated optimization method (TAIOM) was proposed, which integrates topography analysis with cost-effective optimization. The surface status, slope and the type of land use were evaluated as inputs for the optimization engine. A genetic algorithm program was coded to obtain the final optimization. The TAIOM was validated in conjunction with the Soil and Water Assessment Tool (SWAT) in the Yulin watershed in Southwestern China. The results showed that the TAIOM was more cost-effective than traditional optimization methods. The distribution of selected BMPs throughout landscapes comprising relatively flat plains and gentle slopes suggests the need for a more operationally effective scheme, such as the TAIOM, to determine the practicability of BMPs before widespread adoption. The TAIOM developed in this study can easily be extended to other watersheds to help decision makers control NPS pollution. PMID:23349917
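
    The abstract notes that a genetic algorithm program was coded for the optimization engine. The sketch below is a generic genetic-algorithm search over BMP options, not the TAIOM itself; the per-field costs, load reductions, and the pollutant-reduction target are hypothetical.

```python
# Generic genetic algorithm for selecting one BMP option per field,
# trading off cost against pollutant-load reduction (illustrative data).
import numpy as np

rng = np.random.default_rng(1)
N_FIELDS, N_OPTIONS = 12, 4
COST = rng.uniform(1.0, 10.0, size=(N_FIELDS, N_OPTIONS))        # cost per field/option
REDUCTION = rng.uniform(0.0, 5.0, size=(N_FIELDS, N_OPTIONS))    # load reduction per field/option
TARGET = 30.0                                                     # required total reduction

def fitness(plan):
    idx = np.arange(N_FIELDS)
    cost = COST[idx, plan].sum()
    red = REDUCTION[idx, plan].sum()
    penalty = 100.0 * max(0.0, TARGET - red)      # penalize missing the target
    return -(cost + penalty)                      # higher is better

def evolve(pop_size=60, n_gen=200, p_mut=0.05):
    pop = rng.integers(0, N_OPTIONS, size=(pop_size, N_FIELDS))
    for _ in range(n_gen):
        scores = np.array([fitness(p) for p in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[: pop_size // 2]]                     # truncation selection
        children = parents[rng.integers(0, len(parents), pop_size - len(parents))].copy()
        mates = parents[rng.integers(0, len(parents), len(children))]
        cut = rng.integers(1, N_FIELDS, size=len(children))       # one-point crossover
        for i, c in enumerate(cut):
            children[i, c:] = mates[i, c:]
        mutate = rng.random(children.shape) < p_mut
        children[mutate] = rng.integers(0, N_OPTIONS, mutate.sum())
        pop = np.vstack([parents, children])
    best = max(pop, key=fitness)
    return best, -fitness(best)

if __name__ == "__main__":
    plan, cost_like = evolve()
    print("selected BMP option per field:", plan)
    print("cost plus penalty:", round(float(cost_like), 2))
```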

  7. Practical method for radioactivity distribution analysis in small-animal PET cancer studies

    PubMed Central

    Slavine, Nikolai V.; Antich, Peter P.

    2008-01-01

    We present a practical method for radioactivity distribution analysis in small-animal tumors and organs using positron emission tomography imaging with a calibrated source of known activity and size in the field of view. We reconstruct the imaged mouse together with a source under the same conditions, using an iterative method, Maximum Likelihood Expectation-Maximization with System Modeling, capable of delivering high resolution images. Corrections for the ratios of geometrical efficiencies, radioisotope decay in time and photon attenuation are included in the algorithm. We demonstrate reconstruction results for the amount of radioactivity within the scanned mouse in a sample study of osteolytic and osteoblastic bone metastasis from prostate cancer xenografts. Data acquisition was performed on the small-animal PET system, which was tested with different radioactive sources, phantoms and animals to achieve high sensitivity and spatial resolution. Our method uses high resolution images to determine the volume of an organ or tumor and the amount of its radioactivity, and offers the possibility of saving time and effort and of reducing the need to sacrifice animals. This method has utility for prognosis and quantitative analysis in small-animal cancer studies, and will enhance the assessment of characteristics of tumor growth, identifying metastases, and potentially determining the effectiveness of cancer treatment. This technique could also be useful for organ radioactivity dosimetry studies. PMID:18667322
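
    A minimal sketch of the reconstruction idea, assuming a toy system matrix: an MLEM iteration is run on simulated counts, and a calibration voxel of known activity in the field of view is used to convert reconstructed intensity into activity units. The system matrix, counts, and the 2.0 kBq calibration value are hypothetical; the authors' actual algorithm additionally includes system modeling and corrections for decay and attenuation.

```python
# MLEM reconstruction sketch with a known calibration source in the field of view.
import numpy as np

def mlem(A, y, n_iter=50):
    """A: (n_detectors, n_voxels) system matrix; y: measured counts."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                 # sensitivity per voxel
    for _ in range(n_iter):
        proj = A @ x
        proj[proj == 0] = 1e-12          # avoid division by zero
        x *= (A.T @ (y / proj)) / sens
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.uniform(0.0, 1.0, size=(40, 10))   # toy detection probabilities
    true_x = np.zeros(10)
    true_x[2] = 5.0                            # "tumor" voxel
    true_x[7] = 2.0                            # calibration source voxel (known 2.0 kBq, hypothetical)
    y = rng.poisson(A @ true_x)
    x_hat = mlem(A, y)
    # Scale the image so the calibration voxel matches its known activity.
    cal = 2.0 / x_hat[7]
    print("estimated tumor activity:", cal * x_hat[2])
```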

  8. Methods and practices used in incident analysis in the Finnish nuclear power industry.

    PubMed

    Suksi, Seija

    2004-07-26

    A study of the incident analysis methods and practices of the Finnish nuclear power plant operators Teollisuuden Voima Oy (TVO) and Fortum Power and Heat Oy (Fortum) was carried out by the Technical Research Centre of Finland (VTT) at the request of STUK at the end of the 1990s. The study aimed at providing a broad overview and suggestions for improvement of the whole organisational framework to support event investigation practices at the regulatory body and at the utilities. The main objective of the research was to evaluate the adequacy and reliability of event investigation analysis methods and practices in the Finnish nuclear power industry and, based on the results, to further develop them. The results and suggestions of the research are reviewed in the paper, and the corrective actions implemented in event investigation and operating experience procedures both at STUK and at the utilities are discussed as well. STUK has developed its own procedure for the risk-informed analysis of nuclear power plant events. The PSA-based event analysis method is used to assess the safety significance and importance measures associated with the unavailability of components and systems subject to Technical Specifications. The insights from recently performed PSA-based analyses are also briefly discussed in the paper. PMID:15231350

  9. Practical Methods for the Analysis of Voltage Collapse in Electric Power Systems: a Stationary Bifurcations Viewpoint.

    NASA Astrophysics Data System (ADS)

    Jean-Jumeau, Rene

    1993-03-01

    Voltage collapse (VC) is generally caused by either of two types of system disturbances: load variations and contingencies. In this thesis, we study VC resulting from load variations; this is termed static voltage collapse. The thesis approaches this type of voltage collapse in electrical power systems from a stationary bifurcations viewpoint, associating it with the occurrence of saddle node bifurcations (SNB) in the system. Approximate models are generically used in most VC analyses. We consider the validity of these models for the study of SNB and, thus, of voltage collapse. We justify the use of saddle node bifurcation as a model for VC in power systems. In particular, we prove that this leads to the definition of a model and--since load demand is used as a parameter for that model--of a mode of parameterization of that model in order to represent actual power demand variations within the power system network. Ill-conditioning of the set of nonlinear equations defining a dynamical system is a generic occurrence near the SNB point. We suggest a reparameterization of the set of nonlinear equations that avoids this problem. A new indicator of the proximity of voltage collapse, the voltage collapse index (VCI), is developed. A new (n + 1)-dimensional set of characteristic equations for the computation of the exact SNB point, replacing the standard (2n + 1)-dimensional one, is presented for general parameter-dependent nonlinear dynamical systems. These results are then applied to electric power systems for the analysis and prediction of voltage collapse. The new methods offer the potential of faster computation and greater flexibility. For reasons of theoretical development and clarity, the preceding methodologies are developed under the assumption of the absence of constraints on the system parameters and states, and the full differentiability of the functions defining the power system model. In the latter part of this thesis, we relax these
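
    A toy illustration, not taken from the thesis, of the saddle-node (nose-point) behavior it studies: in the classic two-bus power system the high- and low-voltage power-flow solutions coalesce where the discriminant of the load-bus voltage equation vanishes, and no equilibrium exists beyond that loading. The source voltage, line reactance, and power factor are illustrative assumptions.

```python
# Two-bus voltage collapse illustration: the nose point is reached when the
# discriminant of V^4 + (2QX - E^2) V^2 + X^2 (P^2 + Q^2) = 0 becomes negative.
import numpy as np

def load_bus_voltage(P, Q, E=1.0, X=0.5):
    """Return the high-voltage (stable) solution, or None past the saddle-node point."""
    b = 2 * Q * X - E ** 2
    c = X ** 2 * (P ** 2 + Q ** 2)
    disc = b ** 2 - 4 * c
    if disc < 0:
        return None                               # voltage collapse: no equilibrium
    return np.sqrt((-b + np.sqrt(disc)) / 2)      # high-voltage branch

if __name__ == "__main__":
    for P in np.arange(0.0, 1.2, 0.05):           # increase load at constant power factor
        V = load_bus_voltage(P, Q=0.2 * P)
        if V is None:
            print(f"saddle-node bifurcation reached near P = {P:.2f} p.u.")
            break
        print(f"P = {P:.2f} p.u., V = {V:.3f} p.u.")
```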

  10. Analysis of the upper massif of the craniofacial with the radial method - practical use

    PubMed Central

    Lepich, Tomasz; Dąbek, Józefa; Stompel, Daniel; Gielecki, Jerzy S.

    2011-01-01

    Introduction The analysis of the upper massif of the craniofacial (UMC) is widely used in many fields of science. The aim of the study was to create a high-resolution computer system, based on a digital information record and on vector graphics, that could enable dimension measuring and evaluation of craniofacial shape using the radial method. Material and methods The study was carried out on 184 skulls, in a good state of preservation, from the early Middle Ages. The examined skulls were fixed into Molisson's craniostat in the author's own modification. They were directed in space towards the Frankfurt plane and photographed in frontal norm with a digital camera. The parameters describing the plane and dimensional structure of the UMC and orbits were obtained through computer analysis of the function recordings picturing the craniofacial structures, using software combining raster graphics with vector graphics. Results Mean values of both orbits were compared separately for the male and female groups. In female skulls the comparison of the left and right side did not show statistically significant differences. In the male group, higher values were observed for the right side. Only the circularity index presented higher values for the left side. Conclusions Computer graphics, with the software used for analysing digital pictures of the UMC and orbits, increase the precision of measurements as well as the calculation possibilities. Recognition of the face in post mortem examination is crucial for those working on identification in anthropology and criminology laboratories. PMID:22291834

  11. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory--Determination of Trihalomethane Formation Potential, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Fram, Miranda S.; Bush, Noel

    2004-01-01

    An analytical method for the determination of the trihalomethane formation potential of water samples has been developed. The trihalomethane formation potential is measured by dosing samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine, and then analyzing the resulting trihalomethanes by purge and trap/gas chromatography equipped with an electron capture detector. Detailed explanations of the method and quality-control practices are provided. Method validation experiments showed that the trihalomethane formation potential varies as a function of time between sample collection and analysis, residual-free chlorine concentration, method of sample dilution, and the concentration of bromide in the sample.

  12. Animal Disease Import Risk Analysis--a Review of Current Methods and Practice.

    PubMed

    Peeler, E J; Reese, R A; Thrush, M A

    2015-10-01

    The application of risk analysis to the spread of disease with international trade in animals and their products, that is, import risk analysis (IRA), has been largely driven by the Sanitary and Phytosanitary (SPS) agreement of the World Trade Organization (WTO). The degree to which the IRA standard established by the World Organization for Animal Health (OIE), and associated guidance, meets the needs of the SPS agreement is discussed. The use of scenario trees is the core modelling approach used to represent the steps necessary for the hazard to occur. There is scope to elaborate scenario trees for commodity IRA so that the quantity of hazard at each step is assessed, which is crucial to the likelihood of establishment. The dependence between exposure and establishment suggests that they should fall within the same subcomponent. IRA undertaken for trade reasons must include an assessment of consequences to meet SPS criteria, but guidance is sparse. The integration of epidemiological and economic modelling may open a path for better methods. Matrices have been used in qualitative IRA to combine estimates of entry and exposure, and consequences with likelihood, but this approach has flaws and better methods are needed. OIE IRA standards and guidance indicate that the volume of trade should be taken into account, but offer no detail. Some published qualitative IRAs have assumed current levels and patterns of trade without specifying the volume of trade, which constrains the use of IRA to determine mitigation measures (to reduce risk to an acceptable level) and whether the principle of equivalence, fundamental to the SPS agreement, has been observed. It is questionable whether qualitative IRA can meet all the criteria set out in the SPS agreement. Nevertheless, scope exists to elaborate the current standards and guidance, so they better serve the principle of science-based decision-making. PMID:24237667

  13. Evaluating the clinical appropriateness of nurses' prescribing practice: method development and findings from an expert panel analysis

    PubMed Central

    Latter, Sue; Maben, Jill; Myall, Michelle; Young, Amanda

    2007-01-01

    Background The number of nurses independently prescribing medicines in England is rising steadily. There had been no attempt systematically to evaluate the clinical appropriateness of nurses' prescribing decisions. Aims (i) To establish a method of assessing the clinical appropriateness of nurses' prescribing decisions; (ii) to evaluate the prescribing decisions of a sample of nurses, using this method. Method A modified version of the Medication Appropriateness Index (MAI) was developed, piloted and subsequently used by seven medical prescribing experts to rate transcripts of 12 nurse prescriber consultations selected from a larger database of 118 audio‐recorded consultations collected as part of a national evaluation. Experts were also able to give written qualitative comments on each of the MAI dimensions applied to each of the consultations. Analysis Experts' ratings were analysed using descriptive statistics. Qualitative comments were subjected to a process of content analysis to identify themes within and across both MAI items and consultations. Results Experts' application of the modified MAI to transcripts of nurse prescriber consultations demonstrated validity and feasibility as a method of assessing the clinical appropriateness of nurses' prescribing decisions. In the majority of assessments made by the expert panel, nurses' prescribing decisions were rated as clinically appropriate on all nine items in the MAI. Conclusion A valid and feasible method of assessing the clinical appropriateness of nurses' prescribing practice has been developed using a modified MAI and transcripts of audio‐recorded consultations sent to a panel of prescribing experts. Prescribing nurses in this study were generally considered to be making clinically appropriate prescribing decisions. This approach to measuring prescribing appropriateness could be used as part of quality assurance in routine practice, as a method of identifying continuing professional development needs

  14. APA's Learning Objectives for Research Methods and Statistics in Practice: A Multimethod Analysis

    ERIC Educational Resources Information Center

    Tomcho, Thomas J.; Rice, Diana; Foels, Rob; Folmsbee, Leah; Vladescu, Jason; Lissman, Rachel; Matulewicz, Ryan; Bopp, Kara

    2009-01-01

    Research methods and statistics courses constitute a core undergraduate psychology requirement. We analyzed course syllabi and faculty self-reported coverage of both research methods and statistics course learning objectives to assess the concordance with APA's learning objectives (American Psychological Association, 2007). We obtained a sample of…

  15. Practical Thermal Evaluation Methods For HAC Fire Analysis In Type B Radioactive Material (RAM) Packages

    SciTech Connect

    Abramczyk, Glenn; Hensel, Stephen J; Gupta, Narendra K.

    2013-03-28

    Title 10 of the United States Code of Federal Regulations Part 71 for the Nuclear Regulatory Commission (10 CFR Part 71.73) requires that Type B radioactive material (RAM) packages satisfy certain Hypothetical Accident Conditions (HAC) thermal design requirements to ensure package safety during accidental fire conditions. Compliance with thermal design requirements can be met by prototype tests, analyses only, or a combination of tests and analyses. Normally, it is impractical to meet all the HAC requirements using tests only, and the analytical methods are too complex due to the multi-physics non-linear nature of the fire event. Therefore, a combination of tests and thermal analysis methods using commercial heat transfer software is used to meet the necessary design requirements. The authors, along with their colleagues at Savannah River National Laboratory in Aiken, SC, USA, have successfully used this 'tests and analyses' approach in the design and certification of several United States DOE/NNSA certified packages, e.g. 9975, 9977, 9978, 9979, H1700, and the Bulk Tritium Shipping Package (BTSP). This paper describes these methods, and it is hoped that RAM Type B package designers and analysts can use them for their applications.

  16. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: practical guidelines for the crystalline sponge method

    SciTech Connect

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng

    2015-01-01

    This report describes complete practical guidelines and insights for the crystalline sponge method, which have been derived through the first use of synchrotron radiation on these systems, and includes a procedure for faster synthesis of the sponges. These guidelines will be applicable to crystal sponge data collected at synchrotrons or in-house facilities, and will allow researchers to obtain reliable high-quality data and construct chemically and physically sensible models for guest structural determination. A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine

  17. A Critical Analysis of SocINDEX and Sociological Abstracts Using an Evaluation Method for the Practicing Bibliographer

    ERIC Educational Resources Information Center

    Mellone, James T.

    2010-01-01

    This study provides a database evaluation method for the practicing bibliographer that is more than a brief review yet less than a controlled experiment. The author establishes evaluation criteria in the context of the bibliographic instruction provided to meet the research requirements of undergraduate sociology majors at Queens College, City…

  18. Insight into Evaluation Practice: A Content Analysis of Designs and Methods Used in Evaluation Studies Published in North American Evaluation-Focused Journals

    ERIC Educational Resources Information Center

    Christie, Christina A.; Fleischer, Dreolin Nesbitt

    2010-01-01

    To describe the recent practice of evaluation, specifically method and design choices, the authors performed a content analysis on 117 evaluation studies published in eight North American evaluation-focused journals for a 3-year period (2004-2006). The authors chose this time span because it follows the scientifically based research (SBR)…

  19. Qualitative Approaches to Mixed Methods Practice

    ERIC Educational Resources Information Center

    Hesse-Biber, Sharlene

    2010-01-01

    This article discusses how methodological practices can shape and limit how mixed methods is practiced and makes visible the current methodological assumptions embedded in mixed methods practice that can shut down a range of social inquiry. The article argues that there is a "methodological orthodoxy" in how mixed methods is practiced that…

  20. Future methods in pharmacy practice research.

    PubMed

    Almarsdottir, A B; Babar, Z U D

    2016-06-01

    This article describes the current and future pharmacy practice scenario underpinning and guiding this research, and then suggests future directions and strategies for such research. First, it sets the scene by discussing the key drivers which could influence the change in pharmacy practice research. These are demographics, technology and professional standards. Second, deriving from this, it seeks to predict and forecast the future shifts in use of methodologies. Third, new research areas and availability of data impacting on future methods are discussed. These include the impact of aging information technology users on healthcare, understanding and responding to cultural and social disparities, implementing multidisciplinary initiatives to improve health care, medicines optimization and predictive risk analysis, and pharmacy as a business and health care institution. Finally, implications of the trends for pharmacy practice research methods are discussed. PMID:27209486

  1. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory-- Determination of Dissolved Organic Carbon in Water by High Temperature Catalytic Oxidation, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Bird, Susan M.; Fram, Miranda S.; Crepeau, Kathryn L.

    2003-01-01

    An analytical method has been developed for the determination of dissolved organic carbon concentration in water samples. This method includes the results of the tests used to validate the method and the quality-control practices used for dissolved organic carbon analysis. Prior to analysis, water samples are filtered to remove suspended particulate matter. A Shimadzu TOC-5000A Total Organic Carbon Analyzer in the nonpurgeable organic carbon mode is used to analyze the samples by high temperature catalytic oxidation. The analysis usually is completed within 48 hours of sample collection. The laboratory reporting level is 0.22 milligrams per liter.

  2. Systematic Review and Meta-Analysis of Studies Evaluating Diagnostic Test Accuracy: A Practical Review for Clinical Researchers-Part II. Statistical Methods of Meta-Analysis.

    PubMed

    Lee, Juneyoung; Kim, Kyung Won; Choi, Sang Hyun; Huh, Jimi; Park, Seong Ho

    2015-01-01

    Meta-analysis of diagnostic test accuracy studies differs from the usual meta-analysis of therapeutic/interventional studies in that it requires the simultaneous analysis of a pair of outcome measures, such as sensitivity and specificity, instead of a single outcome. Since sensitivity and specificity are generally inversely correlated and could be affected by a threshold effect, more sophisticated statistical methods are required for the meta-analysis of diagnostic test accuracy. Hierarchical models, including the bivariate model and the hierarchical summary receiver operating characteristic model, are increasingly being accepted as standard methods for meta-analysis of diagnostic test accuracy studies. We provide a conceptual review of statistical methods currently used and recommended for meta-analysis of diagnostic test accuracy studies. This article could serve as a methodological reference for those who perform systematic review and meta-analysis of diagnostic test accuracy studies. PMID:26576107
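
    A small sketch of the per-study quantities that bivariate and hierarchical models operate on: logit-transformed sensitivity and specificity with their within-study variances, with a 0.5 continuity correction applied to zero cells. The counts are hypothetical, and fitting the bivariate random-effects model itself (normally done with a mixed-model routine) is not shown.

```python
# Per-study inputs for a bivariate meta-analysis of diagnostic accuracy.
import numpy as np

def logit_pairs(tp, fn, tn, fp):
    tp, fn, tn, fp = [np.asarray(x, float) for x in (tp, fn, tn, fp)]
    # Continuity correction for studies with any zero cell.
    corr = ((tp == 0) | (fn == 0) | (tn == 0) | (fp == 0)) * 0.5
    tp, fn, tn, fp = tp + corr, fn + corr, tn + corr, fp + corr
    logit_se = np.log(tp / fn)          # logit sensitivity
    logit_sp = np.log(tn / fp)          # logit specificity
    var_se = 1 / tp + 1 / fn            # within-study variances
    var_sp = 1 / tn + 1 / fp
    return logit_se, logit_sp, var_se, var_sp

if __name__ == "__main__":
    tp = [45, 30, 60];  fn = [5, 10, 0]      # hypothetical 2x2 counts per study
    tn = [80, 55, 70];  fp = [20, 15, 10]
    print(logit_pairs(tp, fn, tn, fp))
```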

  3. Practical state of health estimation of power batteries based on Delphi method and grey relational grade analysis

    NASA Astrophysics Data System (ADS)

    Sun, Bingxiang; Jiang, Jiuchun; Zheng, Fangdan; Zhao, Wei; Liaw, Bor Yann; Ruan, Haijun; Han, Zhiqiang; Zhang, Weige

    2015-05-01

    The state of health (SOH) estimation is very critical to the battery management system to ensure the safety and reliability of EV battery operation. Here, we used a unique hybrid approach to enable complex SOH estimations. The approach hybridizes the Delphi method, known for its simplicity and effectiveness in applying weighting factors for complicated decision-making, and the grey relational grade analysis (GRGA) for multi-factor optimization. Six critical factors were used in the consideration for SOH estimation: peak power at 30% state-of-charge (SOC); capacity; the voltage drop at 30% SOC with a C/3 pulse; the temperature rises at the end of discharge and charge at 1C, respectively; and the open circuit voltage at the end of charge after a 1-h rest. The weighting of these factors for SOH estimation was scored by the 'experts' in the Delphi method, indicating the influencing power of each factor on SOH. The parameters for these factors expressing the battery state variations are optimized by GRGA. Eight battery cells were used to illustrate the principle and methodology to estimate the SOH by this hybrid approach, and the results were compared with those based on capacity and power capability. The contrast among different SOH estimations is discussed.
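
    A minimal sketch of the hybrid idea, assuming illustrative numbers: Delphi-style expert weights are combined with grey relational grade analysis to score cells against a fresh-cell reference. The factor weights, reference values, and cell data are assumptions, not the paper's values.

```python
# Delphi weights combined with grey relational grade analysis (GRGA) for an SOH-like score.
import numpy as np

def grey_relational_grade(data, reference, weights, rho=0.5):
    """data: (n_cells, n_factors) normalized factor values; reference: ideal
    (e.g., fresh-cell) values; weights: Delphi-style weights summing to 1."""
    delta = np.abs(data - reference)
    d_min, d_max = delta.min(), delta.max()
    coeff = (d_min + rho * d_max) / (delta + rho * d_max)   # grey relational coefficients
    return coeff @ weights                                   # weighted grade per cell

if __name__ == "__main__":
    # Six factors, e.g. peak power, capacity, voltage drop, two temperature rises, OCV.
    weights = np.array([0.25, 0.30, 0.15, 0.10, 0.10, 0.10])  # hypothetical Delphi scores
    reference = np.ones(6)                                     # fresh-cell reference
    cells = np.array([[0.95, 0.92, 0.90, 0.88, 0.91, 0.97],
                      [0.70, 0.65, 0.60, 0.72, 0.68, 0.80]])
    print("SOH-like grades:", grey_relational_grade(cells, reference, weights))
```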

  4. Practical method for analysis and design of slender reinforced concrete columns subjected to biaxial bending and axial loads

    NASA Astrophysics Data System (ADS)

    Bouzid, T.; Demagh, K.

    2011-03-01

    Reinforced and concrete-encased composite columns of arbitrarily shaped cross sections subjected to biaxial bending and axial loads are commonly used in many structures. For this purpose, an iterative numerical procedure for the strength analysis and design of short and slender reinforced concrete columns with a square cross section under biaxial bending and an axial load, using an EC2 stress-strain model, is presented in this paper. The computational procedure takes into account the nonlinear behavior of the materials (i.e., concrete and reinforcing bars) and includes the second-order effects due to the additional eccentricity of the applied axial load by the Moment Magnification Method. The ability of the proposed method and its formulation has been tested by comparing its results with the experimental ones reported by some authors. This comparison has shown a good degree of agreement and accuracy between the experimental and theoretical results. An average ratio (proposed to test) of 1.06 with a deviation of 9% is achieved.
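
    A minimal sketch of a moment magnification check of the kind referenced above, applied about a single axis: the first-order moment is amplified by Cm / (1 - N / Ncr), with Ncr the Euler buckling load for the effective length. The stiffness estimate and loads are illustrative assumptions, not the paper's EC2 procedure.

```python
# Moment magnification sketch for a slender column about one axis (assumes N < Ncr).
import math

def magnified_moment(M1, N, EI_eff, L0, Cm=1.0):
    """M1: first-order moment (kN*m); N: axial load (kN);
    EI_eff: effective flexural stiffness (kN*m^2); L0: effective length (m)."""
    N_cr = math.pi ** 2 * EI_eff / L0 ** 2         # Euler critical load
    delta = max(1.0, Cm / (1.0 - N / N_cr))        # magnification factor
    return delta * M1

if __name__ == "__main__":
    # 400 mm square column with a rough EI_eff ~ 0.3 * Ec * Ig (hypothetical values).
    Ec = 31e6                                       # kN/m^2
    Ig = 0.4 ** 4 / 12                              # m^4
    EI_eff = 0.3 * Ec * Ig
    print("magnified design moment (kN*m):",
          magnified_moment(M1=120.0, N=1500.0, EI_eff=EI_eff, L0=6.0))
```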

  5. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: Practical guidelines for the crystalline sponge method

    DOE PAGES Beta

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high

  6. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: Practical guidelines for the crystalline sponge method

    SciTech Connect

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of

  7. Methods of Genomic Competency Integration in Practice

    PubMed Central

    Jenkins, Jean; Calzone, Kathleen A.; Caskey, Sarah; Culp, Stacey; Weiner, Marsha; Badzek, Laurie

    2015-01-01

    Purpose Genomics is increasingly relevant to health care, necessitating support for nurses to incorporate genomic competencies into practice. The primary aim of this project was to develop, implement, and evaluate a year-long genomic education intervention that trained, supported, and supervised institutional administrator and educator champion dyads to increase nursing capacity to integrate genomics through assessments of program satisfaction and institutional achieved outcomes. Design Longitudinal study of 23 Magnet Recognition Program® Hospitals (21 intervention, 2 controls) participating in a 1-year new competency integration effort aimed at increasing genomic nursing competency and overcoming barriers to genomics integration in practice. Methods Champion dyads underwent genomic training consisting of one in-person kick-off training meeting followed by monthly education webinars. Champion dyads designed institution-specific action plans detailing objectives, methods or strategies used to engage and educate nursing staff, timeline for implementation, and outcomes achieved. Action plans focused on a minimum of seven genomic priority areas: champion dyad personal development; practice assessment; policy content assessment; staff knowledge needs assessment; staff development; plans for integration; and anticipated obstacles and challenges. Action plans were updated quarterly, outlining progress made as well as inclusion of new methods or strategies. Progress was validated through virtual site visits with the champion dyads and chief nursing officers. Descriptive data were collected on all strategies or methods utilized, and timeline for achievement. Descriptive data were analyzed using content analysis. Findings The complexity of the competency content and the uniqueness of social systems and infrastructure resulted in a significant variation of champion dyad interventions. Conclusions Nursing champions can facilitate change in genomic nursing capacity through

  8. Doing Conversation Analysis: A Practical Guide.

    ERIC Educational Resources Information Center

    ten Have, Paul

    Noting that conversation analysis (CA) has developed into one of the major methods of analyzing speech in the disciplines of communications, linguistics, anthropology and sociology, this book demonstrates in a practical way how to become a conversation analyst. As well as providing an overall introduction to the approach, it focuses on the…

  9. Intermittent hypoxia training as non-pharmacologic therapy for cardiovascular diseases: Practical analysis on methods and equipment.

    PubMed

    Serebrovskaya, Tatiana V; Xi, Lei

    2016-09-01

    The global industrialization has brought profound lifestyle changes and environmental pollution, leading to higher risks of cardiovascular diseases. Such tremendous challenges outweigh the benefits of major advances in pharmacotherapies (such as statins, antihypertensive, and antithrombotic drugs) and exacerbate the public healthcare burden. One of the promising complementary non-pharmacologic therapies is the so-called intermittent hypoxia training (IHT), which activates the human body's own natural defenses through adaptation to intermittent hypoxia. This review article primarily focuses on the practical questions concerning the utilization of IHT as a non-pharmacologic therapy against cardiovascular diseases in humans. Evidence accumulated in the past five decades of research in healthy men and patients has suggested that short-term daily sessions consisting of 3-4 bouts of 5-7 min exposures to 12-10% O2, alternating with normoxic durations, for 2-3 weeks can result in remarkable beneficial effects in the treatment of cardiovascular diseases such as hypertension, coronary heart disease, and heart failure. Special attention is paid to the therapeutic effects of different IHT models, along with an introduction of the variety of specialized facilities and equipment available for IHT, including hypobaric chambers, hypoxic gas mixture delivery equipment (rooms, tents, face masks), and portable rebreathing devices. Further clinical trials and thorough evaluations of the risks versus benefits of IHT are much needed to develop a series of standardized and practical guidelines for IHT. Taken together, we can envisage a bright future for IHT to play a more significant role in preventive and complementary medicine against cardiovascular diseases. PMID:27407098

  10. Development and application to clinical practice of a validated HPLC method for the analysis of β-glucocerebrosidase in Gaucher disease.

    PubMed

    Colomer, E Gras; Gómez, M A Martínez; Alvarez, A González; Martí, M Climente; Moreno, P León; Zarzoso, M Fernández; Jiménez-Torres, N V

    2014-03-01

    The main objective of our study is to develop a simple, fast and reliable method for measuring β-glucocerebrosidase activity in the leukocytes of Gaucher patients in clinical practice. This measurement may be a useful marker to drive dose selection and early clinical decision making in enzyme replacement therapy. We measure the enzyme activity by high-performance liquid chromatography with ultraviolet detection and 4-nitrophenyl-β-d-glucopyranoside as substrate. A cohort of eight Gaucher patients treated with enzyme replacement therapy and ten healthy controls were tested; the median enzyme activity value was 20.57 mU/ml (interquartile range 19.92-21.53 mU/ml) in patients and the mean was 24.73 mU/ml (24.12-25.34 mU/ml) in the reference group, which allowed the establishment of the normal range of β-glucocerebrosidase activity. The proposed method for measuring leukocyte glucocerebrosidase activity is fast, easy to use, inexpensive and reliable. Furthermore, significant differences between the two populations were observed (p=0.008). This suggests that discerning between patients and healthy individuals and providing an approach to enzyme dosage optimization is feasible. This method could be considered a decision support tool for clinical monitoring. Our study is a first approach to in-depth analysis of enzyme replacement therapy and optimization of dosing therapies. PMID:24447963

  11. Assessment of management in general practice: validation of a practice visit method.

    PubMed Central

    van den Hombergh, P; Grol, R; van den Hoogen, H J; van den Bosch, W J

    1998-01-01

    BACKGROUND: Practice management (PM) in general practice is as yet ill-defined; a systematic description of its domain, as well as a valid method to assess it, are necessary for research and assessment. AIM: To develop and validate a method to assess PM of general practitioners (GPs) and practices. METHOD: Relevant and potentially discriminating indicators were selected from a systematic framework of 2410 elements of PM to be used in an assessment method (VIP = visit instrument PM). The method was first tested in a pilot study and, after revision, was evaluated in order to select discriminating indicators and to determine validity of dimensions (factor and reliability analysis, linear regression). RESULTS: One hundred and ten GPs were assessed with the practice visit method using 249 indicators; 208 of these discriminated sufficiently at practice level or at GP level. Factor analysis resulted in 34 dimensions and in a taxonomy of PM. Dimensions and indicators showed marked variation between GPs and practices. Training practices scored higher on five dimensions; single-handed and dispensing practices scored lower on delegated tasks, but higher on accessibility and availability. CONCLUSION: A visit method to assess PM has been developed and its validity studied systematically. The taxonomy and dimensions of PM were in line with other classifications. Selection of a balanced number of useful and relevant indicators was nevertheless difficult. The dimensions could discriminate between groups of GPs and practices, establishing the value of the method for assessment. The VIP method could be an important contribution to the introduction of continuous quality improvement in the profession. PMID:10198481

  12. Statistical log analysis made practical

    SciTech Connect

    Mitchell, W.K.; Nelson, R.J.

    1991-06-01

    This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.
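
    A minimal sketch of the inverse (statistical) approach to log analysis, under textbook assumptions: formation parameters (porosity and water saturation) are estimated by weighted nonlinear least squares from tool responses (bulk density and Archie resistivity). The response equations, constants, and measurements are illustrative and are not taken from the paper.

```python
# Inverse log analysis sketch: fit (porosity, water saturation) to tool responses.
import numpy as np
from scipy.optimize import least_squares

RHO_MA, RHO_F = 2.65, 1.0            # matrix and fluid density, g/cc (assumed)
A, M, N, RW = 1.0, 2.0, 2.0, 0.05    # Archie constants and water resistivity (assumed)

def forward(params):
    phi, sw = params
    rho_b = phi * RHO_F + (1 - phi) * RHO_MA          # density response
    rt = A * RW / (phi ** M * sw ** N)                # Archie resistivity response
    return np.array([rho_b, rt])

def residuals(params, measured, sigma):
    return (forward(params) - measured) / sigma       # weighted residuals

if __name__ == "__main__":
    measured = np.array([2.31, 8.0])                  # hypothetical log readings
    sigma = np.array([0.02, 0.5])                     # measurement uncertainties
    sol = least_squares(residuals, x0=[0.2, 0.5],
                        bounds=([0.01, 0.01], [0.4, 1.0]),
                        args=(measured, sigma))
    print("porosity, water saturation:", sol.x)
```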

  13. Exergy analysis: Principles and practice

    NASA Astrophysics Data System (ADS)

    Moran, M. J.; Sciubba, E.

    1994-04-01

    The importance of the goal of developing systems that effectively use nonrenewable energy resources such as oil, natural gas, and coal is apparent. The method of exergy analysis is well suited for furthering this goal, for it enables the location, type and true magnitude of waste and loss to be determined. Such information can be used to design new systems and to reduce the inefficiency of existing systems. This paper provides a brief survey of both exergy principles and the current literature of exergy analysis with emphasis on areas of application.
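
    For reference, the standard relations such an analysis rests on (textbook forms, not quoted from the abstract): the specific flow exergy of a stream relative to a dead state at T0 and p0, and the exergy destruction rate in a control volume expressed through entropy production.

```latex
% Specific flow exergy and exergy destruction rate (standard forms).
e_f = (h - h_0) - T_0 (s - s_0) + \frac{V^2}{2} + g z,
\qquad
\dot{E}_d = T_0 \, \dot{\sigma}_{\mathrm{cv}}
```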

  14. A Method for Optimizing Waste Management and Disposal Practices Using a Group-Based Uncertainty Model for the Analysis of Characterization Data - 13191

    SciTech Connect

    Simpson, A.; Clapham, M.; Lucero, R.; West, J.

    2013-07-01

    It is a universal requirement for characterization of radioactive waste, that the consignor shall calculate and report a Total Measurement Uncertainty (TMU) value associated with each of the measured quantities such as nuclide activity. For Non-destructive Assay systems, the TMU analysis is typically performed on an individual container basis. However, in many cases, the waste consignor treats, transports, stores and disposes of containers in groups for example by over-packing smaller containers into a larger container or emplacing containers into groups for final disposal. The current standard practice for container-group data analysis is usually to treat each container as independent and uncorrelated and use a simple summation / averaging method (or in some cases summation of TMU in quadrature) to define the overall characteristics and associated uncertainty in the container group. In reality, many groups of containers are assayed on the same system, so there will be a large degree of co-dependence in the individual uncertainty elements. Many uncertainty terms may be significantly reduced when addressing issues such as the source position and variability in matrix contents over large populations. The systematic terms encompass both inherently 'two-directional' random effects (e.g. variation of source position) and other terms that are 'one-directional' i.e. designed to account for potential sources of bias. An analysis has been performed with population groups of a variety of non-destructive assay platforms in order to define a quantitative mechanism for waste consignors to determine overall TMU for batches of containers that have been assayed on the same system. (authors)
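
    A minimal sketch of the contrast drawn above, with hypothetical values: independent (random) per-container components add in quadrature at group level, while systematic components shared by containers assayed on the same system are treated as fully correlated and add linearly.

```python
# Group-level Total Measurement Uncertainty (TMU) sketch for a batch of containers.
import numpy as np

def group_tmu(activities, u_random, u_systematic):
    """activities, u_random, u_systematic: per-container arrays (same units)."""
    total_activity = activities.sum()
    u_rand_group = np.sqrt((u_random ** 2).sum())     # independent terms: quadrature
    u_sys_group = u_systematic.sum()                  # fully correlated terms: linear
    u_total = np.sqrt(u_rand_group ** 2 + u_sys_group ** 2)
    return total_activity, u_total

if __name__ == "__main__":
    act = np.array([1.2, 0.8, 2.5])                   # e.g. nuclide activity per drum (hypothetical)
    u_r = 0.10 * act                                  # 10% random component
    u_s = 0.05 * act                                  # 5% shared systematic component
    total, u = group_tmu(act, u_r, u_s)
    print(f"group activity = {total:.2f} +/- {u:.2f}")
```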

  15. Evaluation of agricultural best-management practices in the Conestoga River headwaters, Pennsylvania; methods of data collection and analysis and description of study areas

    USGS Publications Warehouse

    Chichester, Douglas C.

    1988-01-01

    The U.S. Geological Survey is conducting a water quality study as part of the nationally implemented Rural Clean Water Program in the headwaters of the Conestoga River, Pennsylvania. The study, which began in 1982, was designed to determine the effect of agricultural best management practices on surface- and groundwater quality. The study was concentrated in four areas within the intensively farmed, carbonate rock terrane located predominantly in Lancaster County, Pennsylvania. These areas were divided into three monitoring components: (1) a Regional study area (188 sq mi); (2) a Small Watershed study area (5.82 sq mi); and (3) two field site study areas, Field-Site 1 (22.1 acres) and Field-Site 2 (47.5 acres). The type of water quality data and the methods of data collection and analysis are presented. The monitoring strategy and description of the study areas are discussed. The locations and descriptions for all data collection locations at the four study areas are provided. (USGS)

  16. The Sherlock Holmes method in clinical practice.

    PubMed

    Sopeña, B

    2014-04-01

    This article lists the integral elements of the Sherlock Holmes method, which is based on the intelligent collection of information through detailed observation, careful listening and thorough examination. The information thus obtained is analyzed to develop the main and alternative hypotheses, which are shaped during the deductive process until the key leading to the solution is revealed. The Holmes investigative method applied to clinical practice highlights the advisability of having physicians reason through and seek out the causes of the disease with the data obtained from acute observation, a detailed review of the medical history and careful physical examination. PMID:24457141

  17. Selecting Needs Analysis Methods.

    ERIC Educational Resources Information Center

    Newstrom, John W.; Lilyquist, John M.

    1979-01-01

    Presents a contingency model for decision making with regard to needs analysis methods. Focus is on 12 methods with brief discussion of their defining characteristics and some operational guidelines for their use. (JOW)

  18. Practical Considerations for Using Exploratory Factor Analysis in Educational Research

    ERIC Educational Resources Information Center

    Beavers, Amy S.; Lounsbury, John W.; Richards, Jennifer K.; Huck, Schuyler W.; Skolits, Gary J.; Esquivel, Shelley L.

    2013-01-01

    The uses and methodology of factor analysis are widely debated and discussed, especially the issues of rotational use, methods of confirmatory factor analysis, and adequate sample size. The variety of perspectives and often conflicting opinions can lead to confusion among researchers about best practices for using factor analysis. The focus of the…

  19. Guidance for using mixed methods design in nursing practice research.

    PubMed

    Chiang-Hanisko, Lenny; Newman, David; Dyess, Susan; Piyakong, Duangporn; Liehr, Patricia

    2016-08-01

    The mixed methods approach purposefully combines both quantitative and qualitative techniques, enabling a multi-faceted understanding of nursing phenomena. The purpose of this article is to introduce three mixed methods designs (parallel; sequential; conversion) and highlight interpretive processes that occur with the synthesis of qualitative and quantitative findings. Real world examples of research studies conducted by the authors will demonstrate the processes leading to the merger of data. The examples include: research questions; data collection procedures and analysis with a focus on synthesizing findings. Based on experience with mixed methods studies, the authors introduce two synthesis patterns (complementary; contrasting), considering application for practice and implications for research. PMID:27397810

  20. Practical approaches for design and analysis of clinical trials of infertility treatments: crossover designs and the Mantel-Haenszel method are recommended.

    PubMed

    Takada, Michihiro; Sozu, Takashi; Sato, Tosiya

    2015-01-01

    Crossover designs have some advantages over standard clinical trial designs and they are often used in trials evaluating the efficacy of treatments for infertility. However, clinical trials of infertility treatments violate a fundamental condition of crossover designs, because women who become pregnant in the first treatment period are not treated in the second period. In previous research, to deal with this problem, some new designs, such as re-randomization designs, and analysis methods including the logistic mixture model and the beta-binomial mixture model were proposed. Although the performance of these designs and methods has previously been evaluated in large-scale clinical trials with sample sizes of more than 1000 per group, the actual sample sizes of infertility treatment trials are usually around 100 per group. The most appropriate design and analysis for these moderate-scale clinical trials are currently unclear. In this study, we conducted simulation studies to determine the appropriate design and analysis method of moderate-scale clinical trials for irreversible endpoints by evaluating the statistical power and bias in the treatment effect estimates. The Mantel-Haenszel method had similar power and bias to the logistic mixture model. The crossover designs had the highest power and the smallest bias. We recommend using a combination of the crossover design and the Mantel-Haenszel method for two-period, two-treatment clinical trials with irreversible endpoints. PMID:25776032
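
    A minimal sketch of the recommended Mantel-Haenszel estimator, here the common odds ratio pooled across strata (for example, the two periods of a crossover design). The 2x2 counts are hypothetical and only illustrate the computation.

```python
# Mantel-Haenszel common odds ratio across strata (e.g. treatment periods).
import numpy as np

def mantel_haenszel_or(tables):
    """tables: iterable of 2x2 arrays [[a, b], [c, d]] per stratum,
    where rows are treatment/control and columns are event/no event."""
    num = den = 0.0
    for t in tables:
        a, b = t[0]
        c, d = t[1]
        n = a + b + c + d
        num += a * d / n
        den += b * c / n
    return num / den

if __name__ == "__main__":
    period1 = np.array([[18, 32], [10, 40]])   # hypothetical pregnancy counts, period 1
    period2 = np.array([[12, 28], [7, 33]])    # hypothetical pregnancy counts, period 2
    print("MH odds ratio:", mantel_haenszel_or([period1, period2]))
```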

  1. An Online Forum As a Qualitative Research Method: Practical Issues

    PubMed Central

    Im, Eun-Ok; Chee, Wonshik

    2008-01-01

    Background Despite positive aspects of online forums as a qualitative research method, very little is known about practical issues involved in using online forums for data collection, especially for a qualitative research project. Objectives The purpose of this paper is to describe the practical issues that the researchers encountered in implementing an online forum as a qualitative component of a larger study on cancer pain experience. Method Throughout the study process, the research staff recorded issues ranging from minor technical problems to serious ethical dilemmas as they arose and wrote memos about them. The memos and written records of discussions were reviewed and analyzed using the content analysis suggested by Weber. Results Two practical issues related to credibility were identified: a high response and retention rate and automatic transcripts. An issue related to dependability was the participants’ easy forgetfulness. The issues related to confirmability were difficulties in theoretical saturation and unstandardized computer and Internet jargon. A security issue related to hacking attempts was noted as well. Discussion The analysis of these issues suggests several implications for future researchers who want to use online forums as a qualitative data collection method. PMID:16849979

  2. Comprehensive rotorcraft analysis methods

    NASA Technical Reports Server (NTRS)

    Stephens, Wendell B.; Austin, Edward E.

    1988-01-01

    The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustic, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).

  3. System based practice: a concept analysis

    PubMed Central

    YAZDANI, SHAHRAM; HOSSEINI, FAKHROLSADAT; AHMADY, SOLEIMAN

    2016-01-01

    Introduction Systems-Based Practice (SBP) is one of the six competencies introduced by the ACGME for physicians to provide high quality of care and also the most challenging of them in performance, training, and evaluation of medical students. This concept analysis clarifies the concept of SBP by identifying its components to make it possible to differentiate it from other similar concepts. For proper training of SBP and to ensure these competencies in physicians, it is necessary to have an operational definition, and SBP’s components must be precisely defined in order to provide valid and reliable assessment tools. Methods Walker & Avant’s approach to concept analysis was performed in eight stages: choosing a concept, determining the purpose of analysis, identifying all uses of the concept, defining attributes, identifying a model case, identifying borderline, related, and contrary cases, identifying antecedents and consequences, and defining empirical referents. Results Based on the analysis undertaken, the attributes of SBP include knowledge of the system, balanced decisions between patients’ needs and system goals, effective role playing in the interprofessional health care team, system-level health advocacy, and acting for system improvement. System thinking and a functional system are antecedents, and system goals are consequences. A model case, as well as borderline and contrary cases of SBP, has been introduced. Conclusion The identification of SBP attributes in this study contributes to the body of knowledge in SBP and reduces the ambiguity of this concept to make it possible to apply it in the training of different medical specialties. Also, it would be possible to develop and use more precise tools to evaluate SBP competency by using the empirical referents of the analysis. PMID:27104198

  4. Achieving Integration in Mixed Methods Designs—Principles and Practices

    PubMed Central

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-01-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent to which the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

  5. A collection of research reporting, theoretical analysis, and practical applications in science education: Examining qualitative research methods, action research, educator-researcher partnerships, and constructivist learning theory

    NASA Astrophysics Data System (ADS)

    Hartle, R. Todd

    2007-12-01

    Educator-researcher partnerships are increasingly being used to improve the teaching of science. Chapter 1 provides a summary of the literature concerning partnerships, and examines the justification of qualitative methods in studying these relationships. It also justifies the use of Participatory Action Research (PAR). Empirically-based studies of educator-researcher partnership relationships are rare despite investments in their implementation by the National Science Foundation (NSF) and others. Chapter 2 describes a qualitative research project in which participants in an NSF GK-12 fellowship program were studied using informal observations, focus groups, personal interviews, and journals to identify and characterize the cultural factors that influenced the relationships between the educators and researchers. These factors were organized into ten critical axes encompassing a range of attitudes, behaviors, or values defined by two stereotypical extremes. These axes were: (1) Task Dictates Context vs. Context Dictates Task; (2) Introspection vs. Extroversion; (3) Internal vs. External Source of Success; (4) Prior Planning vs. Implementation Flexibility; (5) Flexible vs. Rigid Time Sense; (6) Focused Time vs. Multi-tasking; (7) Specific Details vs. General Ideas; (8) Critical Feedback vs. Encouragement; (9) Short Procedural vs. Long Content Repetition; and (10) Methods vs. Outcomes are Well Defined. Another ten important stereotypical characteristics, which did not fit the structure of an axis, were identified and characterized. The educator stereotypes were: (1) Rapport/Empathy; (2) Like Kids; (3) People Management; (4) Communication Skills; and (5) Entertaining. The researcher stereotypes were: (1) Community Collaboration; (2) Focus Intensity; (3) Persistent; (4) Pattern Seekers; and (5) Curiosity/Skeptical. Chapter 3 summarizes the research presented in chapter 2 into a practical guide for participants and administrators of educator-researcher partnerships

  6. Systemic accident analysis: examining the gap between research and practice.

    PubMed

    Underwood, Peter; Waterson, Patrick

    2013-06-01

    The systems approach is arguably the dominant concept within accident analysis research. Viewing accidents as a result of uncontrolled system interactions, it forms the theoretical basis of various systemic accident analysis (SAA) models and methods. Despite the proposed benefits of SAA, such as an improved description of accident causation, evidence within the scientific literature suggests that these techniques are not being used in practice and that a research-practice gap exists. The aim of this study was to explore the issues stemming from research and practice which could hinder the awareness, adoption and usage of SAA. To achieve this, semi-structured interviews were conducted with 42 safety experts from ten countries and a variety of industries, including rail, aviation and maritime. This study suggests that the research-practice gap should be closed and efforts to bridge the gap should focus on ensuring that systemic methods meet the needs of practitioners and improving the communication of SAA research. PMID:23542136

  7. Discourse analysis in general practice: a sociolinguistic approach.

    PubMed

    Nessa, J; Malterud, K

    1990-06-01

    It is a simple but important fact that as general practitioners we talk to our patients. The quality of the conversation is of vital importance for the outcome of the consultation. The purpose of this article is to discuss a methodological tool borrowed from sociolinguistics--discourse analysis. To assess the suitability of this method for analysis of general practice consultations, the authors have performed a discourse analysis of one single consultation. Our experiences are presented here. PMID:2369986

  8. Description of practice as an ambulatory care nurse: psychometric properties of a practice-analysis survey.

    PubMed

    Baghi, Heibatollah; Panniers, Teresa L; Smolenski, Mary C

    2007-01-01

    Changes within nursing demand that a specialty conduct periodic, appropriate practice analyses to continually validate itself against preset standards. This study explicates practice analysis methods using ambulatory care nursing as an exemplar. Data derived from a focus group technique were used to develop a survey that was completed by 499 ambulatory care nurses. The validity of the instrument was assessed using principal components analysis; reliability was estimated using Cronbach's alpha coefficient. The focus group with ambulatory care experts produced 34 knowledge and activity statements delineating ambulatory care nursing practice. The survey data produced five factors accounting for 71% of variance in the data. The factors were identified as initial patient assessment, professional nursing issues and standards, client care management skills, technical/clinical skills, and system administrative operations. It was concluded that practice analyses delineate a specialty and provide input for certification examinations aimed at measuring excellence in a field of nursing. PMID:17665821
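
    As a rough illustration of the psychometric steps named above (not the authors' code), the following sketch computes Cronbach's alpha and the variance explained by the first five principal components for a hypothetical response matrix; the data, the 1-5 rating scale, and the use of the correlation matrix are assumptions made for the example.

        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents, k_items) array of survey responses."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

        rng = np.random.default_rng(0)
        responses = rng.integers(1, 6, size=(499, 34)).astype(float)  # hypothetical 1-5 ratings

        alpha = cronbach_alpha(responses)

        # Principal components from the correlation matrix; the survey paper reports
        # five factors explaining about 71% of the variance.
        corr = np.corrcoef(responses, rowvar=False)
        eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
        explained = eigvals[:5].sum() / eigvals.sum()
        print(f"Cronbach's alpha = {alpha:.2f}, variance explained by 5 PCs = {explained:.1%}")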

  9. Science Teaching Methods: A Rationale for Practices

    ERIC Educational Resources Information Center

    Osborne, Jonathan

    2011-01-01

    This article is a version of the talk given by Jonathan Osborne as the Association for Science Education (ASE) invited lecturer at the National Science Teachers' Association Annual Convention in San Francisco, USA, in April 2011. The article provides an explanatory justification for teaching about the practices of science in school science that…

  10. Method of analysis at the U.S. Geological Survey California Water Science Center, Sacramento Laboratory - determination of haloacetic acid formation potential, method validation, and quality-control practices

    USGS Publications Warehouse

    Zazzi, Barbara C.; Crepeau, Kathryn L.; Fram, Miranda S.; Bergamaschi, Brian A.

    2005-01-01

    An analytical method for the determination of haloacetic acid formation potential of water samples has been developed by the U.S. Geological Survey California Water Science Center Sacramento Laboratory. The haloacetic acid formation potential is measured by dosing water samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine. The haloacetic acids formed are bromochloroacetic acid, bromodichloroacetic acid, dibromochloroacetic acid, dibromoacetic acid, dichloroacetic acid, monobromoacetic acid, monochloroacetic acid, tribromoacetic acid, and trichloroacetic acid. They are extracted, methylated, and then analyzed using a gas chromatograph equipped with an electron capture detector. Method validation experiments were performed to determine the method accuracy, precision, and detection limit for each of the compounds. Method detection limits for these nine haloacetic acids ranged from 0.11 to 0.45 microgram per liter. Quality-control practices include the use of blanks, quality-control samples, calibration verification standards, surrogate recovery, internal standard, matrix spikes, and duplicates.

  11. Method of analysis and quality-assurance practices by the U. S. Geological Survey Organic Geochemistry Research Group; determination of four selected mosquito insecticides and a synergist in water using liquid-liquid extraction and gas chrom

    USGS Publications Warehouse

    Zimmerman, L.R.; Strahan, A.P.; Thurman, E.M.

    2001-01-01

    A method of analysis and quality-assurance practices were developed for the determination of four mosquito insecticides (malathion, methoprene, phenothrin, and resmethrin) and one synergist (piperonyl butoxide) in water. The analytical method uses liquid-liquid extraction (LLE) and gas chromatography/mass spectrometry (GC/MS). Good precision and accuracy were demonstrated in reagent water, urban surface water, and ground water. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 50 nanograms per liter ranged from 68 to 171 percent, with standard deviations in concentrations of 27 nanograms per liter or less. The method detection limit for all compounds was 5.9 nanograms per liter or less for 247-milliliter samples. This method is valuable for acquiring information about the fate and transport of these mosquito insecticides and one synergist in water.

  12. Generalized Multicoincidence Analysis Methods

    SciTech Connect

    Warren, Glen A.; Smith, Leon E.; Aalseth, Craig E.; Ellis, J. E.; Valsan, Andrei B.; Mengesha, Wondwosen

    2005-10-01

    The ability to conduct automated trace radionuclide analysis at or near the sample collection point would provide a valuable tool for emergency response, nuclear forensics and environmental monitoring. Pacific Northwest National Laboratory is developing systems for this purpose based on dual gamma-ray spectrometers, e.g. NaI(Tl) or HPGe, combined with thin organic scintillator sensors to detect light charged particles. Translating the coincident signatures recorded by these systems, which include beta-gamma, gamma-gamma, and beta-gamma-gamma, into the concentration of detectable radionuclides in the sample requires generalized multicoincidence analysis tools. The development and validation of the Coincidence Lookup Library, which currently contains the probabilities of single and coincidence signatures from more than 420 isotopes, is described. Also discussed is a method to calculate the probability of observing a coincidence signature which incorporates true coincidence summing effects. These effects are particularly important for high-geometric-efficiency detection systems. Finally, a process for validating the integrated analysis software package is demonstrated using GEANT 4 simulations of the prototype detector systems.

  13. Generalized Multicoincidence Analysis Methods

    SciTech Connect

    Warren, Glen A.; Smith, Leon E.; Aalseth, Craig E.; Ellis, J. E.; Valsan, Andrei B.; Mengesha, Wondwosen

    2006-02-01

    The ability to conduct automated trace radionuclide analysis at or near the sample collection point would provide a valuable tool for emergency response, environmental monitoring, and verification of treaties and agreements. Pacific Northwest National Laboratory is developing systems for this purpose based on dual gamma-ray spectrometers, e.g. NaI(Tl) or HPGe, combined with thin organic scintillator sensors to detect light charged particles. Translating the coincident signatures recorded by these systems, which include beta-gamma, gamma-gamma and beta-gamma-gamma, into the concentration of detectable radionuclides in the sample requires generalized multicoincidence analysis tools. The development and validation of the Coincidence Lookup Library, which currently contains the probabilities of single and coincidence signatures from more than 420 isotopes, is described. Also discussed is a method to calculate the probability of observing a coincidence signature which incorporates true coincidence summing effects. These effects are particularly important for high-geometric-efficiency detection systems. Finally, a process for verifying the integrated analysis software package is demonstrated using GEANT 4 simulations of the prototype detector systems.

  14. Exploratory and Confirmatory Analysis of the Trauma Practices Questionnaire

    ERIC Educational Resources Information Center

    Craig, Carlton D.; Sprang, Ginny

    2009-01-01

    Objective: The present study provides psychometric data for the Trauma Practices Questionnaire (TPQ). Method: A nationally randomized sample of 2,400 surveys was sent to self-identified trauma treatment specialists, and 711 (29.6%) were returned. Results: An exploratory factor analysis (N = 319) conducted on a randomly split sample (RSS) revealed…

  15. [The analysis of the medication error, in practice].

    PubMed

    Didelot, Nicolas; Cistio, Céline

    2016-01-01

    By performing a systemic analysis of medication errors which occur in practice, the multidisciplinary teams can avoid a reoccurrence with the aid of an improvement action plan. The methods must take into account all the factors which might have contributed to or favoured the occurrence of a medication incident or accident. PMID:27177485

  16. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee, therefore, was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  17. Council on Certification Professional Practice Analysis.

    PubMed

    Zaglaniczny, K L

    1993-06-01

    The CCNA has completed a PPA and will begin implementing its recommendations with the December 1993 certification examination. The results of the PPA provide content validation for the CCNA certification examination. The certification examination is reflective of the knowledge and skill required for entry-level practice. Assessment of this knowledge is accomplished through the use of questions that are based on the areas represented in the content outline. Analysis of the PPA has resulted in changes in the examination content outline and percentages of questions in each area to reflect current entry-level nurse anesthesia practice. The new outline is based on the major domains of knowledge required for nurse anesthesia practice. These changes are justified by the consistency in the responses of the practitioners surveyed. There was overall agreement as to the knowledge and skills related to patient conditions, procedures, agents, techniques, and equipment that an entry-level CRNA must have to practice. Members of the CCNA and Examination Committee will use the revised outline to develop questions for the certification examination. The questions will be focused on the areas identified as requiring high levels of expertise and those that appeared higher in frequency. The PPA survey will be used as a basis for subsequent content validation studies. It will be revised to reflect new knowledge, technology, and techniques related to nurse anesthesia practice. The CCNA has demonstrated its commitment to the certification process through completion of the PPA and implementation of changes in the structure of the examination. PMID:8291387

  18. A practical method for sensor absolute calibration.

    PubMed

    Meisenholder, G W

    1966-04-01

    This paper describes a method of performing sensor calibrations using an NBS standard of spectral irradiance. The method shown, among others, was used for calibration of the Mariner IV Canopus sensor. Agreement of inflight response to preflight calibrations performed by this technique has been found to be well within 10%. PMID:20048890

  19. Scenistic Methods for Training: Applications and Practice

    ERIC Educational Resources Information Center

    Lyons, Paul R.

    2011-01-01

    Purpose: This paper aims to complement an earlier article (2010) in "Journal of European Industrial Training" in which the description and theory bases of scenistic methods were presented. This paper also offers a description of scenistic methods and information on theory bases. However, the main thrust of this paper is to describe, give suggested…

  20. Using Developmental Evaluation Methods with Communities of Practice

    ERIC Educational Resources Information Center

    van Winkelen, Christine

    2016-01-01

    Purpose: This paper aims to explore the use of developmental evaluation methods with community of practice programmes experiencing change or transition to better understand how to target support resources. Design/methodology/approach: The practical use of a number of developmental evaluation methods was explored in three organizations over a…

  1. Teachers' Beliefs and Technology Practices: A Mixed-Methods Approach

    ERIC Educational Resources Information Center

    Palak, Deniz; Walls, Richard T.

    2009-01-01

    In a sequential mixed methods design, we sought to examine the relationship between teachers' beliefs and their instructional technology practices among technology-using teachers who worked at technology-rich schools to ultimately describe if change in practice toward a student-centered paradigm occurred. The integrated mixed-methods results…

  2. Airphoto analysis of erosion control practices

    NASA Technical Reports Server (NTRS)

    Morgan, K. M.; Morris-Jones, D. R.; Lee, G. B.; Kiefer, R. W.

    1980-01-01

    The Universal Soil Loss Equation (USLE) is a widely accepted tool for erosion prediction and conservation planning. In this study, airphoto analysis of color and color infrared 70 mm photography at a scale of 1:60,000 was used to determine the erosion control practice factor in the USLE. Information about contour tillage, contour strip cropping, and grass waterways was obtained from aerial photography for Pheasant Branch Creek watershed in Dane County, Wisconsin.
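
    For reference, the USLE is a simple multiplicative model, A = R * K * LS * C * P, and the airphoto interpretation above supplies the erosion control practice factor P. The sketch below only illustrates where P enters; all factor values are made up and are not from the Pheasant Branch Creek study.

        # Universal Soil Loss Equation: A = R * K * LS * C * P
        def usle_soil_loss(R, K, LS, C, P):
            """Predicted average annual soil loss (e.g., tons per acre per year)."""
            return R * K * LS * C * P

        # Hypothetical field: a support practice such as contour strip cropping lowers P
        # relative to up-and-down-slope tillage (P = 1.0).
        baseline = usle_soil_loss(R=125.0, K=0.30, LS=1.2, C=0.25, P=1.0)
        with_practice = usle_soil_loss(R=125.0, K=0.30, LS=1.2, C=0.25, P=0.5)
        print(baseline, with_practice)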

  3. Practice-Focused Ethnographies of Higher Education: Method/ological Corollaries of a Social Practice Perspective

    ERIC Educational Resources Information Center

    Trowler, Paul Richard

    2014-01-01

    Social practice theory addresses both theoretical and method/ological agendas. To date priority has been given to the former, with writing on the latter tending often to be an afterthought to theoretical expositions or fieldwork accounts. This article gives sustained attention to the method/ological corollaries of a social practice perspective. It…

  4. A Practical Guide to Immunoassay Method Validation

    PubMed Central

    Andreasson, Ulf; Perret-Liaudet, Armand; van Waalwijk van Doorn, Linda J. C.; Blennow, Kaj; Chiasserini, Davide; Engelborghs, Sebastiaan; Fladby, Tormod; Genc, Sermin; Kruse, Niels; Kuiperij, H. Bea; Kulic, Luka; Lewczuk, Piotr; Mollenhauer, Brit; Mroczko, Barbara; Parnetti, Lucilla; Vanmechelen, Eugeen; Verbeek, Marcel M.; Winblad, Bengt; Zetterberg, Henrik; Koel-Simmelink, Marleen; Teunissen, Charlotte E.

    2015-01-01

    Biochemical markers have a central position in the diagnosis and management of patients in clinical medicine, as well as in clinical research and drug development, including for brain disorders such as Alzheimer’s disease. The enzyme-linked immunosorbent assay (ELISA) is frequently used for measurement of low-abundance biomarkers. However, the quality of ELISA methods varies, which may introduce both systematic and random errors. This urges the need for more rigorous control of assay performance, regardless of its use in a research setting, in clinical routine, or drug development. The aim of a method validation is to present objective evidence that a method fulfills the requirements for its intended use. Although much has been published on which parameters to investigate in a method validation, less is available on a detailed level on how to perform the corresponding experiments. To remedy this, standard operating procedures (SOPs) with step-by-step instructions for a number of different validation parameters are included in the present work, together with a validation report template, which allows for a well-ordered presentation of the results. Even though the SOPs were developed with the intended use for immunochemical methods and to be used for multicenter evaluations, most of them are generic and can be used for other technologies as well. PMID:26347708

  5. ALFRED: A Practical Method for Alignment-Free Distance Computation.

    PubMed

    Thankachan, Sharma V; Chockalingam, Sriram P; Liu, Yongchao; Apostolico, Alberto; Aluru, Srinivas

    2016-06-01

    Alignment-free approaches are gaining persistent interest in many sequence analysis applications such as phylogenetic inference and metagenomic classification/clustering, especially for large-scale sequence datasets. Besides the widely used k-mer methods, the average common substring (ACS) approach has emerged to be one of the well-known alignment-free approaches. Two recent works further generalize this ACS approach by allowing a bounded number k of mismatches in the common substrings, relying on approximation (linear time) and exact computation, respectively. Although it has a good worst-case time complexity [Formula: see text], the exact approach is complex and unlikely to be efficient in practice. Herein, we present ALFRED, an alignment-free distance computation method, which solves the generalized common substring search problem via exact computation. Compared to the theoretical approach, our algorithm is easier to implement and more practical to use, while still providing highly competitive theoretical performance with an expected run-time of [Formula: see text]. By applying our program to phylogenetic inference as a case study, we find that it exactly reconstructs the topology of the reference phylogenetic tree for a set of 27 primate mitochondrial genomes at reasonably acceptable speed. ALFRED is implemented in the C++ programming language and the source code is freely available online. PMID:27138275
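
    ALFRED itself is an efficient C++ implementation; purely to illustrate the statistic it generalizes, the sketch below computes the basic average common substring length with zero mismatches in a naive way (quadratic memory over the second sequence). The toy sequences are hypothetical.

        def avg_common_substring(x, y):
            """Average, over positions i of x, of the length of the longest substring
            starting at i that also occurs somewhere in y (the basic ACS statistic)."""
            substrings = {y[i:j] for i in range(len(y)) for j in range(i + 1, len(y) + 1)}
            total = 0
            for i in range(len(x)):
                length = 0
                while i + length < len(x) and x[i:i + length + 1] in substrings:
                    length += 1
                total += length
            return total / len(x)

        print(avg_common_substring("ACGTACGT", "ACGGTACA"))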

  6. Pragmatism in practice: mixed methods research for physiotherapy.

    PubMed

    Shaw, James A; Connelly, Denise M; Zecevic, Aleksandra A

    2010-11-01

    The purpose of this paper is to provide an argument for the place of mixed methods research across practice settings as an effective means of supporting evidence-based practice in physiotherapy. Physiotherapy practitioners use both qualitative and quantitative methods throughout the process of patient care-from history taking, assessment, and intervention to evaluation of outcomes. Research on practice paradigms demonstrates the importance of mixing qualitative and quantitative methods to achieve 'expert practice' that is concerned with optimizing outcomes and incorporating patient beliefs and values. Research paradigms that relate to this model of practice would integrate qualitative and quantitative types of knowledge and inquiry, while maintaining a prioritized focus on patient outcomes. Pragmatism is an emerging research paradigm where practical consequences and the effects of concepts and behaviors are vital components of meaning and truth. This research paradigm supports the simultaneous use of qualitative and quantitative methods of inquiry to generate evidence to support best practice. This paper demonstrates that mixed methods research with a pragmatist view provides evidence that embraces and addresses the multiple practice concerns of practitioners better than either qualitative or quantitative research approaches in isolation. PMID:20649500

  7. Practical aspects of spatially high accurate methods

    NASA Technical Reports Server (NTRS)

    Godfrey, Andrew G.; Mitchell, Curtis R.; Walters, Robert W.

    1992-01-01

    The computational qualities of high order spatially accurate methods for the finite volume solution of the Euler equations are presented. Two dimensional essentially non-oscillatory (ENO), k-exact, and 'dimension by dimension' ENO reconstruction operators are discussed and compared in terms of reconstruction and solution accuracy, computational cost and oscillatory behavior in supersonic flows with shocks. Inherent steady state convergence difficulties are demonstrated for adaptive stencil algorithms. An exact solution to the heat equation is used to determine reconstruction error, and the computational intensity is reflected in operation counts. Standard MUSCL differencing is included for comparison. Numerical experiments presented include the Ringleb flow for numerical accuracy and a shock reflection problem. A vortex-shock interaction demonstrates the ability of the ENO scheme to excel in simulating unsteady high-frequency flow physics.

  8. Practice and Evaluation of Blended Learning with Cross-Cultural Distance Learning in a Foreign Language Class: Using Mix Methods Data Analysis

    ERIC Educational Resources Information Center

    Sugie, Satoko; Mitsugi, Makoto

    2014-01-01

    The Information and Communication Technology (ICT) utilization in Chinese as a "second" foreign language has mainly been focused on Learning Management System (LMS), digital material development, and quantitative analysis of learners' grammatical knowledge. There has been little research that has analyzed the effectiveness of…

  9. Presence in nursing practice: a concept analysis.

    PubMed

    Hessel, Joy A

    2009-01-01

    Presence is an elusive concept in nursing practice that has been recognized as advantageous to the patient experience. Dictionary sources define presence as being with and attending to another; involvement, companionship. Nursing scholars and theorists have elaborated on the dictionary definition of presence to include a holistic definition inclusive of the patient experience and the connection experienced between both patient and provider. However, despite attempts to define presence as it relates to nursing practice, a definition that completely encompasses the substantial benefits to the patient experience is yet to be developed. As guided by Walker and Avant, this concept analysis was performed by selection of a concept, determination of the purpose of the analysis, evaluation of existing definitions, identification of defining attributes of the concept, formulation of patient cases that epitomize and contrast the concept, and identification of antecedents and empirical referents of the concept. Thus, in this concept analysis article, existing definitions of presence will be recognized and evaluated, cases demonstrating nursing presence explored, and a definition of presence in nursing developed. PMID:19713785

  10. Practice-Near and Practice-Distant Methods in Human Services Research

    ERIC Educational Resources Information Center

    Froggett, Lynn; Briggs, Stephen

    2012-01-01

    This article discusses practice-near research in human services, a cluster of methodologies that may include thick description, intensive reflexivity, and the study of emotional and relational processes. Such methods aim to get as near as possible to experiences at the relational interface between institutions and the practice field.…

  11. Reflections on Experiential Teaching Methods: Linking the Classroom to Practice

    ERIC Educational Resources Information Center

    Wehbi, Samantha

    2011-01-01

    This article explores the use of experiential teaching methods in social work education. The literature demonstrates that relying on experiential teaching methods in the classroom can have overwhelmingly positive learning outcomes; however, not much is known about the possible effect of these classroom methods on practice. On the basis of…

  12. Communication Network Analysis Methods.

    ERIC Educational Resources Information Center

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  13. Complementing Gender Analysis Methods.

    PubMed

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start with a premise that men and women are equal and should be treated equally. These frameworks emphasize equal distribution of resources between men and women and assume that this will bring equality, which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives, such as the power to control resources within social relationships, and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks hold that patriarchy as an institution plays an important role in women's oppression and exploitation, and that it is a barrier to their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on a proposed equality principle that puts men and women in competing roles; thus, real equality will never be achieved. Contrary to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach to gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but also suggests incorporating the concept and role of social capital, equity, and doing gender into gender analysis. This approach is based on a perceived equity principle that puts men and women in complementing roles, which may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued on the basis of existing literature and an anecdote of observations made by the author. While criticizing the equality theory, the author offers equity theory for resolving the gender conflict by using the concept of social and psychological capital. PMID:25941756

  14. Development of a method to analyze orthopaedic practice expenses.

    PubMed

    Brinker, M R; Pierce, P; Siegel, G

    2000-03-01

    The purpose of the current investigation was to present a standard method by which an orthopaedic practice can analyze its practice expenses. To accomplish this, a five-step process was developed to analyze practice expenses using a modified version of activity-based costing. In this method, general ledger expenses were assigned to 17 activities that encompass all the tasks and processes typically performed in an orthopaedic practice. These 17 activities were identified in a practice expense study conducted for the American Academy of Orthopaedic Surgeons. To calculate the cost of each activity, financial data were used from a group of 19 orthopaedic surgeons in Houston, Texas. The activities that consumed the largest portion of the employee work force (person hours) were service patients in office (25.0% of all person hours), maintain medical records (13.6% of all person hours), and resolve collection disputes and rebill charges (12.3% of all person hours). The activities that comprised the largest portion of the total expenses were maintain facility (21.4%), service patients in office (16.0%), and sustain business by managing and coordinating practice (13.8%). The five-step process of analyzing practice expenses was relatively easy to perform and it may be used reliably by most orthopaedic practices. PMID:10738440
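
    A minimal sketch of the allocation idea behind activity-based costing (not the authors' worksheet): spread a shared general-ledger expense across activities in proportion to the person-hours each consumes. The dollar figure and hours are hypothetical; the three named activities are taken from the abstract above.

        # Allocate a shared expense to activities in proportion to person-hours.
        person_hours = {
            "service patients in office": 250,
            "maintain medical records": 136,
            "resolve collection disputes and rebill charges": 123,
            "all other activities": 491,
        }
        shared_expense = 100_000.0  # hypothetical annual support-staff cost

        total_hours = sum(person_hours.values())
        allocation = {a: shared_expense * h / total_hours for a, h in person_hours.items()}
        for activity, cost in allocation.items():
            print(f"{activity}: ${cost:,.0f}")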

  15. Methods of analysis and quality-assurance practices of the U.S. Geological Survey organic laboratory, Sacramento, California; determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Domagalski, Joseph L.; Kuivila, Kathryn M.

    1994-01-01

    An analytical method and quality-assurance practices were developed for a study of the fate and transport of pesticides in the Sacramento-San Joaquin Delta and the Sacramento and San Joaquin Rivers. Water samples were filtered to remove suspended particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide, and the pesticides were eluted with three 2-milliliter aliquots of hexane:diethyl ether (1:1). The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for analytes determined in 1,500-milliliter samples ranged from 0.006 to 0.047 microgram per liter. Recoveries ranged from 47 to 89 percent for 12 pesticides in organic-free, Sacramento River and San Joaquin River water samples fortified at 0.05 and 0.26 microgram per liter. The method was modified to improve the pesticide recovery by reducing the sample volume to 1,000 milliliters. Internal standards were added to improve quantitative precision and accuracy. The analysis also was expanded to include a total of 21 pesticides. The method detection limits for 1,000-milliliter samples ranged from 0.022 to 0.129 microgram per liter. Recoveries ranged from 38 to 128 percent for 21 pesticides in organic-free, Sacramento River and San Joaquin River water samples fortified at 0.10 and 0.75 microgram per liter.

  16. Optimizing Distributed Practice: Theoretical Analysis and Practical Implications

    ERIC Educational Resources Information Center

    Cepeda, Nicholas J.; Coburn, Noriko; Rohrer, Doug; Wixted, John T.; Mozer, Michael C.; Pashler, Harold

    2009-01-01

    More than a century of research shows that increasing the gap between study episodes using the same material can enhance retention, yet little is known about how this so-called distributed practice effect unfolds over nontrivial periods. In two three-session laboratory studies, we examined the effects of gap on retention of foreign vocabulary,…

  17. Practical Issues in Component Aging Analysis

    SciTech Connect

    Dana L. Kelly; Andrei Rodionov; Jens Uwe-Klugel

    2008-09-01

    This paper examines practical issues in the statistical analysis of component aging data. These issues center on the stochastic process chosen to model component failures. The two stochastic processes examined are repair same as new, leading to a renewal process, and repair same as old, leading to a nonhomogeneous Poisson process. Under the first assumption, times between failures can be treated as statistically independent observations from a stationary process. The common distribution of the times between failures is called the renewal distribution. Under the second assumption, the times between failures will not be independently and identically distributed, and one cannot simply fit a renewal distribution to the cumulative failure times or the times between failures. The paper illustrates how the assumption made regarding the repair process is crucial to the analysis. Besides the choice of stochastic process, other issues that are discussed include qualitative graphical analysis and simple nonparametric hypothesis tests to help judge which process appears more appropriate. Numerical examples are presented to illustrate the issues discussed in the paper.
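
    One simple nonparametric check in the spirit of the tests mentioned above is the Laplace trend test: values near zero are consistent with a trend-free, renewal-like process, while large positive values point toward deterioration and a nonhomogeneous Poisson process. The paper does not necessarily use this particular test, and the failure times below are hypothetical.

        import math

        def laplace_trend_statistic(failure_times, T):
            """failure_times: event times on (0, T]. Approximately N(0, 1) under a
            homogeneous Poisson process; large positive values suggest aging."""
            n = len(failure_times)
            return (sum(failure_times) / n - T / 2.0) / (T * math.sqrt(1.0 / (12.0 * n)))

        times = [300, 850, 1400, 1800, 2100, 2300, 2450]  # hypothetical failure times (hours)
        print(round(laplace_trend_statistic(times, T=2500.0), 2))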

  18. Learning to Teach within Practice-Based Methods Courses

    ERIC Educational Resources Information Center

    Kazemi, Elham; Waege, Kjersti

    2015-01-01

    Supporting prospective teachers to enact high quality instruction requires transforming their methods preparation. This study follows three teachers through a practice-based elementary methods course. Weekly class sessions took place in an elementary school. The setting afforded opportunities for prospective teachers to engage in cycles of…

  19. Multi-criteria decision analysis: Limitations, pitfalls, and practical difficulties

    SciTech Connect

    Kujawski, Edouard

    2003-02-01

    The 2002 Winter Olympics women's figure skating competition is used as a case study to illustrate some of the limitations, pitfalls, and practical difficulties of Multi-Criteria Decision Analysis (MCDA). The paper compares several widely used models for synthesizing the multiple attributes into a single aggregate value. The various MCDA models can provide conflicting rankings of the alternatives for a common set of information even under states of certainty. Analysts involved in MCDA need to deal with the following challenging tasks: (1) selecting an appropriate analysis method, and (2) properly interpreting the results. An additional trap is the availability of software tools that implement specific MCDA models that can beguile the user with quantitative scores. These conclusions are independent of the decision domain and they should help foster better MCDA practices in many fields including systems engineering trade studies.
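
    The paper's central caution, that different aggregation models can rank the same alternatives differently, is easy to reproduce with two alternatives and two equally weighted criteria. The scores below are hypothetical, and the additive and multiplicative rules shown are generic MCDA models rather than the specific ones compared in the paper.

        weights = [0.5, 0.5]
        scores = {"A": [0.90, 0.10], "B": [0.45, 0.45]}

        def weighted_sum(s, w):
            return sum(si * wi for si, wi in zip(s, w))

        def weighted_product(s, w):
            result = 1.0
            for si, wi in zip(s, w):
                result *= si ** wi
            return result

        for name, s in scores.items():
            print(name, round(weighted_sum(s, weights), 3), round(weighted_product(s, weights), 3))
        # The additive rule ranks A above B (0.50 vs 0.45); the multiplicative rule
        # reverses the ranking (0.30 vs 0.45) on exactly the same data.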

  20. Breath analysis: translation into clinical practice.

    PubMed

    Brodrick, Emma; Davies, Antony; Neill, Paul; Hanna, Louise; Williams, E Mark

    2015-06-01

    Breath analysis in respiratory disease is a non-invasive technique which has the potential to complement or replace current screening and diagnostic techniques without inconvenience or harm to the patient. Recent advances in ion mobility spectrometry (IMS) have allowed exhaled breath to be analysed rapidly, reliably and robustly thereby facilitating larger studies of exhaled breath profiles in clinical environments. Preliminary studies have demonstrated that volatile organic compound (VOC) breath profiles of people with respiratory disease can be distinguished from healthy control groups but there is a need to validate, standardise and ensure comparability between laboratories before real-time breath analysis becomes a clinical reality. It is also important that breath sampling procedures and methodologies are developed in conjunction with clinicians and the practicalities of working within the clinical setting are considered to allow the full diagnostic potential of these techniques to be realised. A protocol is presented, which has been developed over three years and successfully deployed for quickly and accurately collecting breath samples from 323 respiratory patients recruited from 10 different secondary health care clinics. PMID:25971863

  1. Farmers' Preferences for Methods of Receiving Information on New or Innovative Farming Practices.

    ERIC Educational Resources Information Center

    Riesenberg, Lou E.; Gor, Christopher Obel

    1989-01-01

    Survey of 386 Idaho farmers (response rate 58 percent) identified preferred methods of receiving information on new or innovative farming practices. Analysis revealed preference for interpersonal methods (demonstrations, tours, and field trips) over mass media such as computer-assisted instruction (CAI) and home study, although younger farmers,…

  2. Traditional Methods for Mineral Analysis

    NASA Astrophysics Data System (ADS)

    Ward, Robert E.; Carpenter, Charles E.

    This chapter describes traditional methods for analysis of minerals involving titrimetric and colorimetric procedures, and the use of ion selective electrodes. Other traditional methods of mineral analysis include gravimetric titration (i.e., insoluble forms of minerals are precipitated, rinsed, dried, and weighed) and redox reactions (i.e., the mineral is part of an oxidation-reduction reaction, and the product is quantitated). However, these latter two methods will not be covered because they currently are used little in the food industry. The traditional methods that will be described have maintained widespread usage in the food industry despite the development of more modern instrumentation such as atomic absorption spectroscopy and inductively coupled plasma-atomic emission spectroscopy (Chap. 24). Traditional methods generally require chemicals and equipment that are routinely available in an analytical laboratory and are within the experience of most laboratory technicians. Additionally, traditional methods often form the basis for rapid analysis kits (e.g., Quantab® for salt determination) that are increasingly in demand. Procedures for analysis of minerals of major nutritional or food processing concern are used for illustrative purposes. For additional examples of traditional methods refer to references (1-6). Slight modifications of these traditional methods are often needed for specific foodstuffs to minimize interferences or to be within the range of analytical performance. For analytical requirements for specific foods see the Official Methods of Analysis of AOAC International (5) and related official methods (6).
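
    As a small worked example of the kind of titrimetric calculation such a chapter covers (not an excerpt from it), the sketch below converts a Mohr-titration result into percent NaCl; the titrant molarity, volume, and sample mass are hypothetical.

        # Mohr titration: chloride is titrated with standard AgNO3 (1:1 Ag+ : Cl-).
        MOLAR_MASS_NACL = 58.44  # g/mol

        def percent_nacl(sample_mass_g, agno3_molarity, titrant_volume_ml):
            moles_chloride = agno3_molarity * titrant_volume_ml / 1000.0
            return 100.0 * moles_chloride * MOLAR_MASS_NACL / sample_mass_g

        print(round(percent_nacl(sample_mass_g=5.0, agno3_molarity=0.1, titrant_volume_ml=17.2), 2))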

  3. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
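
    In contrast to the fast probability integration and importance sampling methods developed in the paper, the sketch below estimates a probability of instability by brute-force Monte Carlo for a small second-order system: sample an uncertain cross-coupling stiffness, assemble the first-order state matrix, and flag any eigenvalue with a positive real part. The system matrices and the parameter distribution are hypothetical.

        import numpy as np

        rng = np.random.default_rng(1)
        M = np.eye(2)                  # mass matrix
        C = np.diag([0.5, 0.5])        # damping matrix

        def is_unstable(k_cross):
            # Skew-symmetric cross-coupling in the stiffness matrix can destabilize the system.
            K = np.array([[10.0, k_cross], [-k_cross, 10.0]])
            A = np.block([[np.zeros((2, 2)), np.eye(2)],
                          [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
            return np.max(np.linalg.eigvals(A).real) > 0.0

        samples = rng.normal(loc=1.5, scale=0.5, size=5000)  # uncertain coupling stiffness
        print("estimated P(instability):", np.mean([is_unstable(k) for k in samples]))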

  4. Toward a practical approach for ergodicity analysis

    NASA Astrophysics Data System (ADS)

    Wang, H.; Wang, C.; Zhao, Y.; Lin, X.; Yu, C.

    2015-09-01

    It is important to perform hydrological forecasts using finite hydrological time series. Most time series analysis approaches presume a data series to be ergodic without justifying this assumption. This paper presents a practical approach to analyze the mean ergodic property of hydrological processes by means of autocorrelation function evaluation and the Augmented Dickey-Fuller test, a radial basis function neural network, and the definition of mean ergodicity. The mean ergodicity of precipitation processes at the Lanzhou Rain Gauge Station in the Yellow River basin, the Ankang Rain Gauge Station in the Han River basin, both in China, and at Newberry, MI, USA is analyzed using the proposed approach. The results indicate that the precipitation of March, July, and August in Lanzhou, and of May, June, and August in Ankang, has mean ergodicity, whereas the precipitation of the other calendar months at these two rain gauge stations does not. The precipitation of February, May, July, and December in Newberry shows mean ergodicity, although the precipitation of each month shows a clear increasing or decreasing trend.
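
    A partial sketch of two of the ingredients listed above, autocorrelation evaluation and the Augmented Dickey-Fuller test, is shown below; it assumes the statsmodels package is available, uses a synthetic series rather than the gauge records, and omits the radial basis function network entirely.

        import numpy as np
        from statsmodels.tsa.stattools import acf, adfuller

        rng = np.random.default_rng(42)
        july_precip = 50 + 10 * rng.standard_normal(60)  # hypothetical 60-year July totals (mm)

        autocorr = acf(july_precip, nlags=10)
        adf_stat, p_value, *_ = adfuller(july_precip)

        # Heuristic reading: rapidly decaying autocorrelation together with rejection of a
        # unit root (small p-value) is consistent with mean ergodicity; a trend or slowly
        # decaying autocorrelation argues against it.
        print("lag-1 autocorrelation:", round(autocorr[1], 2), "ADF p-value:", round(p_value, 3))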

  5. Practical Teaching Methods K-6: Sparking the Flame of Learning.

    ERIC Educational Resources Information Center

    Wilkinson, Pamela Fannin.; McNutt, Margaret A.; Friedman, Esther S.

    This book provides state-of-the-art teaching practices and methods, discussing the elements of good teaching in the content areas and including examples from real classrooms and library media centers. Chapters offer reflection exercises, assessment tips specific to each curriculum, and resource lists. Nine chapters examine: (1) "The Premise"…

  6. Retrieval practice can eliminate list method directed forgetting.

    PubMed

    Abel, Magdalena; Bäuml, Karl-Heinz T

    2016-01-01

    It has recently been shown that retrieval practice can reduce memories' susceptibility to interference, such as retroactive and proactive interference. In this study, we therefore examined whether retrieval practice can also reduce list method directed forgetting, a form of intentional forgetting that presupposes interference. In each of two experiments, subjects successively studied two lists of items. After studying each single list, subjects restudied the list items to enhance learning, or they were asked to recall the items. Following restudy or retrieval practice of list 1 items, subjects were cued to either forget the list or remember it for an upcoming final test. Experiment 1 employed a free-recall and Experiment 2 a cued-recall procedure on the final memory test. In both experiments, directed forgetting was present in the restudy condition but was absent in the retrieval-practice condition, indicating that retrieval practice can reduce or even eliminate this form of forgetting. The results are consistent with the view that retrieval practice enhances list segregation processes. Such processes may reduce interference between lists and thus reduce directed forgetting. PMID:26286882

  7. Practicing the practice: Learning to guide elementary science discussions in a practice-oriented science methods course

    NASA Astrophysics Data System (ADS)

    Shah, Ashima Mathur

    University methods courses are often criticized for telling pre-service teachers, or interns, about the theories behind teaching instead of preparing them to actually enact teaching. Shifting teacher education to be more "practice-oriented," or to focus more explicitly on the work of teaching, is a current trend for re-designing the way we prepare teachers. This dissertation addresses the current need for research that unpacks the shift to more practice-oriented approaches by studying the content and pedagogical approaches in a practice-oriented, masters-level elementary science methods course (n=42 interns). The course focused on preparing interns to guide science classroom discussions. Qualitative data, such as video records of course activities and interns' written reflections, were collected across eight course sessions. Codes were applied at the sentence and paragraph level and then grouped into themes. Five content themes were identified: foregrounding student ideas and questions, steering discussion toward intended learning goals, supporting students to do the cognitive work, enacting teacher role of facilitator, and creating a classroom culture for science discussions. Three pedagogical approach themes were identified. First, the teacher educators created images of science discussions by modeling and showing videos of this practice. They also provided focused teaching experiences by helping interns practice the interactive aspects of teaching both in the methods classroom and with smaller groups of elementary students in schools. Finally, they structured the planning and debriefing phases of teaching so interns could learn from their teaching experiences and prepare well for future experiences. The findings were analyzed through the lens of Grossman and colleagues' framework for teaching practice (2009) to reveal how the pedagogical approaches decomposed, represented, and approximated practice throughout course activities. Also, the teacher educators

  8. Genre Analysis, ESP and Professional Practice

    ERIC Educational Resources Information Center

    Bhatia, Vijay K.

    2008-01-01

    Studies of professional genres and professional practices are invariably seen as complementing each other, in that they not only influence each other but are often co-constructed in specific professional contexts. However, professional genres have often been analyzed in isolation, leaving the study of professional practice almost completely out,…

  9. Compassion fatigue within nursing practice: a concept analysis.

    PubMed

    Coetzee, Siedine Knobloch; Klopper, Hester C

    2010-06-01

    "Compassion fatigue" was first introduced in relation to the study of burnout among nurses, but it was never defined within this context; it has since been adopted as a synonym for secondary traumatic stress disorder, which is far removed from the original meaning of the term. The aim of the study was to define compassion fatigue within nursing practice. The method that was used in this article was concept analysis. The findings revealed several categories of compassion fatigue: risk factors, causes, process, and manifestations. The characteristics of each of these categories are specified and a connotative (theoretical) definition, model case, additional cases, empirical indicators, and a denotative (operational) definition are provided. Compassion fatigue progresses from a state of compassion discomfort to compassion stress and, finally, to compassion fatigue, which if not effaced in its early stages of compassion discomfort or compassion stress, can permanently alter the compassionate ability of the nurse. Recommendations for nursing practice, education, and research are discussed. PMID:20602697

  10. Model-Based Practice Analysis and Test Specifications.

    ERIC Educational Resources Information Center

    Kane, Michael

    1997-01-01

    Licensure and certification decisions are usually based on a chain of inference from results of a practice analysis to test specifications, the test, examinee performance, and a pass-fail decision. This article focuses on the design of practice analyses and translation of practice analyses results into test specifications. (SLD)

  11. Practical postcalibration uncertainty analysis: Yucca Mountain, Nevada.

    PubMed

    James, Scott C; Doherty, John E; Eddebbarh, Al-Aziz

    2009-01-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is "calibrated." Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the United States' proposed site for disposal of high-level radioactive waste. Linear and nonlinear uncertainty analyses are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction's uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This article applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. PMID:19744249
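
    The generic idea behind the linear analysis described above can be sketched in a few lines: combine observation sensitivities and a prior parameter covariance into a posterior parameter covariance, propagate it through the prediction's sensitivity vector, and repeat the calculation without an observation to gauge that observation's worth. This is not the PEST/PREDUNC machinery used for the Yucca Mountain model; every matrix below is a small hypothetical stand-in.

        import numpy as np

        X = np.array([[1.0, 0.2], [0.5, 1.0], [0.3, 0.4]])  # observation sensitivities (3 obs x 2 params)
        Cobs_inv = np.diag([1.0, 1.0, 4.0])                  # inverse observation-error covariance
        Cprior_inv = np.linalg.inv(np.diag([0.5, 0.5]))      # inverse prior parameter covariance
        y = np.array([0.8, -0.3])                            # prediction sensitivity to the parameters

        Cpost = np.linalg.inv(X.T @ Cobs_inv @ X + Cprior_inv)
        print("prediction standard deviation:", np.sqrt(y @ Cpost @ y))

        # Crude data worth: drop the third observation and see how much the predictive
        # uncertainty grows.
        Cpost_wo = np.linalg.inv(X[:2].T @ Cobs_inv[:2, :2] @ X[:2] + Cprior_inv)
        print("without observation 3:", np.sqrt(y @ Cpost_wo @ y))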

  12. Methods for genetic linkage analysis using trisomies

    SciTech Connect

    Feingold, E.; Lamb, N.E.; Sherman, S.L.

    1995-02-01

    Certain genetic disorders are rare in the general population, but more common in individuals with specific trisomies. Examples of this include leukemia and duodenal atresia in trisomy 21. This paper presents a linkage analysis method for using trisomic individuals to map genes for such traits. It is based on a very general gene-specific dosage model that posits that the trait is caused by specific effects of different alleles at one or a few loci and that duplicate copies of "susceptibility" alleles inherited from the nondisjoining parent give increased likelihood of having the trait. Our mapping method is similar to identity-by-descent-based mapping methods using affected relative pairs and also to methods for mapping recessive traits using inbred individuals by looking for markers with greater than expected homozygosity by descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected homozygosity in the chromosomes inherited from the nondisjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the trait gene, a confidence interval for that distance, and methods for computing power and sample sizes. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers and how to test candidate genes. 20 refs., 5 figs., 1 tab.

  13. Practical methods to improve the development of computational software

    SciTech Connect

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-07-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)
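
    One concrete instance of the practices recommended above is a small automated regression test. The sketch below is a pytest-style example written for an illustrative decay routine; it is not code from the paper.

        import math

        def activity(a0, half_life, t):
            """Radioactive decay: A(t) = A0 * exp(-ln(2) * t / T_half)."""
            return a0 * math.exp(-math.log(2.0) * t / half_life)

        def test_activity_halves_after_one_half_life():
            assert math.isclose(activity(100.0, 5730.0, 5730.0), 50.0, rel_tol=1e-9)

        def test_activity_is_unchanged_at_time_zero():
            assert activity(100.0, 5730.0, 0.0) == 100.0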

  14. Practical Nursing. Ohio's Competency Analysis Profile.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Vocational Instructional Materials Lab.

    Developed through a modified DACUM (Developing a Curriculum) process involving business, industry, labor, and community agency representatives in Ohio, this document is a comprehensive and verified employer competency profile for practical nursing. The list contains units (with and without subunits), competencies, and competency builders that…

  15. [Embryo vitrification: French clinical practice analysis for BLEFCO].

    PubMed

    Hesters, L; Achour-Frydman, N; Mandelbaum, J; Levy, R

    2013-09-01

    Frozen-thawed embryo transfer is currently an important part of present-day assisted reproductive technology (ART), aiming at increasing the clinical pregnancy rate per oocyte retrieval. Although the slow-freezing method was the reference for two decades, recent years have witnessed the expansion of an ultrarapid cryopreservation method named vitrification. Recently in France, vitrification has been authorized for cryopreserving human embryos. The BLEFCO consortium therefore decided to perform a descriptive study, through questionnaires, to evaluate the state of vitrification in French clinical practice. Questionnaires were addressed to the 105 French centres of reproductive biology and 60 were fully completed. Data analysis revealed that the embryo survival rate, as well as the clinical pregnancy rate, was increased after vitrification technology when compared to the slow-freezing procedure. Overall, these preliminary data suggest that vitrification may improve ART outcomes by increasing the cumulative pregnancy rate per oocyte retrieval. PMID:23962680

  16. Method of multivariate spectral analysis

    DOEpatents

    Keenan, Michael R.; Kotula, Paul G.

    2004-01-06

    A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = CS^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
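    The alternating least-squares factorization at the heart of this method can be sketched in a few lines. The example below is a simplified illustration, not the patented procedure: it omits the weighting/unweighting steps and assumes non-negativity as the only constraint.

```python
# Minimal sketch of the constrained alternating least-squares factorization
# D ≈ C S^T, with non-negativity enforced by clipping (an assumed constraint;
# the patent describes a more general weighting and constraint scheme).
import numpy as np

def als_factor(D, n_components, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    n_pixels, n_channels = D.shape
    C = rng.random((n_pixels, n_components))
    S = rng.random((n_channels, n_components))
    for _ in range(n_iter):
        # Solve D ≈ C S^T for C with S fixed, then clip to keep C >= 0.
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
        # Solve for S with C fixed, then clip to keep S >= 0.
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
    return C, S

# Hypothetical usage on synthetic data built from two known components.
rng = np.random.default_rng(1)
true_C, true_S = rng.random((100, 2)), rng.random((64, 2))
D = true_C @ true_S.T + 0.01 * rng.random((100, 64))
C, S = als_factor(D, n_components=2)
print(np.linalg.norm(D - C @ S.T) / np.linalg.norm(D))  # small relative residual
```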

  17. Practical analysis of welding processes using finite element analysis.

    SciTech Connect

    Cowles, J. H.; Dave, V. R.; Hartman, D. A.

    2001-01-01

    With advances in commercially available finite element software and computational capability, engineers can now model large-scale problems in mechanics, heat transfer, fluid flow, and electromagnetics as never before. With these enhancements in capability, it is increasingly tempting to include the fundamental process physics to help achieve greater accuracy (Refs. 1-7). While this goal is laudable, it adds complication and drives up cost and computational requirements. Practical analysis of welding relies on simplified user inputs to derive important relative trends in desired outputs such as residual stress or distortion due to changes in inputs like voltage, current, and travel speed. Welding is a complex three-dimensional phenomenon. The question becomes how much modeling detail is needed to accurately predict relative trends in distortion, residual stress, or weld cracking? In this work, a HAZ (Heat Affected Zone) weld-cracking problem was analyzed to rank two different welding cycles (weld speed varied) in terms of crack susceptibility. Figure 1 shows an aerospace casting GTA welded to a wrought skirt. The essentials of part geometry, welding process, and tooling were suitably captured to model the strain excursion in the HAZ over a crack-susceptible temperature range, and the weld cycles were suitably ranked. The main contribution of this work is the demonstration of a practical methodology by which engineering solutions to engineering problems may be obtained through weld modeling when time and resources are extremely limited. Typically, welding analysis suffers with the following unknowns: material properties over the entire temperature range, the heat-input source term, and environmental effects. Material properties of interest are conductivity, specific heat, latent heat, modulus, Poisson's ratio, yield strength, ultimate strength, and possible rate dependencies. Boundary conditions are conduction into fixturing, radiation and convection to the

  18. Method of photon spectral analysis

    DOEpatents

    Gehrke, Robert J.; Putnam, Marie H.; Killian, E. Wayne; Helmer, Richard G.; Kynaston, Ronnie L.; Goodwin, Scott G.; Johnson, Larry O.

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and gamma-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV), as well as high-energy gamma rays (>1 MeV). An 8192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The gamma-ray portion of each spectrum is analyzed by a standard Ge gamma-ray analysis program. This method can be applied to any analysis involving x- and gamma-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the gamma-ray analysis and accommodated during the x-ray analysis.
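    The linear least-squares fitting step can be pictured as expressing the measured x-ray region as a linear combination of reference component shapes. The sketch below is illustrative only; the peak shape, channel range, and component names are invented, and the patented analysis uses calibrated response functions plus the gamma-ray results for interference correction.

```python
# Minimal sketch of fitting a spectral region as a linear combination of
# reference component shapes by least squares (illustrative assumptions only).
import numpy as np

def fit_components(spectrum, component_shapes):
    """spectrum: (n_channels,); component_shapes: (n_channels, n_components)."""
    amplitudes, *_ = np.linalg.lstsq(component_shapes, spectrum, rcond=None)
    return amplitudes

# Hypothetical usage with a simulated U L x-ray peak shape and flat background.
channels = np.arange(256)
u_l_xray = np.exp(-0.5 * ((channels - 90) / 4.0) ** 2)   # assumed peak shape
background = np.ones_like(channels, dtype=float)
A = np.column_stack([u_l_xray, background])
measured = 500 * u_l_xray + 20 * background
print(fit_components(measured, A))   # ~ [500, 20]
```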

  19. Method of photon spectral analysis

    DOEpatents

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and gamma-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV), as well as high-energy gamma rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The gamma-ray portion of each spectrum is analyzed by a standard Ge gamma-ray analysis program. This method can be applied to any analysis involving x- and gamma-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the gamma-ray analysis and accommodated during the x-ray analysis.

  20. Semen analysis: its place in modern reproductive medical practice.

    PubMed

    McLachlan, Robert I; Baker, H W Gordon; Clarke, Gary N; Harrison, Keith L; Matson, Phillip L; Holden, Carol A; de Kretser, David M

    2003-02-01

    Semen analysis is the most important laboratory investigation for men when assessing the infertile couple. Advances in in vitro fertilisation (IVF) techniques, particularly intracytoplasmic sperm injection (ICSI) involving the direct injection of a single spermatozoon into an egg, have not diminished the role of semen analysis in modern reproductive practice. Semen analysis is the most basic laboratory investigation undertaken and is descriptive in terms of semen volume, appearance, viscosity, sperm concentration, sperm motility and morphology. Since the results are used by clinicians to choose appropriate treatment options, a reliable service is imperative. It is crucial that the laboratory is experienced in the performance of semen analyses to ensure an accurate result. To ensure a quality semen analysis service, laboratories must participate in internal and external quality assurance activities, incorporate rigorous training protocols for technical staff and use reliable procedures. The World Health Organization laboratory manual for the examination of human semen and sperm-cervical mucus interaction clearly describes the variables that need to be assessed and the methods of analysis and quality assurance to be used. PMID:12701680

  1. A practical method to evaluate radiofrequency exposure of mast workers.

    PubMed

    Alanko, Tommi; Hietanen, Maila

    2008-01-01

    Assessment of occupational exposure to radiofrequency (RF) fields in telecommunication transmitter masts is a challenging task. For conventional field strength measurements using manually operated instruments, it is difficult to document the locations of measurements while climbing up a mast. Logging RF dosemeters worn by the workers, on the other hand, do not give any information about the location of the exposure. In this study, a practical method was developed and applied to assess mast workers' exposure to RF fields and the corresponding location. This method uses a logging dosemeter for personal RF exposure evaluation and two logging barometers to determine the corresponding height of the worker's position on the mast. The procedure is not intended to be used for compliance assessments, but to indicate locations where stricter assessments are needed. The applicability of the method is demonstrated by making measurements in a TV and radio transmitting mast. PMID:19054796
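    The height reconstruction step described here amounts to converting two simultaneous pressure readings (one at ground level, one worn by the worker) into a height difference. The sketch below uses the standard hypsometric approximation; the constants, pairing scheme, and example values are assumptions for illustration and not the authors' processing chain.

```python
# Minimal sketch of deriving the worker's height on the mast from two logging
# barometers, using the hypsometric approximation (constants assumed).
import math

def height_above_ground(p_worker_hpa, p_ground_hpa, temp_c=15.0):
    """Approximate height difference (m) from simultaneous pressure readings."""
    temp_k = temp_c + 273.15
    # Hypsometric equation: dz = (R * T / g) * ln(p_ground / p_worker)
    return (287.05 * temp_k / 9.81) * math.log(p_ground_hpa / p_worker_hpa)

# Hypothetical readings: ~12 hPa difference corresponds to roughly 100 m.
print(round(height_above_ground(1001.2, 1013.2), 1))
```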

  2. Landscape analysis: Theoretical considerations and practical needs

    USGS Publications Warehouse

    Godfrey, A.E.; Cleaves, E.T.

    1991-01-01

    Numerous systems of land classification have been proposed. Most have led directly to or have been driven by an author's philosophy of earth-forming processes. However, the practical need of classifying land for planning and management purposes requires that a system lead to predictions of the results of management activities. We propose a landscape classification system composed of 11 units, from realm (a continental mass) to feature (a splash impression). The classification concerns physical aspects rather than economic or social factors; and aims to merge land inventory with dynamic processes. Landscape units are organized using a hierarchical system so that information may be assembled and communicated at different levels of scale and abstraction. Our classification uses a geomorphic systems approach that emphasizes the geologic-geomorphic attributes of the units. Realm, major division, province, and section are formulated by subdividing large units into smaller ones. For the larger units we have followed Fenneman's delineations, which are well established in the North American literature. Areas and districts are aggregated into regions and regions into sections. Units smaller than areas have, in practice, been subdivided into zones and smaller units if required. We developed the theoretical framework embodied in this classification from practical applications aimed at land use planning and land management in Maryland (eastern Piedmont Province near Baltimore) and Utah (eastern Uinta Mountains). © 1991 Springer-Verlag New York Inc.

  3. Landscape analysis: Theoretical considerations and practical needs

    NASA Astrophysics Data System (ADS)

    Godfrey, Andrew E.; Cleaves, Emery T.

    1991-03-01

    Numerous systems of land classification have been proposed. Most have led directly to or have been driven by an author's philosophy of earth-forming processes. However, the practical need of classifying land for planning and management purposes requires that a system lead to predictions of the results of management activities. We propose a landscape classification system composed of 11 units, from realm (a continental mass) to feature (a splash impression). The classification concerns physical aspects rather than economic or social factors; and aims to merge land inventory with dynamic processes. Landscape units are organized using a hierarchical system so that information may be assembled and communicated at different levels of scale and abstraction. Our classification uses a geomorphic systems approach that emphasizes the geologic-geomorphic attributes of the units. Realm, major division, province, and section are formulated by subdividing large units into smaller ones. For the larger units we have followed Fenneman's delineations, which are well established in the North American literature. Areas and districts are aggregated into regions and regions into sections. Units smaller than areas have, in practice, been subdivided into zones and smaller units if required. We developed the theoretical framework embodied in this classification from practical applications aimed at land use planning and land management in Maryland (eastern Piedmont Province near Baltimore) and Utah (eastern Uinta Mountains).

  4. Monte Carlo methods in genetic analysis

    SciTech Connect

    Lin, Shili

    1996-12-31

    Many genetic analyses require computation of probabilities and likelihoods of pedigree data. With more and more genetic marker data deriving from new DNA technologies becoming available to researchers, exact computations are often formidable with standard statistical methods and computational algorithms. The desire to utilize as much available data as possible, coupled with complexities of realistic genetic models, push traditional approaches to their limits. These methods encounter severe methodological and computational challenges, even with the aid of advanced computing technology. Monte Carlo methods are therefore increasingly being explored as practical techniques for estimating these probabilities and likelihoods. This paper reviews the basic elements of the Markov chain Monte Carlo method and the method of sequential imputation, with an emphasis upon their applicability to genetic analysis. Three areas of applications are presented to demonstrate the versatility of Markov chain Monte Carlo for different types of genetic problems. A multilocus linkage analysis example is also presented to illustrate the sequential imputation method. Finally, important statistical issues of Markov chain Monte Carlo and sequential imputation, some of which are unique to genetic data, are discussed, and current solutions are outlined. 72 refs.
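    The building block of the Markov chain Monte Carlo approach reviewed here is a Metropolis-type sampler. The sketch below is a generic random-walk Metropolis routine applied to a toy scalar parameter; it is illustrative only, since real pedigree analyses sample genotype configurations rather than a single recombination fraction, and the likelihood shown is an invented example.

```python
# Minimal sketch of a random-walk Metropolis sampler (generic MCMC, not the
# pedigree-specific samplers discussed in the paper).
import math
import random

def metropolis(log_likelihood, theta0, n_samples=10_000, step=0.1):
    samples, theta = [], theta0
    ll = log_likelihood(theta)
    for _ in range(n_samples):
        proposal = theta + random.gauss(0.0, step)
        ll_prop = log_likelihood(proposal)
        # Accept with probability min(1, L(proposal) / L(current)).
        if math.log(random.random()) < ll_prop - ll:
            theta, ll = proposal, ll_prop
        samples.append(theta)
    return samples

# Hypothetical target: a recombination fraction with a binomial likelihood.
def loglik(theta):
    if not 0.0 < theta < 0.5:
        return float("-inf")
    recombinants, meioses = 12, 100
    return recombinants * math.log(theta) + (meioses - recombinants) * math.log(1 - theta)

draws = metropolis(loglik, theta0=0.25)
print(sum(draws[2000:]) / len(draws[2000:]))  # posterior mean under a flat prior
```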

  5. Development of a practical costing method for hospitals.

    PubMed

    Cao, Pengyu; Toyabe, Shin-Ichi; Akazawa, Kouhei

    2006-03-01

    To realize effective cost control, a practical and accurate cost accounting system is indispensable in hospitals. In traditional cost accounting systems, volume-based costing (VBC) is the most popular cost accounting method. In this method, the indirect costs are allocated to each cost object (services or units of a hospital) using a single indicator named a cost driver (e.g., labor hours, revenues or the number of patients). However, this method often produces rough and inaccurate results. The activity-based costing (ABC) method introduced in the mid 1990s can provide more accurate results. With the ABC method, all events or transactions that cause costs are recognized as "activities", and a specific cost driver is prepared for each activity. Finally, the costs of activities are allocated to cost objects by the corresponding cost driver. However, it is much more complex and costly than other traditional cost accounting methods because the data collection for cost drivers is not always easy. In this study, we developed a simplified ABC (S-ABC) costing method to reduce the workload of ABC costing by reducing the number of cost drivers used in the ABC method. Using the S-ABC method, we estimated the cost of laboratory tests, and results of similar accuracy to the ABC method were obtained (the largest difference was 2.64%). At the same time, the new method reduces the seven cost drivers used in the ABC method to four. Moreover, we performed an evaluation using other sample data from the physiological laboratory department to certify the effectiveness of this new method. In conclusion, the S-ABC method provides two advantages in comparison to the VBC and ABC methods: (1) it can obtain accurate results, and (2) it is simpler to perform. Once we reduce the number of cost drivers by applying the proposed S-ABC method to the data for the ABC method, we can easily perform the cost accounting using few cost drivers after the second round of costing. PMID
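    The allocation mechanism shared by ABC and the simplified S-ABC approach is simply cost spread over cost-driver units. The sketch below illustrates that mechanism; the activity names, driver volumes, and figures are invented and do not come from the study.

```python
# Minimal sketch of allocating activity costs to cost objects via cost drivers
# (the mechanism underlying ABC/S-ABC; all figures are hypothetical).
def allocate(activity_costs, driver_volumes):
    """activity_costs: {activity: total cost};
    driver_volumes: {activity: {cost_object: driver units consumed}}."""
    totals = {}
    for activity, cost in activity_costs.items():
        volumes = driver_volumes[activity]
        rate = cost / sum(volumes.values())          # cost per driver unit
        for obj, units in volumes.items():
            totals[obj] = totals.get(obj, 0.0) + rate * units
    return totals

costs = {"specimen handling": 120_000, "instrument time": 300_000}
drivers = {
    "specimen handling": {"test A": 4_000, "test B": 2_000},
    "instrument time": {"test A": 1_500, "test B": 3_500},
}
print(allocate(costs, drivers))
```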

  6. Gaussian Weighted Trajectory Method. IV. No Rainbow Effect in Practice

    NASA Astrophysics Data System (ADS)

    Bonnet, L.

    2009-04-01

    The Gaussian weighted trajectory method (GWTM) is a practical implementation of classical S-matrix theory (CSMT) in the random phase approximation, CSMT being the first and simplest semi-classical approach to molecular collisions, developed in the early seventies. Though very close in spirit to the purely classical description, GWTM accounts to some extent for the quantization of the different degrees of freedom involved in the processes. While CSMT may give diverging final-state distributions, in relation to the rainbow effect of elastic scattering theory, GWTM has never led to such a mathematical catastrophe. The goal of the present note is to explain this finding.

  7. Methods for genetic linkage analysis using trisomies

    SciTech Connect

    Feingold, E.; Lamb, N.E.; Sherman, S.L.

    1994-09-01

    Certain genetic disorders (e.g. congenital cataracts, duodenal atresia) are rare in the general population, but more common in people with Down's syndrome. We present a method for using individuals with trisomy 21 to map genes for such traits. Our methods are analogous to methods for mapping autosomal dominant traits using affected relative pairs by looking for markers with greater than expected identity-by-descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected reduction to homozygosity in the chromosomes inherited from the non-disjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the gene, a confidence interval for that distance, and methods for computing power and sample sizes. The methods are described in the context of a gene-dosage model for the etiology of the disorder, but can be extended to other models. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers, how to test candidate genes, and how to handle the effect of reduced recombination associated with maternal meiosis I non-disjunction.

  8. Effective methods for disseminating research findings to nurses in practice.

    PubMed

    Cronenwett, L R

    1995-09-01

    Professionals in all disciplines are challenged by the proliferation of new knowledge. Nurses, too, must find cost-effective ways of ensuring that their patients are benefiting from the most current knowledge about health and illness. The methods of research dissemination to clinicians described in this article are presumed to be effective because of anecdotal reports, conference evaluations, or clinician surveys. The profession needs more sophisticated evaluations of the effectiveness of various dissemination methods. In the meantime, whether you are a researcher, an administrator, an educator, or a clinician, you have a role to play in improving research dissemination. Implement just one strategy from this article and evaluate the results. Each contribution moves nursing toward research-based practice. PMID:7567569

  9. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, A.C.

    1993-06-08

    An apparatus and method is described for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced apart evenly from each other and the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  10. Voltametric analysis apparatus and method

    DOEpatents

    Almon, Amy C.

    1993-01-01

    An apparatus and method for electrochemical analysis of elements in solution. An auxiliary electrode 14, a reference electrode 18, and five working electrodes 20, 22, 26, 28, and 30 are positioned in a container 12 containing a sample solution 34. The working electrodes are spaced apart evenly from each other and auxiliary electrode 14 to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between auxiliary electrode 14 and each of the working electrodes 20, 22, 26, 28, and 30. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in sample solution 34 and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  11. Translational Behavior Analysis and Practical Benefits

    ERIC Educational Resources Information Center

    Pilgrim, Carol

    2011-01-01

    In his article, Critchfield ("Translational Contributions of the Experimental Analysis of Behavior," "The Behavior Analyst," v34, p3-17, 2011) summarizes a previous call (Mace & Critchfield, 2010) for basic scientists to reexamine the inspiration for their research and turn increasingly to translational approaches. Interestingly, rather than…

  12. Comparative Lifecycle Energy Analysis: Theory and Practice.

    ERIC Educational Resources Information Center

    Morris, Jeffrey; Canzoneri, Diana

    1992-01-01

    Explores the position that more energy is conserved through recycling secondary materials than is generated from municipal solid waste incineration. Discusses one component of a lifecycle analysis--a comparison of energy requirements for manufacturing competing products. Includes methodological issues, energy cost estimates, and difficulties…

  13. The 1999 IDEA Regulations: A Practical Analysis.

    ERIC Educational Resources Information Center

    Borreca, Christopher P.; Goldman, Teri B.; Horton, Janet L.; Mehfoud, Kathleen; Rodick, Bennett; Weatherly, Julie J.; Wenkart, Ronald D.; Wynn, Deryl W.

    This publication explains how some of the more significant 1999 regulations of the Individuals with Disabilities Education Act will affect schools providing services to children in need of special education. The analysis tracks the regulatory format used to organize the rules under Title 34 of the Code of Federal Regulations. The selected…

  14. Correlation method of electrocardiogram analysis

    NASA Astrophysics Data System (ADS)

    Strinadko, Marina M.; Timochko, Katerina B.

    2002-02-01

    The electrocardiographic method is an informational source for characterizing the functional state of the heart. Electrocardiogram parameters form an integrated map of many component characteristics of the cardiac system and depend on the disturbance requirements of each component. In this work, an attempt is made to build a skeleton diagram of perturbations of the cardiac system by describing its basic components and the connections between them through transition functions, written as first- and second-order differential equations, with the purpose of constructing and analysing the electrocardiogram. Noting the vector character of the perturbation and the varying position of the heart in each organism, we propose our own coordinate system attached to the heart. A comparative analysis of electrocardiograms was conducted using the correlation method.
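    The comparative step rests on correlating a recorded trace against a reference waveform. The sketch below shows a generic normalized cross-correlation of two synthetic signals; it is an illustration of the correlation idea only, not the authors' model equations, and the waveform shapes are invented.

```python
# Minimal sketch of normalized cross-correlation between a recorded ECG
# segment and a reference (model) waveform; signals are synthetic placeholders.
import numpy as np

def normalized_xcorr(x, y):
    x = (x - x.mean()) / (x.std() * len(x))
    y = (y - y.mean()) / y.std()
    return np.correlate(x, y, mode="full")

t = np.linspace(0, 1, 500)
reference = np.exp(-((t - 0.5) ** 2) / 0.002)      # idealized QRS-like pulse
recorded = np.roll(reference, 25) + 0.05 * np.random.default_rng(0).normal(size=t.size)
corr = normalized_xcorr(recorded, reference)
lag = corr.argmax() - (len(t) - 1)
print(lag, corr.max())   # lag ≈ 25 samples, peak correlation close to 1
```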

  15. Comparison of four teaching methods on Evidence-based Practice skills of postgraduate nursing students.

    PubMed

    Fernandez, Ritin S; Tran, Duong Thuy; Ramjan, Lucie; Ho, Carey; Gill, Betty

    2014-01-01

    The aim of this study was to compare four teaching methods on the evidence-based practice knowledge and skills of postgraduate nursing students. Students enrolled in the Evidence-based Nursing (EBN) unit in Australia and Hong Kong in 2010 and 2011 received education via either the standard distance teaching method, computer laboratory teaching method, Evidence-based Practice-Digital Video Disc (EBP-DVD) teaching method or the didactic classroom teaching method. Evidence-based Practice (EBP) knowledge and skills were evaluated using student assignments that comprised validated instruments. One-way analysis of covariance was implemented to assess group differences on outcomes after controlling for the effects of age and grade point average (GPA). Data were obtained from 187 students. The crude mean score among students receiving the standard+DVD method of instruction was higher for developing a precise clinical question (8.1±0.8) and identifying the level of evidence (4.6±0.7) compared to those receiving other teaching methods. These differences were statistically significant after controlling for age and grade point average. Significant improvement in cognitive and technical EBP skills can be achieved for postgraduate nursing students by integrating a DVD as part of the EBP teaching resources. The EBP-DVD is an easy teaching method to improve student learning outcomes and ensure that external students receive equivalent and quality learning experiences. PMID:23107585

  16. Progress testing: critical analysis and suggested practices.

    PubMed

    Albanese, Mark; Case, Susan M

    2016-03-01

    Educators have long lamented the tendency of students to engage in rote memorization in preparation for tests rather than engaging in deep learning where they attempt to gain meaning from their studies. Rote memorization driven by objective exams has been termed a steering effect. Progress testing (PT), in which a comprehensive examination sampling all of medicine is administered repeatedly throughout the entire curriculum, was developed with the stated aim of breaking the steering effect of examinations and of promoting deep learning. PT is an approach historically linked to problem-based learning (PBL) although there is a growing recognition of its applicability more broadly. The purpose of this article is to summarize the salient features of PT drawn from the literature, provide a critical review of these features based upon the same literature and psychometric considerations drawn from the Standards for Educational and Psychological Testing and provide considerations of what should be part of best practices in applying PT from an evidence-based and a psychometric perspective. PMID:25662873

  17. SAR/QSAR methods in public health practice

    SciTech Connect

    Demchuk, Eugene; Ruiz, Patricia; Chou, Selene; Fowler, Bruce A.

    2011-07-15

    Methods of (Quantitative) Structure-Activity Relationship ((Q)SAR) modeling play an important and active role in ATSDR programs in support of the Agency mission to protect human populations from exposure to environmental contaminants. They are used for cross-chemical extrapolation to complement the traditional toxicological approach when chemical-specific information is unavailable. SAR and QSAR methods are used to investigate adverse health effects and exposure levels, bioavailability, and pharmacokinetic properties of hazardous chemical compounds. They are applied as a part of an integrated systematic approach in the development of Health Guidance Values (HGVs), such as ATSDR Minimal Risk Levels, which are used to protect populations exposed to toxic chemicals at hazardous waste sites. (Q)SAR analyses are incorporated into ATSDR documents (such as the toxicological profiles and chemical-specific health consultations) to support environmental health assessments, prioritization of environmental chemical hazards, and to improve study design, when filling the priority data needs (PDNs) as mandated by Congress, in instances when experimental information is insufficient. These cases are illustrated by several examples, which explain how ATSDR applies (Q)SAR methods in public health practice.

  18. A practical approach for linearity assessment of calibration curves under the International Union of Pure and Applied Chemistry (IUPAC) guidelines for an in-house validation of method of analysis.

    PubMed

    Sanagi, M Marsin; Nasir, Zalilah; Ling, Susie Lu; Hermawan, Dadan; Ibrahim, Wan Aini Wan; Naim, Ahmedy Abu

    2010-01-01

    Linearity assessment as required in method validation has always been subject to different interpretations and definitions by various guidelines and protocols. However, there are very limited applicable implementation procedures that can be followed by a laboratory chemist in assessing linearity. Thus, this work proposes a simple method for linearity assessment in method validation by a regression analysis that covers experimental design, estimation of the parameters, outlier treatment, and evaluation of the assumptions according to the International Union of Pure and Applied Chemistry guidelines. The suitability of this procedure was demonstrated by its application to an in-house validation for the determination of plasticizers in plastic food packaging by GC. PMID:20922968
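    The core of such a linearity assessment is an ordinary least-squares fit of the calibration data followed by inspection of the residuals. The sketch below shows only that core step with invented standards and peak areas; the paper's IUPAC-based procedure additionally covers experimental design, outlier tests, and a formal lack-of-fit evaluation on replicated levels.

```python
# Minimal sketch of a calibration-curve linearity check: OLS fit plus a
# residual inspection (data values are assumed for illustration).
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])        # standards (mg/L), assumed
area = np.array([10.2, 19.8, 40.5, 99.0, 201.3, 398.7])  # peak areas, assumed

slope, intercept = np.polyfit(conc, area, 1)
predicted = slope * conc + intercept
residuals = area - predicted
r_squared = 1 - residuals.var() / area.var()

print(f"slope={slope:.3f}, intercept={intercept:.3f}, R^2={r_squared:.5f}")
print("relative residuals (%):", np.round(100 * residuals / predicted, 2))
```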

  19. Assessing methods for measurement of clinical outcomes and quality of care in primary care practices

    PubMed Central

    2012-01-01

    Purpose: To evaluate the appropriateness of potential data sources for the population of performance indicators for primary care (PC) practices. Methods: This project was a cross-sectional study of 7 multidisciplinary primary care teams in Ontario, Canada. Practices were recruited and 5-7 physicians per practice agreed to participate in the study. Patients of participating physicians (20-30) were recruited sequentially as they presented to attend a visit. Data collection included patient, provider and practice surveys, chart abstraction and linkage to administrative data sets. Matched pairs analysis was used to examine the differences in the observed results for each indicator obtained using multiple data sources. Results: Seven teams, 41 physicians, 94 associated staff and 998 patients were recruited. The survey response rate was 81% for patients, 93% for physicians and 83% for associated staff. Chart audits were successfully completed on all but 1 patient and linkage to administrative data was successful for all subjects. There were significant differences noted between the data collection methods for many measures. No single method of data collection was best for all outcomes. For most measures of technical quality of care chart audit was the most accurate method of data collection. Patient surveys were more accurate for immunizations, chronic disease advice/information dispensed, some general health promotion items and possibly for medication use. Administrative data appears useful for indicators including chronic disease diagnosis and osteoporosis/breast screening. Conclusions: Multiple data collection methods are required for a comprehensive assessment of performance in primary care practices. The choice of which methods are best for any one particular study or quality improvement initiative requires careful consideration of the biases that each method might introduce into the results. In this study, both patients and providers were willing to participate in and

  20. Methods for Cancer Epigenome Analysis

    PubMed Central

    Nagarajan, Raman P.; Fouse, Shaun D.; Bell, Robert J.A.; Costello, Joseph F.

    2014-01-01

    Accurate detection of epimutations in tumor cells is crucial for understanding the molecular pathogenesis of cancer. Alterations in DNA methylation in cancer are functionally important and clinically relevant, but even this well-studied area is continually re-evaluated in light of unanticipated results, including a strong connection between aberrant DNA methylation in adult tumors and polycomb group profiles in embryonic stem cells, cancer-associated genetic mutations in epigenetic regulators such as DNMT3A and TET family genes, and the discovery of abundant 5-hydroxymethylcytosine, a product of TET proteins acting on 5-methylcytosine, in human tissues. The abundance and distribution of covalent histone modifications in primary cancer tissues relative to normal cells is a largely uncharted area, although there is good evidence for a mechanistic role of cancer-specific alterations in epigenetic marks in tumor etiology, drug response and tumor progression. Meanwhile, the discovery of new epigenetic marks continues, and there are many useful methods for epigenome analysis applicable to primary tumor samples, in addition to cancer cell lines. For DNA methylation and hydroxymethylation, next-generation sequencing allows increasingly inexpensive and quantitative whole-genome profiling. Similarly, the refinement and maturation of chromatin immunoprecipitation with next-generation sequencing (ChIP-seq) has made possible genome-wide mapping of histone modifications, open chromatin and transcription factor binding sites. Computational tools have been developed apace with these epigenome methods to better enable the accuracy and interpretation of the data from the profiling methods. PMID:22956508

  1. Practical application of fault tree analysis

    SciTech Connect

    Prugh, R.W.

    1980-01-01

    A detailed survey of standard and novel approaches to Fault Tree construction, based on recent developments at Du Pont, covers the effect-to-cause procedure for control systems as in process plants; the effect-to-cause procedure for processes; source-of-hazard analysis, as in pressure vessel rupture; use of the ''fire triangle'' in a Fault Tree; critical combinations of safeguard failures; action points for automatic or operator control of a process; situations involving hazardous reactant ratios; failure-initiating and failure-enabling events and intervention by the operator; ''daisy-chain'' hazards, e.g., in batch processes and ship accidents; combining batch and continuous operations in a Fault Tree; possible future structure-development procedures for fault-tree construction; and the use of quantitative results (calculated frequencies of Top-Event occurrence) to restructure the Fault Tree after improving the process to any acceptable risk level.
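    The quantitative use of a fault tree mentioned above reduces to propagating basic-event probabilities through AND/OR gates to the top event. The sketch below illustrates that propagation under an independence assumption; the gate structure and probabilities are invented and not taken from the survey.

```python
# Minimal sketch of top-event probability from AND/OR gates, assuming
# independent basic events (structure and numbers are hypothetical).
def and_gate(*probs):
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

p_relief_valve_fails = 1e-3
p_operator_misses_alarm = 5e-2
p_sensor_fails = 2e-3
# Top event: vessel overpressure = relief valve fails AND (sensor fails OR operator misses alarm)
p_top = and_gate(p_relief_valve_fails, or_gate(p_sensor_fails, p_operator_misses_alarm))
print(f"{p_top:.2e}")   # ≈ 5.2e-05
```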

  2. Flow analysis system and method

    NASA Technical Reports Server (NTRS)

    Hill, Wayne S. (Inventor); Barck, Bruce N. (Inventor)

    1998-01-01

    A non-invasive flow analysis system and method wherein a sensor, such as an acoustic sensor, is coupled to a conduit for transmitting a signal which varies depending on the characteristics of the flow in the conduit. The signal is amplified and there is a filter, responsive to the sensor signal, and tuned to pass a narrow band of frequencies proximate the resonant frequency of the sensor. A demodulator generates an amplitude envelope of the filtered signal and a number of flow indicator quantities are calculated based on variations in amplitude of the amplitude envelope. A neural network, or its equivalent, is then used to determine the flow rate of the flow in the conduit based on the flow indicator quantities.
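    The signal chain described in this record (band-pass filtering near the sensor resonance, envelope demodulation, then flow-indicator quantities fed to a neural network) can be sketched generically. The example below uses an assumed sampling rate and resonant frequency on a synthetic signal and stops at the indicator quantities; it is not the patented implementation.

```python
# Minimal sketch of the described chain: band-pass filter around an assumed
# sensor resonance, envelope extraction via the Hilbert transform, and a few
# simple flow-indicator quantities (the neural-network stage is omitted).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 50_000.0                         # sampling rate (Hz), assumed
t = np.arange(0, 0.1, 1 / fs)
resonance = 8_000.0                   # sensor resonant frequency (Hz), assumed
rng = np.random.default_rng(1)
signal = (1 + 0.3 * np.sin(2 * np.pi * 40 * t)) * np.sin(2 * np.pi * resonance * t)
signal += 0.1 * rng.normal(size=t.size)

b, a = butter(4, [resonance * 0.9, resonance * 1.1], btype="band", fs=fs)
filtered = filtfilt(b, a, signal)
envelope = np.abs(hilbert(filtered))

indicators = {"mean": envelope.mean(), "std": envelope.std(),
              "peak_to_mean": envelope.max() / envelope.mean()}
print(indicators)
```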

  3. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  4. Reducing alcohol consumption. Comparing three brief methods in family practice.

    PubMed Central

    McIntosh, M. C.; Leigh, G.; Baldwin, N. J.; Marmulak, J.

    1997-01-01

    OBJECTIVE: To compare the effects of three brief methods of reducing alcohol consumption among family practice patients. DESIGN: Patients randomly assigned to one of three interventions were assessed initially and at 3-, 6-, and 12-month follow-up appointments. SETTING: Family practice clinic composed of 12 primary care physicians seeing approximately 6000 adults monthly in a small urban community, population 40,000. PARTICIPANTS: Through a screening questionnaire, 134 men and 131 women were identified as hazardous drinkers (five or more drinks at least once monthly) during an 11-month screening of 1420 patients. Of 265 patients approached, 180 agreed to participate and 159 (83 men and 76 women) actually participated in the study. INTERVENTIONS: Three interventions were studied: brief physician advice (5 minutes), two 30-minute sessions with a physician using cognitive behavioural strategies or two 30-minute sessions with a nurse practitioner using identical strategies. MAIN OUTCOME MEASURES: Quantity and frequency (QF) of drinking were used to assess reduction in hazardous drinking and problems related to drinking over 12 months of follow up. RESULTS: No statistical difference between groups was found. The QF of monthly drinking was reduced overall by 66% (among men) and 74% (among women) for those reporting at least one hazardous drinking day weekly at assessment (N = 96). Men reported drinking significantly more than women. CONCLUSIONS: These results indicated that offering brief, specific advice can motivate patients to reduce their alcohol intake. There was no difference in effect between brief advice from their own physician or brief intervention by a physician or a nurse. PMID:9386883

  5. Trends in sensitivity analysis practice in the last decade.

    PubMed

    Ferretti, Federico; Saltelli, Andrea; Tarantola, Stefano

    2016-10-15

    The majority of published sensitivity analyses (SAs) are either local or one-factor-at-a-time (OAT) analyses, relying on unjustified assumptions of model linearity and additivity. Global approaches to sensitivity analysis (GSA), which would obviate these shortcomings, are applied by a minority of researchers. By reviewing the academic literature on SA, we here present a bibliometric analysis of the trends of different SA practices in the last decade. The review has been conducted both on some top-ranking journals (Nature and Science) and through an extended analysis in Elsevier's Scopus database of scientific publications. After correcting for the global growth in publications, the number of papers performing a generic SA has notably increased over the last decade. Although OAT is still the most widely used technique in SA, there is a clear increase in the use of GSA, with a preference for regression and variance-based techniques, respectively. Even after adjusting for the growth of publications in the sole modelling field, to which SA and GSA normally apply, the trend is confirmed. Data about regions of origin and discipline are also briefly discussed. The results above are confirmed when zooming in on the sole articles published in chemical modelling, a field historically proficient in the use of SA methods. PMID:26934843
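    To make the OAT/GSA distinction concrete, the sketch below estimates first-order, variance-based sensitivity indices for a toy model with an interaction term, using a brute-force double-loop Monte Carlo estimator. The model and sample sizes are invented for illustration and are unrelated to the bibliometric study itself.

```python
# Minimal sketch of a global, variance-based sensitivity measure: brute-force
# estimate of first-order Sobol indices S_i = Var(E[Y|x_i]) / Var(Y) for a
# toy model (purely illustrative).
import numpy as np

rng = np.random.default_rng(42)

def model(x):                        # toy model with an x1-x3 interaction
    return x[..., 0] + 2 * x[..., 1] + 5 * x[..., 0] * x[..., 2]

def first_order_index(i, n_outer=200, n_inner=200, dim=3):
    cond_means = np.empty(n_outer)
    for k in range(n_outer):
        x = rng.random((n_inner, dim))
        x[:, i] = rng.random()        # fix factor i, vary all the others
        cond_means[k] = model(x).mean()
    total = model(rng.random((n_outer * n_inner, dim)))
    return cond_means.var() / total.var()

for i in range(3):
    print(f"S_{i + 1} ≈ {first_order_index(i):.2f}")
```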

  6. Diagnostic Methods for Bile Acid Malabsorption in Clinical Practice

    PubMed Central

    Vijayvargiya, Priya; Camilleri, Michael; Shin, Andrea; Saenger, Amy

    2013-01-01

    Altered bile acid (BA) concentrations in the colon may cause diarrhea or constipation. BA malabsorption (BAM) accounts for >25% of patients with irritable bowel syndrome (IBS) with diarrhea and chronic diarrhea in Western countries. As BAM is increasingly recognized, proper diagnostic methods are desired in clinical practice to help direct the most effective treatment course for the chronic bowel dysfunction. This review appraises the methodology, advantages and disadvantages of 4 tools that directly measure BAM: 14C-glycocholate breath and stool test, 75Selenium HomotauroCholic Acid Test (SeHCAT), 7 α-hydroxy-4-cholesten-3-one (C4) and fecal BAs. 14C-glycocholate is a laborious test no longer widely utilized. 75SeHCAT is validated, but not available in the United States. Serum C4 is a simple, accurate method that is applicable to a majority of patients, but requires further clinical validation. Fecal measurements to quantify total and individual fecal BAs are technically cumbersome and not widely available. Regrettably, none of these tests are routinely available in the U.S., and a therapeutic trial with a BA binder is used as a surrogate for diagnosis of BAM. Recent data suggest there is an advantage to studying fecal excretion of the individual BAs and their role in BAM; this may constitute a significant advantage of the fecal BA method over the other tests. Fecal BA test could become a routine addition to fecal fat measurement in patients with unexplained diarrhea. In summary, availability determines the choice of test among C4, SeHCAT and fecal BA; more widespread availability of such tests would enhance clinical management of these patients. PMID:23644387

  7. Extended morphological processing: a practical method for automatic spot detection of biological markers from microscopic images

    PubMed Central

    2010-01-01

    Background: A reliable extraction technique for resolving multiple spots in light or electron microscopic images is essential in investigations of the spatial distribution and dynamics of specific proteins inside cells and tissues. Currently, automatic spot extraction and characterization in complex microscopic images poses many challenges to conventional image processing methods. Results: A new method to extract closely located, small target spots from biological images is proposed. This method starts with a simple but practical operation based on the extended morphological top-hat transformation to subtract an uneven background. The core of our novel approach is the following: first, the original image is rotated in an arbitrary direction and each rotated image is opened with a single straight line-segment structuring element. Second, the opened images are unified and then subtracted from the original image. To evaluate these procedures, model images of simulated spots with closely located targets were created and the efficacy of our method was compared to that of conventional morphological filtering methods. The results showed the better performance of our method. The spots of real microscope images can be quantified to confirm that the method is applicable in a given practice. Conclusions: Our method achieved effective spot extraction under various image conditions, including aggregated target spots, poor signal-to-noise ratio, and large variations in the background intensity. Furthermore, it has no restrictions with respect to the shape of the extracted spots. The features of our method allow its broad application in biological and biomedical image information analysis. PMID:20615231
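    The key operation, opening with straight-line structuring elements at multiple orientations and subtracting the unified result, can be sketched with standard image-processing tools. In the sketch below the structuring element is rotated rather than the image (a simplification of the paper's scheme), and the spot size, line length, and number of angles are assumptions.

```python
# Minimal sketch of spot extraction via line-segment openings at several
# orientations, followed by subtraction of their pixel-wise maximum
# (simplified relative to the paper, which rotates the image instead).
import numpy as np
from scipy import ndimage
from skimage.morphology import opening

def line_selem(length, angle_deg):
    """Binary straight-line structuring element of given length and orientation."""
    selem = np.zeros((length, length), dtype=bool)
    selem[length // 2, :] = True                    # horizontal line
    rotated = ndimage.rotate(selem.astype(float), angle_deg, reshape=False, order=0)
    return rotated > 0.5

def extract_spots(image, length=15, n_angles=8):
    openings = [opening(image, line_selem(length, a))
                for a in np.linspace(0, 180, n_angles, endpoint=False)]
    background = np.maximum.reduce(openings)
    return image - np.minimum(image, background)    # keeps small bright spots only

rng = np.random.default_rng(0)
img = rng.random((64, 64)) * 0.1
img[20:23, 20:23] += 1.0                            # a small bright "spot"
print(extract_spots(img).max() > 0.5)               # the spot survives the subtraction
```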

  8. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images. It shows the practical implementation of these image analysis methods in Matlab and enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.) PMID:26556680

  9. [Methodical approaches to usage of complex anthropometric methods in clinical practice].

    PubMed

    Bukavneva, N S; Pozdniakov, A L; Nikitiuk, D B

    2007-01-01

    A new methodical approach to complex anthropometric study in clinical practice is proposed for evaluating nutritional state and for the diagnostics and assessment of the effectiveness of dietotherapy in patients with alimentary-dependent pathology. The technique of measuring body volumes, adipose folds by means of a caliper, and extremity diameters is described, which allows more precise data to be obtained during patient examinations. Formulas that allow calculation of bone, muscle, and adipose mass are provided. PMID:18219935

  10. Practical Framework: Implementing OEE Method in Manufacturing Process Environment

    NASA Astrophysics Data System (ADS)

    Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.

    2016-02-01

    A manufacturing process environment requires reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to operate and produce a quality product at its maximum designed capability. However, for various reasons, a machine is usually unable to achieve the desired performance. Since this performance affects the productivity of the system, a measurement technique should be applied. Overall Equipment Effectiveness (OEE) is a good method to measure the performance of a machine, and the reliable results it produces can then be used to propose suitable corrective actions. Many published papers mention the purpose and benefits of OEE, covering the what and why factors; however, the how factor, the implementation of OEE in a manufacturing process environment, has not yet been addressed. Thus, this paper presents a practical framework for implementing OEE, and a case study is discussed to explain each proposed step in detail. The proposed framework is beneficial to engineers, especially beginners, who want to start measuring machine performance and later improve it.
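    The measurement at the centre of such a framework is the standard OEE product of availability, performance, and quality. The sketch below computes it for an invented shift; the figures are assumptions for illustration, not data from the case study.

```python
# Minimal sketch of the OEE calculation: OEE = Availability x Performance x Quality
# (all figures are hypothetical).
def oee(planned_time_min, downtime_min, ideal_cycle_time_min, total_count, good_count):
    operating_time = planned_time_min - downtime_min
    availability = operating_time / planned_time_min
    performance = (ideal_cycle_time_min * total_count) / operating_time
    quality = good_count / total_count
    return availability * performance * quality

# Example shift: 480 min planned, 47 min downtime, 1.0 min ideal cycle time,
# 400 parts produced, 388 good parts.
print(f"OEE = {oee(480, 47, 1.0, 400, 388):.1%}")   # ≈ 80.8%
```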

  11. Practical Method for Transient Stability with Unbalanced Condition based on Symmetric Coordinates

    NASA Astrophysics Data System (ADS)

    Fujiwara, Shuhei; Kono, Yoshiyuki; Kitayama, Masashi; Goda, Tadahiro

    Symmetric coordinates are a very popular method for modelling unbalanced faults in power system analysis. The approach is not only easy to apply to a single fault, but can also be extended to multiple faults. However, it is not easy to model situations in which the unbalanced condition changes continuously, such as an SVC (Static Var Compensator) operating during an unbalanced fault in a power system, or an unbalanced nonlinear load. For these situations, we propose a practical multiple-fault calculation method based on symmetric coordinates that can handle such unbalanced conditions.
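    The underlying transformation maps phase quantities into zero-, positive-, and negative-sequence components. The sketch below shows that standard transformation for a single set of phasors; the example voltages are invented and the sketch does not reproduce the paper's multiple-fault calculation.

```python
# Minimal sketch of the symmetric-coordinate (symmetrical-component)
# transformation: phase phasors -> [zero, positive, negative] sequence phasors.
import numpy as np

a = np.exp(2j * np.pi / 3)                       # 120-degree rotation operator
A_inv = (1 / 3) * np.array([[1, 1,      1],
                            [1, a,      a**2],
                            [1, a**2,   a]])

def to_sequence(v_abc):
    """Return [zero, positive, negative] sequence components."""
    return A_inv @ v_abc

# Hypothetical unbalanced phase voltages (per unit), e.g. phase-a voltage
# depressed during a single-line-to-ground fault.
v_abc = np.array([0.2 + 0j,
                  1.0 * np.exp(-1j * 2 * np.pi / 3),
                  1.0 * np.exp(+1j * 2 * np.pi / 3)])
print(np.round(to_sequence(v_abc), 3))
```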

  12. Practical limitations of the slope assisted BOTDA method in dynamic strain sensing

    NASA Astrophysics Data System (ADS)

    Minardo, A.; Catalano, E.; Zeni, L.

    2016-05-01

    Analysis of the operation of the slope-assisted Brillouin Optical Time-Domain Analysis (BOTDA) method shows that the acquisition rate is practically limited by two fundamental factors: the polarization scrambling frequency and the phase noise of the laser. As regards polarization scrambling, we show experimentally that the scrambling frequency poses a limit on the maximum acquisition rate for a given averaging factor. As regards phase noise, we show numerically and experimentally that the slope-assisted method is particularly sensitive to laser phase noise, due to the specific positioning of the pump-probe frequency shift on the Brillouin Gain Spectrum (BGS).

  13. Systems analysis and design methodologies: practicalities and use in today's information systems development efforts.

    PubMed

    Jerva, M

    2001-05-01

    Historically, systems analysis and design methodologies have been used as a guide in software development. Such methods provide structure to software engineers in their efforts to create quality solutions in the real world of information systems. This article looks at the elements that constitute a systems analysis methodology and examines the historical development of systems analysis in software development. It concludes with observations on the strengths and weaknesses of four methodologies and the state of the art of practice today. PMID:11378979

  14. Probabilistic methods for structural response analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Burnside, O. H.; Cruse, T. A.

    1988-01-01

    This paper addresses current work to develop probabilistic structural analysis methods for integration with a specially developed probabilistic finite element code. The goal is to establish distribution functions for the structural responses of stochastic structures under uncertain loadings. Several probabilistic analysis methods are proposed covering efficient structural probabilistic analysis methods, correlated random variables, and response of linear system under stationary random loading.

  15. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    ERIC Educational Resources Information Center

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we…

  16. [Practice analysis: culture shock and adaptation at work].

    PubMed

    Philippe, Séverine; Didry, Pascale

    2015-12-01

    Constructed as a practice analysis, this personal account presents the reflection undertaken by a student on placement in Ireland thanks to the Erasmus programme. She describes in detail the stages of her adaptation in a hospital setting which is considerably different to her usual environment. PMID:26654501

  17. A Model of Practice in Special Education: Dynamic Ecological Analysis

    ERIC Educational Resources Information Center

    Hannant, Barbara; Lim, Eng Leong; McAllum, Ruth

    2010-01-01

    Dynamic Ecological Analysis (DEA) is a model of practice which increases a team's efficacy by enabling the development of more effective interventions through collaboration and collective reflection. This process has proved to be useful in: a) clarifying thinking and problem-solving, b) transferring knowledge and thinking to significant parties,…

  18. Procedural Fidelity: An Analysis of Measurement and Reporting Practices

    ERIC Educational Resources Information Center

    Ledford, Jennifer R.; Wolery, Mark

    2013-01-01

    A systematic analysis was conducted of measurement and reporting practices related to procedural fidelity in single-case research for the past 30 years. Previous reviews of fidelity primarily reported whether fidelity data were collected by authors; these reviews reported that collection was variable, but low across journals and over time. Results…

  19. Report on the National Art Therapy Practice Analysis Survey.

    ERIC Educational Resources Information Center

    Knapp, Joan E.; And Others

    1994-01-01

    Art therapy practice analysis surveys were completed by 1,125 Registered Art Therapists. Most respondents were females, Caucasian, and graduates of master's degree programs in art therapy. Respondents rated "creating a therapeutic environment" as the most important major responsibility of entry-level art therapists. (NB)

  20. Perceptions of Weight and Health Practices in Hispanic Children: A Mixed-Methods Study

    PubMed Central

    Foster, Byron Alexander; Hale, Daniel

    2015-01-01

    Background. Perception of weight by parents of obese children may be associated with willingness to engage in behavior change. The relationship between parents' perception of their child's weight and their health beliefs and practices is poorly understood, especially among the Hispanic population which experiences disparities in childhood obesity. This study sought to explore the relationship between perceptions of weight and health beliefs and practices in a Hispanic population. Methods. A cross-sectional, mixed-methods approach was used with semistructured interviews conducted with parent-child (2–5 years old) dyads in a primarily Hispanic, low-income population. Parents were queried on their perceptions of their child's health, health practices, activities, behaviors, and beliefs. A grounded theory approach was used to analyze participants' discussion of health practices and behaviors. Results. Forty parent-child dyads completed the interview. Most (58%) of the parents of overweight and obese children misclassified their child's weight status. The qualitative analysis showed that accurate perception of weight was associated with internal motivation and more concrete ideas of what healthy meant for their child. Conclusions. The qualitative data suggest there may be populations at different stages of readiness for change among parents of overweight and obese children, incorporating this understanding should be considered for interventions. PMID:26379715

  1. A practical and sensitive method of quantitating lymphangiogenesis in vivo.

    PubMed

    Majumder, Mousumi; Xin, Xiping; Lala, Peeyush K

    2013-07-01

    To address the inadequacy of current assays, we developed a directed in vivo lymphangiogenesis assay (DIVLA) by modifying an established directed in vivo angiogenesis assay. Silicon tubes (angioreactors) were implanted in the dorsal flanks of nude mice. Tubes contained either growth factor-reduced basement membrane extract (BME) alone (negative control) or BME containing vascular endothelial growth factor (VEGF)-D (positive control for lymphangiogenesis) or FGF-2/VEGF-A (positive control for angiogenesis) or a high VEGF-D-expressing breast cancer cell line MDA-MB-468LN (468-LN), or VEGF-D-silenced 468LN. Lymphangiogenesis was detected superficially with Evans Blue dye tracing and measured in the cellular contents of angioreactors by multiple approaches: lymphatic vessel endothelial hyaluronan receptor-1 (Lyve1) protein (immunofluorescence) and mRNA (qPCR) expression and a visual scoring of lymphatic vs blood capillaries with dual Lyve1 (or Prox-1 or Podoplanin)/Cd31 immunostaining in cryosections. Lymphangiogenesis was absent with BME, high with VEGF-D or VEGF-D-producing 468LN cells and low with VEGF-D-silenced 468LN. Angiogenesis was absent with BME, high with FGF-2/VEGF-A, moderate with 468LN or VEGF-D and low with VEGF-D-silenced 468LN. The method was reproduced in a syngeneic murine C3L5 tumor model in C3H/HeJ mice with dual Lyve1/Cd31 immunostaining. Thus, DIVLA presents a practical and sensitive assay of lymphangiogenesis, validated with multiple approaches and markers. It is highly suited to identifying pro- and anti-lymphangiogenic agents, as well as shared or distinct mechanisms regulating lymphangiogenesis vs angiogenesis, and is widely applicable to research in vascular/tumor biology. PMID:23711825

  2. Practicing oncology in provincial Mexico: a narrative analysis.

    PubMed

    Hunt, L M

    1994-03-01

    This paper examines the discourse of oncologists treating cancer in a provincial capital of southern Mexico. Based on an analysis of both formal interviews and observations of everyday clinical practice, it examines a set of narrative themes they used to maintain a sense of professionalism and possibility as they endeavored to apply a highly technologically dependent biomedical model in a resource-poor context. They moved between coexisting narrative frameworks as they addressed their formidable problems of translating between theory and practice. In a biomedical narrative frame, they drew on biomedical theory to produce a model of cellular dysfunction and of clinical intervention. However, limited availability of diagnostic and treatment techniques and patients' inability or unwillingness to comply presented serious constraints to the application of this model. They used a practical narrative frame to discuss the socio-economic issues they understood to be underlying these limitations to their clinical practice. They did not experience the incongruity between theory and practice as a continual challenge to their biomedical model, nor to their professional competency. Instead, through a reconciling narrative frame, they mediated this conflict. In this frame, they drew on culturally specific concepts of moral rightness and order to produce accounts that minimized the problem, exculpated themselves and cast blame for failed diagnosis and treatment. By invoking these multiple, coexisting narrative themes, the oncologists sustained an open vision of their work in which deficiencies and impotency were vindicated, and did not stand in the way of clinical practice. PMID:8184335

  3. Ethnographic Analysis of Instructional Method.

    ERIC Educational Resources Information Center

    Brooks, Douglas M.

    1980-01-01

    Instructional methods are operational exchanges between participants within environments that attempt to produce a learning outcome. The classroom teacher's ability to produce a learning outcome is the measure of instructional competence within that learning method. (JN)

  4. Methods of Building Cost Analysis.

    ERIC Educational Resources Information Center

    Building Research Inst., Inc., Washington, DC.

    Presentation of symposium papers includes--(1) a study describing techniques for economic analysis of building designs, (2) three case studies of analysis techniques, (3) procedures for measuring the area and volume of buildings, and (4) an open forum discussion. Case studies evaluate--(1) the thermal economics of building enclosures, (2) an…

  5. Methods of stability analysis in nonlinear mechanics

    SciTech Connect

    Warnock, R.L.; Ruth, R.D.; Gabella, W.; Ecklund, K.

    1989-01-01

    We review our recent work on methods to study stability in nonlinear mechanics, especially for the problems of particle accelerators, and compare our ideas to those of other authors. We emphasize methods that (1) show promise as practical design tools, (2) are effective when the nonlinearity is large, and (3) have a strong theoretical basis. 24 refs., 2 figs., 2 tabs.

  6. Adapting the Six Category Intervention Analysis To Promote Facilitative Type Supervisory Feedback in Teaching Practice.

    ERIC Educational Resources Information Center

    Hamid, Bahiyah Abdul; Azman, Hazita

    A discussion of the supervision of preservice language teacher trainees focuses on supervisory methods designed to facilitate clear, useful, enabling feedback to the trainee. Specifically, it looks at the use of the Six Category Intervention Analysis, a model for interpersonal skills training, for supervision of teaching practice. The model is seen here…

  7. A Meta-Analysis of Published School Social Work Practice Studies: 1980-2007

    ERIC Educational Resources Information Center

    Franklin, Cynthia; Kim, Johnny S.; Tripodi, Stephen J.

    2009-01-01

    Objective: This systematic review examined the effectiveness of school social work practices using meta-analytic techniques. Method: Hierarchical linear modeling software was used to calculate overall effect size estimates as well as test for between-study variability. Results: A total of 21 studies were included in the final analysis.…
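    For readers unfamiliar with the pooling step such a review performs, the sketch below illustrates a generic random-effects meta-analysis (DerSimonian-Laird) in Python; the effect sizes and variances are hypothetical, and this is not the HLM software or the data used in the study.

        import numpy as np

        # Hypothetical per-study effect sizes (d) and variances (illustrative
        # values only, not the studies from the Franklin et al. review).
        d = np.array([0.42, 0.15, 0.60, 0.33, 0.25])
        v = np.array([0.04, 0.02, 0.09, 0.03, 0.05])

        # Fixed-effect weights and Q statistic for between-study variability.
        w = 1.0 / v
        d_fixed = np.sum(w * d) / np.sum(w)
        Q = np.sum(w * (d - d_fixed) ** 2)

        # DerSimonian-Laird estimate of between-study variance (tau^2).
        k = len(d)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - (k - 1)) / c)

        # Random-effects pooled estimate and 95% confidence interval.
        w_star = 1.0 / (v + tau2)
        d_pooled = np.sum(w_star * d) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        print(f"Q = {Q:.2f}, tau^2 = {tau2:.3f}")
        print(f"pooled d = {d_pooled:.3f} "
              f"(95% CI {d_pooled - 1.96 * se:.3f} to {d_pooled + 1.96 * se:.3f})")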

  8. Adapting Job Analysis Methodology to Improve Evaluation Practice

    ERIC Educational Resources Information Center

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  9. Content Analysis as a Best Practice in Technical Communication Research

    ERIC Educational Resources Information Center

    Thayer, Alexander; Evans, Mary; McBride, Alicia; Queen, Matt; Spyridakis, Jan

    2007-01-01

    Content analysis is a powerful empirical method for analyzing text, a method that technical communicators can use on the job and in their research. Content analysis can expose hidden connections among concepts, reveal relationships among ideas that initially seem unconnected, and inform the decision-making processes associated with many technical…
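    As a minimal illustration of the quantitative side of content analysis, the sketch below counts occurrences of hypothetical coding-category terms across a few short documents; the categories, term lists and texts are invented for illustration only.

        from collections import Counter
        import re

        # Hypothetical coding scheme: each category maps to indicator terms.
        CATEGORIES = {
            "usability": {"usable", "usability", "intuitive", "confusing"},
            "documentation": {"manual", "documentation", "help", "tutorial"},
            "errors": {"error", "crash", "bug", "failure"},
        }

        documents = [
            "The manual was confusing and the help pages were out of date.",
            "Users reported a crash and several error messages during setup.",
            "The interface is intuitive, but the tutorial skips key steps.",
        ]

        # Tokenize each document and tally hits per coding category.
        counts = Counter()
        for doc in documents:
            tokens = re.findall(r"[a-z]+", doc.lower())
            for category, terms in CATEGORIES.items():
                counts[category] += sum(token in terms for token in tokens)

        for category, n in counts.most_common():
            print(f"{category}: {n}")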

  10. Prognostic Analysis System and Methods of Operation

    NASA Technical Reports Server (NTRS)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical system health applicable to mechanical, electrical, chemical and optical systems and methods of operating the system are described herein.

  11. Coal Field Fire Fighting - Practiced methods, strategies and tactics

    NASA Astrophysics Data System (ADS)

    Wündrich, T.; Korten, A. A.; Barth, U. H.

    2009-04-01

    achieved. For effective and efficient fire fighting, optimal tactics are required; they can be divided into four fundamental tactics to control fire hazards: - Defense (digging away the coal so that it cannot begin to burn, or forming a barrier so that the fire cannot reach the unburned coal), - Rescue the coal (mining coal from a seam that is not yet burning), - Attack (active and direct cooling of the burning seam), - Retreat (monitoring only, until self-extinction of a burning seam). The last is used when a fire exceeds the organizational and/or technical scope of a mission. In other words, "to control a coal fire" does not automatically and in all situations mean "to extinguish a coal fire". Best-practice tactics, or a combination of them, can be selected for control of a particular coal fire. For the extinguishing work, different extinguishing agents are available. They can be applied with different application techniques and varying operating expenses. One application method may be the drilling of boreholes from the surface or covering the surface with low-permeability soils. The most commonly used extinguishing agents for coal field fires are as follows: water (with or without additives), slurry, foaming mud/slurry, inert gases, dry chemicals and materials, and cryogenic agents. Because of its tremendous scale and complexity, the worldwide challenge of coal fires is unique; it can only be met with suitable application methods, well-fitted strategies and tactics, organisation and research, as well as the dedication of the fire fighters involved, who work under extreme personal risk on burning coal fields.

  12. Convergence analysis of combinations of different methods

    SciTech Connect

    Kang, Y.

    1994-12-31

    This paper provides a convergence analysis for combinations of different numerical methods for solving systems of differential equations. The author proves that combinations of two convergent linear multistep methods or Runge-Kutta methods produce a new convergent method whose order equals the smaller of the two original methods' orders.
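    A small numerical illustration of the stated result is sketched below: alternating a first-order method (forward Euler) with a second-order method (Heun) on y' = -y and estimating the observed convergence order, which comes out near the smaller order, 1. This is an independent demonstration, not the author's proof.

        import numpy as np

        def euler_step(f, t, y, h):
            return y + h * f(t, y)               # order 1

        def heun_step(f, t, y, h):
            k1 = f(t, y)
            k2 = f(t + h, y + h * k1)
            return y + 0.5 * h * (k1 + k2)       # order 2

        def combined_solve(f, y0, t_end, n):
            # Alternate the two methods step by step: the combination should
            # converge with the smaller of the two orders (here, order 1).
            h, t, y = t_end / n, 0.0, y0
            for i in range(n):
                step = euler_step if i % 2 == 0 else heun_step
                y = step(f, t, y, h)
                t += h
            return y

        f = lambda t, y: -y                      # exact solution: exp(-t)
        exact = np.exp(-1.0)
        ns = [40, 80, 160, 320, 640]
        errors = [abs(combined_solve(f, 1.0, 1.0, n) - exact) for n in ns]

        orders = np.log2(np.array(errors[:-1]) / np.array(errors[1:]))
        print("observed orders:", np.round(orders, 2))  # expect values near 1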

  13. Practical Aspects of the Equation-Error Method for Aircraft Parameter Estimation

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2006-01-01

    Various practical aspects of the equation-error approach to aircraft parameter estimation were examined. The analysis was based on simulated flight data from an F-16 nonlinear simulation, with realistic noise sequences added to the computed aircraft responses. This approach exposes issues related to the parameter estimation techniques and results, because the true parameter values are known for simulation data. The issues studied include differentiating noisy time series, maximum likelihood parameter estimation, biases in equation-error parameter estimates, accurate computation of estimated parameter error bounds, comparisons of equation-error parameter estimates with output-error parameter estimates, analyzing data from multiple maneuvers, data collinearity, and frequency-domain methods.
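    The sketch below illustrates the basic equation-error idea on a toy first-order system rather than the F-16 simulation: the measured state is differentiated numerically (one of the noise-sensitive steps discussed in the paper) and the parameters are recovered by ordinary least squares. All values are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)

        # Simulate a toy first-order system x_dot = a*x + b*u (true a=-2, b=1),
        # standing in for the aircraft equations of motion.
        a_true, b_true, dt = -2.0, 1.0, 0.01
        t = np.arange(0.0, 10.0, dt)
        u = np.sin(0.5 * t)
        x = np.zeros_like(t)
        for k in range(len(t) - 1):
            x[k + 1] = x[k] + dt * (a_true * x[k] + b_true * u[k])

        # Add measurement noise, then differentiate the noisy signal numerically.
        x_meas = x + 0.001 * rng.standard_normal(len(t))
        x_dot = np.gradient(x_meas, dt)

        # Equation-error estimate: ordinary least squares of x_dot on [x, u].
        X = np.column_stack([x_meas, u])
        theta, *_ = np.linalg.lstsq(X, x_dot, rcond=None)
        print(f"estimated a = {theta[0]:.3f}, b = {theta[1]:.3f} "
              f"(true {a_true}, {b_true})")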

  14. Efficient methods and practical guidelines for simulating isotope effects.

    PubMed

    Ceriotti, Michele; Markland, Thomas E

    2013-01-01

    The shift in chemical equilibria due to isotope substitution is frequently exploited to obtain insight into a wide variety of chemical and physical processes. It is a purely quantum mechanical effect, which can be computed exactly using simulations based on the path integral formalism. Here we discuss how these techniques can be made dramatically more efficient, and how they ultimately outperform quasi-harmonic approximations to treat quantum liquids not only in terms of accuracy, but also in terms of computational cost. To achieve this goal we introduce path integral quantum mechanics estimators based on free energy perturbation, which enable the evaluation of isotope effects using only a single path integral molecular dynamics trajectory of the naturally abundant isotope. We use as an example the calculation of the free energy change associated with H/D and (16)O/(18)O substitutions in liquid water, and of the fractionation of those isotopes between the liquid and the vapor phase. In doing so, we demonstrate and discuss quantitatively the relative benefits of each approach, thereby providing a set of guidelines that should facilitate the choice of the most appropriate method in different, commonly encountered scenarios. The efficiency of the estimators we introduce and the analysis that we perform should in particular facilitate accurate ab initio calculation of isotope effects in condensed phase systems. PMID:23298033
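    The paper's path-integral estimators are not reproduced here, but the underlying free energy perturbation identity, dF = -kT ln <exp(-dU/kT)>_0, can be illustrated with a toy classical harmonic system where the exact answer is known, as in the hedged sketch below.

        import numpy as np

        rng = np.random.default_rng(1)
        kT = 1.0
        k0, k1 = 1.0, 2.0            # "light" and "heavy" harmonic force constants

        # Sample configurations from the Boltzmann distribution of
        # U0(x) = k0 x^2 / 2 (a Gaussian with variance kT/k0), mimicking
        # a single trajectory of the reference system.
        x = rng.normal(scale=np.sqrt(kT / k0), size=200_000)

        # Free energy perturbation: dF = -kT ln < exp(-(U1 - U0)/kT) >_0
        dU = 0.5 * (k1 - k0) * x ** 2
        dF_fep = -kT * np.log(np.mean(np.exp(-dU / kT)))

        # Analytic result for harmonic oscillators: dF = (kT/2) ln(k1/k0)
        dF_exact = 0.5 * kT * np.log(k1 / k0)
        print(f"FEP estimate {dF_fep:.4f} vs exact {dF_exact:.4f}")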

  15. Convex geometry analysis method of hyperspectral data

    NASA Astrophysics Data System (ADS)

    Gong, Yanjun; Wang, XiChang; Qi, Hongxing; Yu, BingXi

    2003-06-01

    We present a matrix expression of the convex geometry analysis method for hyperspectral data based on the linear mixing model and establish a mathematical model of endmembers. A 30-band remote sensing image is used to test the model. The results of the analysis reveal that the method can resolve mixed-pixel problems. Targets smaller than a single ground-surface pixel can be identified by applying the method.
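    As a minimal sketch of the linear mixing model the method relies on, the example below unmixes a synthetic pixel against hypothetical endmember spectra using non-negative least squares; the convex-geometry endmember extraction itself is not reproduced.

        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(2)

        # Hypothetical endmember spectra (3 endmembers, 30 bands), standing in
        # for the endmembers a convex-geometry analysis would extract.
        n_bands, n_end = 30, 3
        E = rng.uniform(0.1, 1.0, size=(n_bands, n_end))

        # Build a mixed pixel: y = E @ a + noise, abundances non-negative.
        a_true = np.array([0.6, 0.3, 0.1])
        y = E @ a_true + 0.005 * rng.standard_normal(n_bands)

        # Estimate abundances with non-negative least squares, then renormalize
        # to enforce the sum-to-one constraint approximately.
        a_hat, _ = nnls(E, y)
        a_hat /= a_hat.sum()
        print("true  abundances:", a_true)
        print("estimated       :", np.round(a_hat, 3))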

  16. Measurement Practices: Methods for Developing Content-Valid Student Examinations.

    ERIC Educational Resources Information Center

    Bridge, Patrick D.; Musial, Joseph; Frank, Robert; Roe, Thomas; Sawilowsky, Shlomo

    2003-01-01

    Reviews the fundamental principles associated with achieving a high level of content validity when developing tests for students. Suggests that the short-term efforts necessary to develop and integrate measurement theory into practice will lead to long-term gains for students, faculty, and academic institutions. (Includes 21 references.)…

  17. Parallel Processable Cryptographic Methods with Unbounded Practical Security.

    ERIC Educational Resources Information Center

    Rothstein, Jerome

    Addressing the problem of protecting confidential information and data stored in computer databases from access by unauthorized parties, this paper details coding schemes which present such astronomical work factors to potential code breakers that security breaches are hopeless in any practical sense. Two procedures which can be used to encode for…

  18. Learning by the Case Method: Practical Approaches for Community Leaders.

    ERIC Educational Resources Information Center

    Stenzel, Anne K.; Feeney, Helen M.

    This supplement to Volunteer Training and Development: A Manual for Community Groups, provides practical guidance in the selection, writing, and adaptation of effective case materials for specific educational objectives, and develops suitable cases for use by analyzing concrete situations and by offering illustrations of various types. An…

  19. Methods of DNA methylation analysis.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The purpose of this review was to provide guidance for investigators who are new to the field of DNA methylation analysis. Epigenetics is the study of mitotically heritable alterations in gene expression potential that are not mediated by changes in DNA sequence. Recently, it has become clear that n...

  20. Governance of professional nursing practice in a hospital setting: a mixed methods study

    PubMed Central

    dos Santos, José Luís Guedes; Erdmann, Alacoque Lorenzini

    2015-01-01

    Objective: to elaborate an interpretative model for the governance of professional nursing practice in a hospital setting. Method: a mixed methods study with concurrent triangulation strategy, using data from a cross-sectional study with 106 nurses and a Grounded Theory study with 63 participants. The quantitative data were collected through the Brazilian Nursing Work Index - Revised and underwent descriptive statistical analysis. Qualitative data were obtained from interviews and analyzed through initial, selective and focused coding. Results: based on the results obtained with the Brazilian Nursing Work Index - Revised, it is possible to state that nurses perceived that they had autonomy, control over the environment, good relationships with physicians and organizational support for nursing governance. The governance of the professional nursing practice is based on the management of nursing care and services carried out by the nurses. To perform these tasks, nurses aim to get around the constraints of the organizational support and develop management knowledge and skills. Conclusion: it is important to reorganize the structures and processes of nursing governance, especially the support provided by the organization for the management practices of nurses. PMID:26625992

  1. Hybrid methods for cybersecurity analysis

    SciTech Connect

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  2. Analysis methods for photovoltaic applications

    SciTech Connect

    1980-01-01

    Because photovoltaic power systems are being considered for an ever-widening range of applications, it is appropriate for system designers to have knowledge of and access to photovoltaic power systems simulation models and design tools. This brochure gives brief descriptions of a variety of such aids and was compiled after surveying both manufacturers and researchers. Services available through photovoltaic module manufacturers are outlined, and computer codes for systems analysis are briefly described. (WHK)

  3. Practical design methods for barrier pillars. Information circular/1995

    SciTech Connect

    Koehler, J.R.; Tadolini, S.C.

    1995-11-01

    Effective barrier pillar design is essential for safe and productive underground coal mining. This U.S. Bureau of Mines report presents an overview of available barrier pillar design methodologies that incorporate sound engineering principles while remaining practical for everyday usage. Nomographs and examples are presented to assist in the determination of proper barrier pillar sizing. Additionally, performance evaluation techniques and criteria are included to assist in determining the effectiveness of selected barrier pillar configurations.

  4. A Monte Carlo method for combined segregation and linkage analysis.

    PubMed Central

    Guo, S W; Thompson, E A

    1992-01-01

    We introduce a Monte Carlo approach to combined segregation and linkage analysis of a quantitative trait observed in an extended pedigree. In conjunction with the Monte Carlo method of likelihood-ratio evaluation proposed by Thompson and Guo, the method provides for estimation and hypothesis testing. The greatest attraction of this approach is its ability to handle complex genetic models and large pedigrees. Two examples illustrate the practicality of the method. One is of simulated data on a large pedigree; the other is a reanalysis of published data previously analyzed by other methods. PMID:1415253

  5. A Mixed-Method Approach to Investigating the Adoption of Evidence-Based Pain Practices in Nursing Homes

    PubMed Central

    Ersek, Mary; Jablonski, Anita

    2014-01-01

    This mixed methods study examined perceived facilitators and obstacles to adopting evidence-based pain management protocols vis-a-vis documented practice changes that were measured using a chart audit tool. This analysis used data from a subgroup of four nursing homes that participated in a clinical trial. Focus group interviews with staff yielded qualitative data about perceived factors that affected their willingness and ability to use the protocols. Chart audits determined whether pain assessment and management practices changed over time in light of these reported facilitators and barriers. Reported facilitators included administrative support, staff consistency, and policy and procedure changes. Barriers were staff attitudes, regulatory issues, and provider mistrust of nurses’ judgment. Overall, staff reported improvements in pain practices. These reports were corroborated by modest but significant increases in adherence to recommended practices. Change in clinical practice is complex and requires attention to both structural and process aspects of care. PMID:24640959

  6. A mixed-methods approach to investigating the adoption of evidence-based pain practices in nursing homes.

    PubMed

    Ersek, Mary; Jablonski, Anita

    2014-07-01

    This mixed methods study examined perceived facilitators and obstacles to adopting evidence-based pain management protocols vis-a-vis documented practice changes that were measured using a chart audit tool. This analysis used data from a subgroup of four nursing homes that participated in a clinical trial. Focus group interviews with staff yielded qualitative data about perceived factors that affected their willingness and ability to use the protocols. Chart audits determined whether pain assessment and management practices changed over time in light of these reported facilitators and barriers. Reported facilitators included administrative support, staff consistency, and policy and procedure changes. Barriers were staff attitudes, regulatory issues, and provider mistrust of nurses' judgment. Overall, staff reported improvements in pain practices. These reports were corroborated by modest but significant increases in adherence to recommended practices. Change in clinical practice is complex and requires attention to both structural and process aspects of care. PMID:24640959

  7. Microparticle analysis system and method

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    2007-01-01

    A device for analyzing microparticles is provided which includes a chamber with an inlet and an outlet for respectively introducing and dispensing a flowing fluid comprising microparticles, a light source for providing light through the chamber and a photometer for measuring the intensity of light transmitted through individual microparticles. The device further includes an imaging system for acquiring images of the fluid. In some cases, the device may be configured to identify and determine a quantity of the microparticles within the fluid. Consequently, a method for identifying and tracking microparticles in motion is contemplated herein. The method involves flowing a fluid comprising microparticles in laminar motion through a chamber, transmitting light through the fluid, measuring the intensities of the light transmitted through the microparticles, imaging the fluid a plurality of times and comparing at least some of the intensities of light between different images of the fluid.

  8. Current status of methods for shielding analysis

    SciTech Connect

    Engle, W.W.

    1980-01-01

    Current methods used in shielding analysis and recent improvements in those methods are discussed. The status of methods development is discussed based on needs cited at the 1977 International Conference on Reactor Shielding. Additional areas where methods development is needed are discussed.

  9. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  10. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2012-05-29

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  11. Comparison of Manual Versus Automated Data Collection Method for an Evidence-Based Nursing Practice Study

    PubMed Central

    Byrne, M.D.; Jordan, T.R.; Welle, T.

    2013-01-01

    Objective The objective of this study was to investigate and improve the use of automated data collection procedures for nursing research and quality assurance. Methods A descriptive, correlational study analyzed 44 orthopedic surgical patients who were part of an evidence-based practice (EBP) project examining post-operative oxygen therapy at a Midwestern hospital. The automation work attempted to replicate a manually-collected data set from the EBP project. Results Automation was successful in replicating data collection for study data elements that were available in the clinical data repository. The automation procedures identified 32 “false negative” patients who met the inclusion criteria described in the EBP project but were not selected during the manual data collection. Automating data collection for certain data elements, such as oxygen saturation, proved challenging because of workflow and practice variations and the reliance on disparate sources for data abstraction. Automation also revealed instances of human error including computational and transcription errors as well as incomplete selection of eligible patients. Conclusion Automated data collection for analysis of nursing-specific phenomena is potentially superior to manual data collection methods. Creation of automated reports and analysis may require initial up-front investment with collaboration between clinicians, researchers and information technology specialists who can manage the ambiguities and challenges of research and quality assurance work in healthcare. PMID:23650488

  12. [Dental hygiene indices for dental practice (methods and experiences)].

    PubMed

    Hiltbold, B

    1976-10-01

    An oral hygiene recording sheet for clinical practices with a dental hygienist is described. The recording sheets allow an easy and clear survey of the present oral hygiene status as well as progress or negligence in the performance of oral hygiene procedures. Several oral hygiene indices are described, three of which are recommended for routine examinations: The plaque index of Silness/Löe, the sulcus bleeding index of Mühlemann/Son and the calculus surface index of Ennever et al. The experience of a 3-year use of the oral hygiene recording sheets is described. PMID:1070805

  13. A Practical Guide to Interpretation of Large Collections of Incident Narratives Using the QUORUM Method

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W.

    1997-01-01

    Analysis of incident reports plays an important role in aviation safety. Typically, a narrative description, written by a participant, is a central part of an incident report. Because there are so many reports, and the narratives contain so much detail, it can be difficult to efficiently and effectively recognize patterns among them. Recognizing and addressing recurring problems, however, is vital to continuing safety in commercial aviation operations. A practical way to interpret large collections of incident narratives is to apply the QUORUM method of text analysis, modeling, and relevance ranking. In this paper, QUORUM text analysis and modeling are surveyed, and QUORUM relevance ranking is described in detail with many examples. The examples are based on several large collections of reports from the Aviation Safety Reporting System (ASRS) database, and a collection of news stories describing the disaster of TWA Flight 800, the Boeing 747 which exploded in mid-air and crashed near Long Island, New York, on July 17, 1996. Reader familiarity with this disaster should make the relevance-ranking examples more understandable. The ASRS examples illustrate the practical application of QUORUM relevance ranking.
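    The QUORUM model itself is not reproduced here; as a stand-in, the sketch below ranks a few hypothetical incident narratives against a query with TF-IDF and cosine similarity, which conveys the general idea of relevance ranking over a narrative collection.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        # Hypothetical incident narratives standing in for ASRS reports.
        narratives = [
            "Aircraft deviated from assigned altitude after an autopilot disconnect.",
            "Runway incursion occurred while taxiing in low visibility conditions.",
            "Crew declared a fuel emergency and diverted to the alternate airport.",
            "Altitude deviation followed a misread clearance during climb.",
        ]
        query = "altitude deviation during climb"

        # Vectorize the narratives and the query, then score by cosine similarity.
        vectorizer = TfidfVectorizer(stop_words="english")
        doc_vectors = vectorizer.fit_transform(narratives)
        query_vector = vectorizer.transform([query])

        scores = cosine_similarity(query_vector, doc_vectors).ravel()
        for score, text in sorted(zip(scores, narratives), reverse=True):
            print(f"{score:.3f}  {text}")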

  14. Analysis of swimming performance: perceptions and practices of US-based swimming coaches.

    PubMed

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid

    2016-06-01

    In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using these methods at least monthly, with analyses being mainly qualitative in nature rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors such as sources of information on swimming performance and analysis and control over service provision are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies. PMID:26359951

  15. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures

    PubMed Central

    Rock, Adam J.; Coventry, William L.; Morgan, Methuen I.; Loi, Natasha M.

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology. PMID:27014147

  16. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures.

    PubMed

    Rock, Adam J; Coventry, William L; Morgan, Methuen I; Loi, Natasha M

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology. PMID:27014147

  17. Canonical Correlation Analysis: An Explanation with Comments on Correct Practice.

    ERIC Educational Resources Information Center

    Thompson, Bruce

    This paper briefly explains the logic underlying the basic calculations employed in canonical correlation analysis. A small hypothetical data set is employed to illustrate that canonical correlation analysis subsumes both univariate and multivariate parametric methods. Several real data sets are employed to illustrate other themes. Three common…
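    A small numeric illustration of canonical correlation analysis is sketched below, using scikit-learn and synthetic data in which the two variable sets share one latent factor; it is not the hypothetical data set used in the paper.

        import numpy as np
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(3)
        n = 500

        # Two hypothetical variable sets that share one latent factor.
        latent = rng.standard_normal(n)
        X = np.column_stack([latent + 0.5 * rng.standard_normal(n),
                             rng.standard_normal(n)])
        Y = np.column_stack([latent + 0.5 * rng.standard_normal(n),
                             rng.standard_normal(n)])

        cca = CCA(n_components=2)
        Xc, Yc = cca.fit_transform(X, Y)

        # Canonical correlations: correlations between paired canonical variates.
        r = [np.corrcoef(Xc[:, i], Yc[:, i])[0, 1] for i in range(2)]
        print("canonical correlations:", np.round(r, 3))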

  18. Schwarz Preconditioners for Krylov Methods: Theory and Practice

    SciTech Connect

    Szyld, Daniel B.

    2013-05-10

    Several numerical methods were produced and analyzed. The main thrust of the work relates to inexact Krylov subspace methods for the solution of linear systems of equations arising from the discretization of partial differential equations. These are iterative methods, i.e., an approximation is obtained and improved at each step. Usually, a matrix-vector product is needed at each iteration. In the inexact methods, this product (or the application of a preconditioner) can be done inexactly. Schwarz methods, based on domain decompositions, are excellent preconditioners for these systems. We contributed towards their understanding from an algebraic point of view, developed new ones, and studied their performance in the inexact setting. We also worked on combinatorial problems to help define the algebraic partition of the domains, with the needed overlap, as well as PDE-constrained optimization using the above-mentioned inexact Krylov subspace methods.
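    The report's algorithms are not reproduced here, but the following minimal sketch shows the kind of preconditioner involved: a one-level additive Schwarz preconditioner with overlapping blocks, applied to a 1D Poisson system and passed to GMRES. The block size, overlap and model problem are illustrative choices; a two-level method with a coarse correction would typically scale better but is omitted for brevity.

        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        # 1D Poisson model problem: A x = b with A = tridiag(-1, 2, -1).
        n = 200
        A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
        A_dense = A.toarray()        # small enough to extract dense local blocks
        b = np.ones(n)

        # One-level additive Schwarz: overlapping index blocks with local solves.
        block, overlap = 25, 5
        subdomains = []
        for start in range(0, n, block):
            idx = np.arange(max(0, start - overlap), min(n, start + block + overlap))
            subdomains.append(idx)
        local_inv = [np.linalg.inv(A_dense[np.ix_(idx, idx)]) for idx in subdomains]

        def apply_schwarz(r):
            # M^{-1} r = sum_i R_i^T (A_i)^{-1} R_i r  (restrict, solve, prolong).
            z = np.zeros_like(r)
            for idx, Ai_inv in zip(subdomains, local_inv):
                z[idx] += Ai_inv @ r[idx]
            return z

        M = spla.LinearOperator((n, n), matvec=apply_schwarz)

        class IterCounter:
            def __init__(self):
                self.n = 0
            def __call__(self, _):
                self.n += 1

        for label, prec in (("unpreconditioned GMRES", None), ("additive Schwarz GMRES", M)):
            counter = IterCounter()
            x, info = spla.gmres(A, b, M=prec, restart=30, maxiter=500, callback=counter)
            res = np.linalg.norm(b - A @ x)
            print(f"{label:24s} callback calls = {counter.n:5d}  residual = {res:.2e}")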

  19. The Qualitative Method of Impact Analysis.

    ERIC Educational Resources Information Center

    Mohr, Lawrence B.

    1999-01-01

    Discusses qualitative methods of impact analysis and provides an introductory treatment of one such approach. Combines an awareness of an alternative causal epistemology with current knowledge of qualitative methods of data collection and measurement to produce an approach to the analysis of impacts. (SLD)

  20. Practical methods for detecting mendacity: a case study.

    PubMed

    Hirsch, A R; Wolf, C J

    2001-01-01

    This study demonstrates the concurrence of the use of objective verbal and nonverbal signs and lying. President Clinton's Grand jury Testimony of August 17, 1998, was examined for the presence of 23 clinically practical signs of dissimulation selected from 64 peer-reviewed articles and 20 books on mendacity. A segment of his testimony that was subsequently found to be false was compared with a control period during the same testimony (internal control). A fund-raising speech to a sympathetic crowd served as a second control (external control). The frequencies of the 23 signs in the mendacious speech were compared with their frequencies during the control periods, and the differences were analyzed for statistical significance. No clinical examination was performed nor diagnosis assigned. During the mendacious speech, the subject markedly increased the frequency of 20 out of 23 signs compared with their frequency during the fund-raising control speech (p < .0005). He increased the frequency of 19 signs compared with their frequency during the control period of the same testimony (p < .003). The 23 signs may be useful as indicators of the veracity of videotaped and scripted testimony. If these findings are confirmed through further testing, they could, with practice, be used by psychiatrists conducting interviews. PMID:11785615

  1. Vibration analysis methods for piping

    NASA Astrophysics Data System (ADS)

    Gibert, R. J.

    1981-09-01

    Attention is given to flow vibrations in pipe flow induced by singularity points in the piping system. The types of pressure fluctuations induced by flow singularities are examined, including the intense wideband fluctuations immediately downstream of the singularity and the acoustic fluctuations encountered in the remainder of the circuit, and a theory of noise generation by unsteady flow in internal acoustics is developed. The response of the piping systems to the pressure fluctuations thus generated is considered, and the calculation of the modal characteristics of piping containing a dense fluid in order to obtain the system transfer function is discussed. The TEDEL program, which calculates the vibratory response of a structure composed of straight and curved pipes with variable mechanical characteristics forming a three-dimensional network by a finite element method, is then presented, and calculations of fluid-structural coupling in tubular networks are illustrated.

  2. Estimating free-living human energy expenditure: Practical aspects of the doubly labeled water method and its applications

    PubMed Central

    Ishikawa-Takata, Kazuko; Kim, Eunkyung; Kim, Jeonghyun; Yoon, Jinsook

    2014-01-01

    The accuracy and noninvasive nature of the doubly labeled water (DLW) method makes it ideal for the study of human energy metabolism in free-living conditions. However, the DLW method is not always practical in many developing and Asian countries because of the high costs of isotopes and equipment for isotope analysis as well as the expertise required for analysis. This review provides information about the theoretical background and practical aspects of the DLW method, including optimal dose, basic protocols of two- and multiple-point approaches, experimental procedures, and isotopic analysis. We also introduce applications of DLW data, such as determining the equations of estimated energy requirement and validation studies of energy intake. PMID:24944767
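    A much-simplified two-point calculation is sketched below, ignoring isotope fractionation corrections and isotope-specific dilution spaces; the enrichment values, body-water pool and respiratory quotient are hypothetical, and the energy conversion uses a Weir-type relation.

        import numpy as np

        # Hypothetical two-point doubly labeled water data (enrichments above
        # baseline, arbitrary units). Values are illustrative only.
        days = 14.0
        o18_start, o18_end = 100.0, 20.0    # 18-O enrichment at dose and at day 14
        d_start, d_end = 100.0, 26.5        # 2-H (deuterium) enrichment

        # Elimination rate constants from the two-point log-linear decline.
        k_o = np.log(o18_start / o18_end) / days     # 1/day
        k_d = np.log(d_start / d_end) / days         # 1/day

        # Simplified Lifson-McClintock relation (no fractionation corrections,
        # single body-water pool): deuterium leaves only as water, 18-O leaves
        # as water and CO2, and each CO2 molecule carries two oxygen atoms.
        tbw_mol = 2000.0                              # total body water, mol (~36 L)
        rco2_mol = 0.5 * tbw_mol * (k_o - k_d)        # mol CO2 per day
        rco2_l = rco2_mol * 22.4                      # litres CO2 per day (STP)

        # Weir-type conversion to energy expenditure, assuming a food quotient of 0.85.
        rq = 0.85
        tee_kcal = rco2_l * (1.106 + 3.941 / rq)
        print(f"kO = {k_o:.4f}/d  kD = {k_d:.4f}/d  "
              f"rCO2 = {rco2_l:.0f} L/d  TEE = {tee_kcal:.0f} kcal/d")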

  3. An analysis of revenues and expenses in a hospital-based ambulatory pediatric practice.

    PubMed

    Berkelhamer, J E; Rojek, K J

    1988-05-01

    We developed a method of analyzing revenues and expenses in a hospital-based ambulatory pediatric practice. Results of an analysis of the Children's Medical Group (CMG) at the University of Chicago Medical Center demonstrate how changes in collection rates, practice expenses, and hospital underwriting contribute to the financial outcome of the practice. In this analysis, certain programmatic goals of the CMG are achieved at a level of just under 12,000 patient visits per year. At this activity level, pediatric residency program needs are met and income to the CMG physicians is maximized. An ethical problem from the physician's perspective is created by seeking profit maximization. To accomplish this end, the CMG physicians would have to restrict their personal services to only the better-paying patients. This study serves to underscore the importance of hospital-based physicians and hospital administrators structuring fiscal incentives for physicians that mutually meet the institutional goals for the hospital and its physicians. PMID:3358399

  4. [Pedagogical practices in nursing teaching: a study from the perspective of institutional analysis].

    PubMed

    Pereira, Wilza Rocha; Tavares, Cláudia Mara Melo

    2010-12-01

    The general objective of this study was to learn about the pedagogical practices that are already in use in nursing teaching in order to identify and analyze those that have brought changes and innovation. This field study used a qualitative and comparative approach, and the subjects were nursing professors and students. The data was collected through individual interviews and focal groups. Data analysis was based on the Institutional Analysis method. Several pedagogical practices were recognized, from the most traditional to those considered innovative, and it was noticed that changes are already present and are part of a set of elements caused by the obsolescence of values that are now considered to be insufficient or inappropriate by professors themselves. The study revealed that the activity of teaching and the qualification of the pedagogical practices are always desired by professors. PMID:21337793

  5. Probabilistic structural analysis by extremum methods

    NASA Technical Reports Server (NTRS)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.
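    The rigid-plastic frame formulations are not reproduced here, but the primal-dual pairing the abstract mentions can be illustrated with a small generic linear program and its dual, whose optimal objective values coincide (strong duality), as in the sketch below. In the limit-analysis setting, the static and kinematic formulations play the roles of this primal-dual pair.

        import numpy as np
        from scipy.optimize import linprog

        # Small generic LP standing in for the static (lower-bound) formulation:
        #   primal:  minimize c^T x  subject to  A x >= b,  x >= 0
        c = np.array([2.0, 3.0])
        A = np.array([[1.0, 2.0],
                      [3.0, 1.0]])
        b = np.array([4.0, 6.0])

        primal = linprog(c, A_ub=-A, b_ub=-b, bounds=[(0, None)] * 2, method="highs")

        #   dual:    maximize b^T y  subject to  A^T y <= c,  y >= 0
        dual = linprog(-b, A_ub=A.T, b_ub=c, bounds=[(0, None)] * 2, method="highs")

        # Strong duality: optimal primal and dual objectives coincide.
        print(f"primal optimum {primal.fun:.4f}, dual optimum {-dual.fun:.4f}")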

  6. Strength-based Supervision: Frameworks, Current Practice, and Future Directions A Wu-wei Method.

    ERIC Educational Resources Information Center

    Edwards, Jeffrey K.; Chen, Mei-Whei

    1999-01-01

    Discusses a method of counseling supervision similar to the wu-wei practice in Zen and Taoism. Suggests that this strength-based method and an understanding of isomorphy in supervisory relationships are the preferred practice for the supervision of family counselors. States that this model of supervision potentiates the person-of-the-counselor.…

  7. Researching into Teaching Methods in Colleges and Universities. Practical Research Series.

    ERIC Educational Resources Information Center

    Bennett, Clinton; And Others

    This practical guide is one of a series aimed at assisting academics in higher education in researching specific aspects of their work. Focusing on small-scale insider research in colleges and universities, the handbook covers contemporary issues, research methods, and existing practice and values in the area of teaching methods. Strategies for…

  8. Investigating the Efficacy of Practical Skill Teaching: A Pilot-Study Comparing Three Educational Methods

    ERIC Educational Resources Information Center

    Maloney, Stephen; Storr, Michael; Paynter, Sophie; Morgan, Prue; Ilic, Dragan

    2013-01-01

    Effective education of practical skills can alter clinician behaviour, positively influence patient outcomes, and reduce the risk of patient harm. This study compares the efficacy of two innovative practical skill teaching methods, against a traditional teaching method. Year three pre-clinical physiotherapy students consented to participate in a…

  9. Assessing Student Perception of Practice Evaluation Knowledge in Introductory Research Methods

    ERIC Educational Resources Information Center

    Baker, Lisa R.; Pollio, David E.; Hudson, Ashley

    2011-01-01

    The authors explored the use of the Practice Evaluation Knowledge Scale (PEKS) to assess student perception of acquisition and retention of practice evaluation knowledge from an undergraduate research methods class. The authors sampled 2 semesters of undergraduate social work students enrolled in an introductory research methods course.…

  10. Learning Practice-Based Research Methods: Capturing the Experiences of MSW Students

    ERIC Educational Resources Information Center

    Natland, Sidsel; Weissinger, Erika; Graaf, Genevieve; Carnochan, Sarah

    2016-01-01

    The literature on teaching research methods to social work students identifies many challenges, such as dealing with the tensions related to producing research relevant to practice, access to data to teach practice-based research, and limited student interest in learning research methods. This is an exploratory study of the learning experiences of…

  11. Text analysis methods, text analysis apparatuses, and articles of manufacture

    DOEpatents

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  12. Overhead analysis in a surgical practice: a brief communication.

    PubMed

    Frezza, Eldo E

    2006-08-01

    Evaluating overhead is an essential part of any business, including that of the surgeon. By examining each component of overhead, the surgeon will have a better grasp of the profitability of his or her practice. The overhead discussed in this article includes health insurance, overtime, supply costs, rent, advertising and marketing, telephone costs, and malpractice insurance. While the importance of evaluating and controlling overhead in a business is well understood, few know that overhead increases do not always imply increased expenses. National standards have been provided by the Medical Group Management Association. One method of evaluating overhead is to calculate the amount spent in terms of percent of net revenue. Net revenue includes income from patients, from interest, and from insurers less refunds. Another way for surgeons to evaluate their practice is to calculate income and expenses for two years, then calculate the variance between the two years and the percentage of variance to see where they stand. PMID:16968190
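    The arithmetic described above is straightforward; the sketch below works through it with hypothetical figures, expressing overhead as a percentage of net revenue and computing the year-over-year variance and percentage of variance.

        # Hypothetical figures for a small surgical practice (two years).
        practice = {
            "2004": {"net_revenue": 820_000, "overhead": 451_000},
            "2005": {"net_revenue": 905_000, "overhead": 515_850},
        }

        # Overhead as a percentage of net revenue for each year.
        for year, fig in practice.items():
            pct = 100 * fig["overhead"] / fig["net_revenue"]
            print(f"{year}: overhead = {pct:.1f}% of net revenue")

        # Year-over-year variance and percentage of variance for each line item.
        prior, current = practice["2004"], practice["2005"]
        for item in ("net_revenue", "overhead"):
            variance = current[item] - prior[item]
            pct_var = 100 * variance / prior[item]
            print(f"{item}: variance = {variance:+,} ({pct_var:+.1f}%)")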

  13. Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions

    NASA Astrophysics Data System (ADS)

    Xie, Jigang; Song, Wenyun

    The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example of risk analysis for Chinese listed firms’ mergers and acquisitions is provided to demonstrate the feasibility and practicality of the method.
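    The paper's M&A risk model is not specified in the abstract; as a generic illustration of the ME principle, the sketch below finds the maximum-entropy distribution over a discrete set of hypothetical loss levels subject only to a known mean, which takes the Gibbs form p_i proportional to exp(-lambda * x_i).

        import numpy as np
        from scipy.optimize import brentq

        # Discrete outcomes (e.g. hypothetical post-acquisition loss levels) and
        # the only available information: their expected value.
        x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
        target_mean = 1.2

        def mean_given_lambda(lam):
            # Maximum-entropy distribution with a mean constraint has Gibbs form.
            p = np.exp(-lam * x)
            p /= p.sum()
            return p @ x - target_mean

        # Solve for the multiplier lambda that reproduces the target mean.
        lam = brentq(mean_given_lambda, -50.0, 50.0)
        p = np.exp(-lam * x)
        p /= p.sum()

        entropy = -np.sum(p * np.log(p))
        print("lambda =", round(lam, 4))
        print("probabilities =", np.round(p, 4), " mean =", round(float(p @ x), 3))
        print("entropy =", round(entropy, 4))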

  14. Analysis of flight equipment purchasing practices of representative air carriers

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The process through which representative air carriers decide whether or not to purchase flight equipment was investigated, as well as their practices and policies in retiring surplus aircraft. An analysis of the flight equipment investment decision process in ten airlines shows that for the airline industry as a whole, the flight equipment investment decision is in a state of transition, from a wholly informal process in the earliest years to a much more organized and structured process in the future. Individual air carriers are in different stages with respect to the formality and sophistication associated with the flight equipment investment decision.

  15. On Practical Results of the Differential Power Analysis

    NASA Astrophysics Data System (ADS)

    Breier, Jakub; Kleja, Marcel

    2012-03-01

    This paper describes practical differential power analysis attacks. Successful and unsuccessful attack attempts are presented, together with a description of the attack methodology. It provides relevant information about oscilloscope settings, optimization possibilities and fundamental attack principles, which are important when realizing this type of attack. The attack was conducted on the PIC18F2420 microcontroller, using the AES cryptographic algorithm in ECB mode with a 128-bit key length. We used two implementations of this algorithm - one in the C programming language and one in assembler.
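    The sketch below illustrates the correlation-based key-recovery principle on synthetic traces with a toy 4-bit S-box; it is not the paper's PIC18F2420 measurement setup, and all leakage values are simulated.

        import numpy as np

        rng = np.random.default_rng(4)

        # Toy 4-bit S-box (an arbitrary permutation, used only for illustration).
        SBOX = np.array([0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
                         0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2])

        def hamming_weight(values):
            bits = np.unpackbits(values.astype(np.uint8)[:, None], axis=1)
            return bits.sum(axis=1)

        secret_key = 0xA
        n_traces = 2000
        plaintexts = rng.integers(0, 16, size=n_traces)

        # Synthetic "power traces": leakage = HW(Sbox(p ^ k)) plus Gaussian noise.
        leakage = hamming_weight(SBOX[plaintexts ^ secret_key]).astype(float)
        traces = leakage + 1.0 * rng.standard_normal(n_traces)

        # Correlation power analysis: for each key guess, correlate the hypothetical
        # leakage with the measured traces; the correct key maximizes |correlation|.
        scores = []
        for guess in range(16):
            hypothesis = hamming_weight(SBOX[plaintexts ^ guess]).astype(float)
            scores.append(abs(np.corrcoef(hypothesis, traces)[0, 1]))

        best = int(np.argmax(scores))
        print(f"recovered key nibble: {best:#x} (true {secret_key:#x})")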

  16. Finite-key analysis of a practical decoy-state high-dimensional quantum key distribution

    NASA Astrophysics Data System (ADS)

    Bao, Haize; Bao, Wansu; Wang, Yang; Zhou, Chun; Chen, Ruike

    2016-05-01

    Compared with two-level quantum key distribution (QKD), high-dimensional QKD enables two distant parties to share a secret key at a higher rate. We provide a finite-key security analysis for the recently proposed practical high-dimensional decoy-state QKD protocol based on time-energy entanglement. We employ two methods to estimate the statistical fluctuation of the postselection probability and give a tighter bound on the secure-key capacity. By numerical evaluation, we show the finite-key effect on the secure-key capacity in different conditions. Moreover, our approach could be used to optimize parameters in practical implementations of high-dimensional QKD.
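    The protocol's actual finite-key bounds are more involved, but the role of statistical fluctuation can be illustrated with a generic Hoeffding-type deviation bound on an observed postselection frequency, as in the hypothetical sketch below.

        import numpy as np

        def hoeffding_delta(n, eps):
            # Half-width of a Hoeffding confidence interval for a Bernoulli mean:
            # with probability at least 1 - 2*eps, |p_hat - p| <= delta.
            return np.sqrt(np.log(1.0 / eps) / (2.0 * n))

        # Observed postselection (sifting) statistics -- hypothetical numbers.
        n_signals = 10**6          # detected signals contributing to the estimate
        p_hat = 0.045              # observed relative frequency
        eps = 1e-10                # failure probability of the bound

        delta = hoeffding_delta(n_signals, eps)
        print(f"p_hat = {p_hat:.4f}, finite-sample interval "
              f"[{p_hat - delta:.4f}, {p_hat + delta:.4f}] at eps = {eps:g}")

        # The interval shrinks as 1/sqrt(n): more signals give tighter key-rate bounds.
        for n in (10**4, 10**5, 10**6, 10**7):
            print(f"n = {n:>8d}  delta = {hoeffding_delta(n, eps):.5f}")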

  17. Investigating the efficacy of practical skill teaching: a pilot-study comparing three educational methods.

    PubMed

    Maloney, Stephen; Storr, Michael; Paynter, Sophie; Morgan, Prue; Ilic, Dragan

    2013-03-01

    Effective education of practical skills can alter clinician behaviour, positively influence patient outcomes, and reduce the risk of patient harm. This study compares the efficacy of two innovative practical skill teaching methods against a traditional teaching method. Year three pre-clinical physiotherapy students consented to participate in a randomised controlled trial, with concealed allocation and blinded participants and outcome assessment. Each of the three randomly allocated groups was exposed to a different practical skills teaching method (traditional, pre-recorded video tutorial or student self-video) for two specific practical skills during the semester. Clinical performance was assessed using an objective structured clinical examination (OSCE). The students were also administered a questionnaire to gauge participants' level of satisfaction with the teaching method and their perceptions of the teaching method's educational value. There were no significant differences in clinical performance between the three practical skill teaching methods as measured in the OSCE, or for student ratings of satisfaction. A significant difference existed between the methods for the student ratings of perceived educational value, with the teaching approaches of pre-recorded video tutorial and student self-video being rated higher than 'traditional' live tutoring. Alternative teaching methods to traditional live tutoring can produce equivalent learning outcomes when applied to the practical skill development of undergraduate health professional students. The use of alternative practical skill teaching methods may allow for greater flexibility for both staff and infrastructure resource allocation. PMID:22354336

  18. Theory, Method and Practice of Neuroscientific Findings in Science Education

    ERIC Educational Resources Information Center

    Liu, Chia-Ju; Chiang, Wen-Wei

    2014-01-01

    This report provides an overview of neuroscience research that is applicable for science educators. It first offers a brief analysis of empirical studies in educational neuroscience literature, followed by six science concept learning constructs based on the whole brain theory: gaining an understanding of brain function; pattern recognition and…

  19. Aural Image in Practice: A Multicase Analysis of Instrumental Practice in Middle School Learners

    ERIC Educational Resources Information Center

    Oare, Steve

    2016-01-01

    This multiple case study examined six adolescent band students engaged in self-directed practice. The students' practice sessions were videotaped. Students provided verbal reports during their practice and again retrospectively while reviewing their video immediately after practice. Students were asked to discuss their choice of practice…

  20. Focus Group Method And Methodology: Current Practice And Recent Debate

    ERIC Educational Resources Information Center

    Parker, Andrew; Tritter, Jonathan

    2006-01-01

    This paper considers the contemporary use of focus groups as a method of data collection within qualitative research settings. The authors draw upon their own experiences of using focus groups in educational and "community" user-group environments in order to provide an overview of recent issues and debates surrounding the deployment of focus…

  1. Practice and Progression in Second Language Research Methods

    ERIC Educational Resources Information Center

    Mackey, Alison

    2014-01-01

    Since its inception, the field of second language research has utilized methods from a number of areas, including general linguistics, psychology, education, sociology, anthropology and, recently, neuroscience and corpus linguistics. As the questions and objectives expand, researchers are increasingly pushing methodological boundaries to gain a…

  2. Practical method of diffusion-welding steel plate in air

    NASA Technical Reports Server (NTRS)

    Holko, K. H.; Moore, T. J.

    1971-01-01

    Method is ideal for critical service requirements where parent metal properties are equaled in notch toughness, stress rupture and other characteristics. Welding technique variations may be used on a variety of materials, such as carbon steels, alloy steels, stainless steels, ceramics, and reactive and refractory materials.

  3. Case Methods as a Bridge between Standards and Classroom Practice.

    ERIC Educational Resources Information Center

    Shulman, Judith H.

    This paper examines the function of cases and case methods in teacher education and professional development, hypothesizing that educators and administrators can better make sense of educational standards and link them to their daily school and classroom lives if they can identify cases in which those standards are inherent. One National…

  4. Methods in Educational Research: From Theory to Practice

    ERIC Educational Resources Information Center

    Lodico, Marguerite G.; Spaulding Dean T.; Voegtle, Katherine H.

    2006-01-01

    Written for students, educators, and researchers, "Methods in Educational Research" offers a refreshing introduction to the principles of educational research. Designed for the real world of educational research, the book's approach focuses on the types of problems likely to be encountered in professional experiences. Reflecting the importance of…

  5. A report on the CCNA 2007 professional practice analysis.

    PubMed

    Muckle, Timothy J; Apatov, Nathaniel M; Plaus, Karen

    2009-06-01

    The purpose of this column is to present the results of the 2007 Professional Practice Analysis (PPA) of the field of nurse anesthesia, conducted by the Council on Certification of Nurse Anesthetists. The PPA used survey and rating scale methodologies to collect data regarding the relative emphasis of various aspects of the nurse anesthesia knowledge domain and competencies. A total of 3,805 survey responses were analyzed using the Rasch rating scale model, which aggregates and transforms ordinal (rating scale) responses into linear measures of relative importance and frequency. Summaries of respondent demographics and educational and professional background are provided, as well as descriptions of how the survey results are used to develop test specifications. The results of this analysis provide evidence for the content outline and test specifications (content percentages) and thus serve as a basis of content validation for the National Certification Examination. PMID:19645167

  6. Parenting Practices and Child Misbehavior: A Mixed Method Study of Italian Mothers and Children

    PubMed Central

    Bombi, Anna Silvia; Di Norcia, Anna; Di Giunta, Laura; Pastorelli, Concetta; Lansford, Jennifer E.

    2015-01-01

    Objective The present study uses a mixed qualitative and quantitative method to examine three main research questions: What are the practices that mothers report they use when trying to correct their children’s misbehaviors? Are there common patterns of these practices? Are the patterns that emerge related to children’s well-being? Design Italian mother-child dyads (N=103) participated in the study (when children were 8 years of age). At Time 1 (T1), mothers answered open-ended questions about discipline; in addition, measures of maternal physical discipline and rejection and child aggression were assessed in mothers and children at T1, one year later (T2), and two years later (T3). Results Mothers’ answers to open-ended questions about what they would do in three disciplinary situations were classified in six categories: physical or psychological punishment, control, mix of force and reasoning, reasoning, listening, and permissiveness. Cluster analysis yielded 3 clusters: Group 1, Induction (predominant use of reasoning and listening; 74%); Group 2, Punishment (punitive practices and no reasoning; 16%); Group 3, Mixed practices (combination of reasoning and punishment, as well as high control and no listening; 10%). Multiple-group latent growth curves of maternal physical discipline, maternal rejection, and child aggression were implemented to evaluate possible differences in the developmental trends from T1 to T3, as a function of cluster. Conclusions Qualitative data deepen understanding of parenting because they shed light on what parents think about themselves; their self-descriptions, in turn, help to identify ways of parenting that may have long-lasting consequences for children’s adjustment. PMID:26877716

  7. A deliberate practice approach to teaching phylogenetic analysis.

    PubMed

    Hobbs, F Collin; Johnson, Daniel J; Kearns, Katherine D

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

  8. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    PubMed Central

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or “one-shot,” in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

  9. Practical flight test method for determining reciprocating engine cooling requirements

    NASA Technical Reports Server (NTRS)

    Ward, D. T.; Miley, S. J.

    1984-01-01

    It is pointed out that efficient and effective cooling of air-cooled reciprocating aircraft engines is a continuing problem for the general aviation industry. Miley et al. (1981) have reported results of a study regarding the controlling variables for cooling and installation aerodynamics. The present investigation is concerned with experimental methods which were developed to determine cooling requirements of an instrumented prototype or production aircraft, taking into account a flight test procedure which has been refined and further verified with additional testing. It is shown that this test procedure represents a straightforward means of determining cooling requirements with minimal instrumentation. Attention is given to some background information, the development history of the NACA cooling correlation method, and the proposed modification of the NACA cooling correlation.

  10. Theories, methods, and practice on the National Atlases of China

    NASA Astrophysics Data System (ADS)

    Qi, Qingwen

    2007-06-01

    The history of national atlas compilation worldwide is summarized first, followed by China's achievements in editing the first and second editions of The National Atlases of China (NAC), which reflected, at multiple levels, China's development in science and technology, society and economy, resources and environment, and related fields from the 1950s to the 1980s. From these previous editions, systematic theories and methods are drawn together, including comprehensive and statistical mapping theory, design principles for electronic atlases, and the new methods and technologies involved in the NAC. The New Century Edition of the NAC is then outlined, including its orientation, technological system, volume arrangement, and the key scientific and technological problems to be resolved.

  11. Engaging Direct Care Providers in Improving Infection Prevention and Control Practices Using Participatory Visual Methods.

    PubMed

    Backman, Chantal; Bruce, Natalie; Marck, Patricia; Vanderloo, Saskia

    2016-01-01

    The purpose of this quality improvement project was to determine the feasibility of using provider-led participatory visual methods to scrutinize 4 hospital units' infection prevention and control practices. Methods included provider-led photo walkabouts, photo elicitation sessions, and postimprovement photo walkabouts. Nurses readily engaged in using the methods to examine and improve their units' practices and reorganize their work environment. PMID:26681499

  12. Articulating current service development practices: a qualitative analysis of eleven mental health projects

    PubMed Central

    2014-01-01

    Background: The utilisation of good design practices in the development of complex health services is essential to improving quality. Healthcare organisations, however, are often seriously out of step with modern design thinking and practice. As a starting point to encourage the uptake of good design practices, it is important to understand the context of their intended use. This study aims to do that by articulating current health service development practices. Methods: Eleven service development projects carried out in a large mental health service were investigated through in-depth interviews with six operation managers. The critical decision method in conjunction with diagrammatic elicitation was used to capture descriptions of these projects. Stage-gate design models were then formed to visually articulate, classify and characterise different service development practices. Results: Projects were grouped into three categories according to design process patterns: new service introduction and service integration; service improvement; service closure. Three common design stages (problem exploration, idea generation, and solution evaluation) were then compared across the design process patterns. Consistent across projects were a top-down, policy-driven approach to exploration, underexploited idea generation and implementation-based evaluation. Conclusions: This study provides insight into where and how good design practices can contribute to the improvement of current service development practices. Specifically, the following suggestions for future service development practices are made: genuine user needs analysis for exploration; divergent thinking and innovative culture for idea generation; and fail-safe evaluation prior to implementation. Better training for managers through partnership working with design experts and researchers could be beneficial. PMID:24438471

  13. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    DOEpatents

    Sanfilippo, Antonio P.; Cowell, Andrew J.; Gregory, Michelle L.; Baddeley, Robert L.; Paulson, Patrick R.; Tratz, Stephen C.; Hohimer, Ryan E.

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.

  14. Triangle area water supply monitoring project, October 1988 through September 2001, North Carolina -- description of the water-quality network, sampling and analysis methods, and quality-assurance practices

    USGS Publications Warehouse

    Oblinger, Carolyn J.

    2004-01-01

    The Triangle Area Water Supply Monitoring Project was initiated in October 1988 to provide long-term water-quality data for six area water-supply reservoirs and their tributaries. In addition, the project provides data that can be used to determine the effectiveness of large-scale changes in water-resource management practices, document differences in water quality among water-supply types (large multiuse reservoir, small reservoir, run-of-river), and provide tributary-loading and in-lake data for water-quality modeling of Falls and Jordan Lakes. By September 2001, the project had progressed in four phases and included as many as 34 sites (in 1991). Most sites were sampled and analyzed by the U.S. Geological Survey. Some sites were already a part of the North Carolina Division of Water Quality statewide ambient water-quality monitoring network and were sampled by the Division of Water Quality. The network has provided data on streamflow, physical properties, and concentrations of nutrients, major ions, metals, trace elements, chlorophyll, total organic carbon, suspended sediment, and selected synthetic organic compounds. Project quality-assurance activities include written procedures for sample collection, record management and archive, collection of field quality-control samples (blank samples and replicate samples), and monitoring the quality of field supplies. In addition to project quality-assurance activities, the quality of laboratory analyses was assessed through laboratory quality-assurance practices and an independent laboratory quality-control assessment provided by the U.S. Geological Survey Branch of Quality Systems through the Blind Inorganic Sample Project and the Organic Blind Sample Project.

  15. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  16. Practical method for diffusion welding of steel plate in air.

    NASA Technical Reports Server (NTRS)

    Moore, T. J.; Holko, K. H.

    1972-01-01

    Description of a simple and easily applied method of diffusion welding steel plate in air which does not require a vacuum furnace or hot press. The novel feature of the proposed welding method is that diffusion welds are made in air with deadweight loading. In addition, the use of an autogenous (self-generated) surface-cleaning principle (termed 'auto-vac cleaning') to reduce the effects of surface oxides that normally hinder diffusion welding is examined. A series of nine butt joints were diffusion welded in thick sections of AISI 1020 steel plate. Diffusion welds were attempted at three welding temperatures (1200, 1090, and 980 C) using a deadweight pressure of 34,500 N/sq m (5 psi) and a two-hour hold time at temperature. Auto-vac cleaning operations prior to welding were also studied for the same three temperatures. Results indicate that sound welds were produced at the two higher temperatures when the joints were previously fusion seal welded completely around the periphery. Also, auto-vac cleaning at 1200 C for 2-1/2 hours prior to diffusion welding was highly beneficial, particularly when subsequent welding was accomplished at 1090 C.

  17. Testing the quasi-absolute method in photon activation analysis

    SciTech Connect

    Sun, Z. J.; Wells, D.; Starovoitova, V.; Segebade, C.

    2013-04-19

    In photon activation analysis (PAA), relative methods are widely used because of their accuracy and precision. Absolute methods, which are conducted without any assistance from calibration materials, are seldom applied for the difficulty in obtaining photon flux in measurements. This research is an attempt to perform a new absolute approach in PAA - quasi-absolute method - by retrieving photon flux in the sample through Monte Carlo simulation. With simulated photon flux and database of experimental cross sections, it is possible to calculate the concentration of target elements in the sample directly. The QA/QC procedures to solidify the research are discussed in detail. Our results show that the accuracy of the method for certain elements is close to a useful level in practice. Furthermore, the future results from the quasi-absolute method can also serve as a validation technique for experimental data on cross sections. The quasi-absolute method looks promising.

  18. Laboratory theory and methods for sediment analysis

    USGS Publications Warehouse

    Guy, Harold P.

    1969-01-01

    The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help ensure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.
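
    The concentration determination discussed in this record reduces, in its simplest form, to the dry mass of sediment divided by the volume (or mass) of the water-sediment sample; the sketch below is a simplified illustration only and omits the dissolved-solids and other corrections applied in laboratory practice.

        def suspended_sediment_concentration(dry_sediment_mg, sample_volume_L):
            """Suspended-sediment concentration in mg/L from a filtration analysis.

            Simplified illustration; laboratory practice also corrects for
            dissolved-solids residue when the evaporation method is used.
            """
            return dry_sediment_mg / sample_volume_L

        # Example: 12.4 mg of dry sediment recovered from a 0.35 L sample
        print(round(suspended_sediment_concentration(12.4, 0.35), 1), "mg/L")   # 35.4 mg/L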

  19. Imaging laser analysis of building materials - practical examples

    SciTech Connect

    Wilsch, G.; Schaurich, D.; Wiggenhauser, H.

    2011-06-23

    Laser-induced breakdown spectroscopy (LIBS) is a supplement to and extension of standard chemical methods and SEM or Micro-RFA applications for the evaluation of building materials. As a laboratory method, LIBS is used to generate color-coded images representing the composition, the distribution of characteristic ions, and/or the ingress characteristics of damaging substances. To create a depth profile of element concentration, a core has to be taken and split along the core axis. LIBS has been proven able to detect all important elements in concrete, e.g., chlorine, sodium, or sulfur, which are responsible for certain degradation mechanisms, and also light elements such as lithium or hydrogen. Practical examples are given and a mobile system for on-site measurements is presented.

  20. Imaging Laser Analysis of Building Materials - Practical Examples

    NASA Astrophysics Data System (ADS)

    Wilsch, G.; Schaurich, D.; Wiggenhauser, H.

    2011-06-01

    Laser-induced breakdown spectroscopy (LIBS) is a supplement to and extension of standard chemical methods and SEM or Micro-RFA applications for the evaluation of building materials. As a laboratory method, LIBS is used to generate color-coded images representing the composition, the distribution of characteristic ions, and/or the ingress characteristics of damaging substances. To create a depth profile of element concentration, a core has to be taken and split along the core axis. LIBS has been proven able to detect all important elements in concrete, e.g., chlorine, sodium, or sulfur, which are responsible for certain degradation mechanisms, and also light elements such as lithium or hydrogen. Practical examples are given and a mobile system for on-site measurements is presented.
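
    To make the color-coded images described in the two records above concrete, the sketch below renders a grid of fabricated chlorine emission-line intensities across a split core surface as an image; it makes no claim about the authors' actual acquisition or processing chain.

        # Schematic LIBS element map: shot-by-shot emission intensities shown as an image.
        # Intensities are fabricated; real values come from the spectrometer at each shot.
        import numpy as np
        import matplotlib.pyplot as plt

        ny, nx = 40, 80                                   # measurement grid on the core surface
        depth = np.tile(np.linspace(0.0, 1.0, nx), (ny, 1))
        chlorine = np.exp(-3.0 * depth) + 0.05 * np.random.default_rng(1).random((ny, nx))

        plt.imshow(chlorine, cmap="viridis", aspect="auto")
        plt.colorbar(label="Cl emission-line intensity (a.u.)")
        plt.xlabel("Position along core axis (ingress depth)")
        plt.ylabel("Position across core")
        plt.title("Schematic chlorine distribution map")
        plt.show()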

  1. Documenting Elementary Teachers' Sustainability of Instructional Practices: A Mixed Method Case Study

    NASA Astrophysics Data System (ADS)

    Cotner, Bridget A.

    School reform programs focus on making educational changes; however, research on interventions past the funded implementation phase to determine what was sustained is rarely done (Beery, Senter, Cheadle, Greenwald, Pearson, et al., 2005). This study adds to the research on sustainability by determining what instructional practices, if any, of the Teaching SMART™ professional development program that was implemented from 2005--2008 in elementary schools with teachers in grades three through eight were continued, discontinued, or adapted five years post-implementation (in 2013). Specifically, this study sought to answer the following questions: What do teachers who participated in Teaching SMART™ and district administrators share about the sustainability of Teaching SMART™ practices in 2013? What teaching strategies do teachers who participated in the program (2005--2008) use in their science classrooms five years post-implementation (2013)? What perceptions about the roles of females in science, technology, engineering, and mathematics (STEM) do teachers who participated in the program (2005--2008) have five years later (2013)? And, what classroom management techniques do the teachers who participated in the program (2005--2008) use five years post-implementation (2013)? A mixed method approach was used to answer these questions. Quantitative teacher survey data from 23 teachers who participated in 2008 and 2013 were analyzed in SAS v. 9.3. Descriptive statistics were reported and paired t-tests were conducted to determine mean differences by survey factors identified from an exploratory factor analysis, principal axis factoring, and parallel analysis conducted with teacher survey baseline data (2005). Individual teacher change scores (2008 and 2013) for identified factors were computed using the Reliable Change Index statistic. Qualitative data consisted of interviews with two district administrators and three teachers who responded to the survey in both

  2. Practical Methods for Locating Abandoned Wells in Populated Areas

    SciTech Connect

    Veloski, G.A.; Hammack, R.W.; Lynn, R.J.

    2007-09-01

    An estimated 12 million wells have been drilled during the 150 years of oil and gas production in the United States. Many old oil and gas fields are now populated areas where the presence of improperly plugged wells may constitute a hazard to residents. Natural gas emissions from wells have forced people from their houses and businesses and have caused explosions that injured or killed people and destroyed property. To mitigate this hazard, wells must be located and properly plugged, a task made more difficult by the presence of houses, businesses, and associated utilities. This paper describes well finding methods conducted by the National Energy Technology Laboratory (NETL) that were effective at two small towns in Wyoming and in a suburb of Pittsburgh, Pennsylvania.

  3. Practical optimization of Steiner trees via the cavity method

    NASA Astrophysics Data System (ADS)

    Braunstein, Alfredo; Muntoni, Anna

    2016-07-01

    The optimization version of the cavity method for single instances, called Max-Sum, has been applied in the past to the minimum Steiner tree problem on graphs and variants. Max-Sum has been shown experimentally to give asymptotically optimal results on certain types of weighted random graphs, and to give good solutions in short computation times for some types of real networks. However, the hypotheses behind the formulation and the cavity method itself limit substantially the class of instances on which the approach gives good results (or even converges). Moreover, in the standard model formulation, the diameter of the tree solution is limited by a predefined bound, that affects both computation time and convergence properties. In this work we describe two main enhancements to the Max-Sum equations to be able to cope with optimization of real-world instances. First, we develop an alternative ‘flat’ model formulation that allows the relevant configuration space to be reduced substantially, making the approach feasible on instances with large solution diameter, in particular when the number of terminal nodes is small. Second, we propose an integration between Max-Sum and three greedy heuristics. This integration allows Max-Sum to be transformed into a highly competitive self-contained algorithm, in which a feasible solution is given at each step of the iterative procedure. Part of this development participated in the 2014 DIMACS Challenge on Steiner problems, and we report the results here. The performance on the challenge of the proposed approach was highly satisfactory: it maintained a small gap to the best bound in most cases, and obtained the best results on several instances in two different categories. We also present several improvements with respect to the version of the algorithm that participated in the competition, including new best solutions for some of the instances of the challenge.
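
    For readers unfamiliar with the problem, a generic greedy baseline of the kind Max-Sum is compared with and combined with can be obtained off the shelf; the sketch below uses networkx's metric-closure 2-approximation on a made-up weighted graph and is not the authors' algorithm.

        # Approximate Steiner tree via networkx's metric-closure heuristic (toy graph).
        import networkx as nx
        from networkx.algorithms.approximation import steiner_tree

        G = nx.Graph()
        G.add_weighted_edges_from([
            ("a", "b", 1.0), ("b", "c", 1.0), ("a", "c", 2.5),
            ("c", "d", 1.0), ("b", "e", 2.0), ("d", "e", 1.0),
        ])
        terminals = ["a", "d", "e"]                       # nodes that must be connected

        T = steiner_tree(G, terminals, weight="weight")
        cost = sum(d["weight"] for _, _, d in T.edges(data=True))
        print(sorted(T.edges()), "total weight:", cost)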

  4. Method of analysis and quality-assurance practices by the U.S. Geological Survey Organic Geochemistry Research Group; determination of geosmin and methylisoborneol in water using solid-phase microextraction and gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Zimmerman, L.R.; Ziegler, A.C.; Thurman, E.M.

    2002-01-01

    A method for the determination of two common odor-causing compounds in water, geosmin and 2-methylisoborneol, was modified and verified by the U.S. Geological Survey's Organic Geochemistry Research Group in Lawrence, Kansas. The optimized method involves the extraction of odor-causing compounds from filtered water samples using a divinylbenzene-carboxen-polydimethylsiloxane cross-link coated solid-phase microextraction (SPME) fiber. Detection of the compounds is accomplished using capillary-column gas chromatography/mass spectrometry (GC/MS). Precision and accuracy were demonstrated using reagent-water, surface-water, and ground-water samples. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 35 nanograms per liter ranged from 60 to 123 percent for geosmin and from 90 to 96 percent for 2-methylisoborneol. Method detection limits were 1.9 nanograms per liter for geosmin and 2.0 nanograms per liter for 2-methylisoborneol in 45-milliliter samples. Typically, concentrations of 30 and 10 nanograms per liter of geosmin and 2-methylisoborneol, respectively, can be detected by the general public. The calibration range for the method is equivalent to concentrations from 5 to 100 nanograms per liter without dilution. The method is valuable for acquiring information about the production and fate of these odor-causing compounds in water.
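
    The accuracy figures quoted in this record are spike recoveries, i.e., the measured concentration expressed as a percentage of the known spiked concentration; the numbers below are fabricated and serve only to show the arithmetic.

        def spike_recovery_percent(measured_ng_per_L, spiked_ng_per_L):
            """Accuracy as a percentage of the true (spiked) concentration."""
            return 100.0 * measured_ng_per_L / spiked_ng_per_L

        # Example: a 10 ng/L geosmin spike measured at 8.7 ng/L
        print(round(spike_recovery_percent(8.7, 10.0), 1), "%")   # 87.0 %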

  5. Professional Suitability for Social Work Practice: A Factor Analysis

    ERIC Educational Resources Information Center

    Tam, Dora M. Y.; Coleman, Heather; Boey, Kam-Wing

    2012-01-01

    Objective: The purpose of this study was to identify the underlying dimensions of professional suitability. Method: Data were collected from a province-wide mail-out questionnaire surveying 341 participants from a random sample of registered social workers. Results: The use of an exploratory factor analysis identified a 5-factor solution on…

  6. Dynamic mechanical analysis: A practical introduction to techniques and applications

    SciTech Connect

    Menard, K.

    1999-01-01

    This book introduces DMA, its history, and its current position as part of thermal analysis on polymers. It discusses major types of instrumentation, including oscillatory rotational, oscillatory axial, and torsional pendulum. It also describes analytical techniques in terms of utility, quality of data, methods of calibration, and suitability for different types of materials and assesses applications for thermoplastics, thermosetting systems, and thermosets.

  7. Perceived Barriers and Facilitators to School Social Work Practice: A Mixed-Methods Study

    ERIC Educational Resources Information Center

    Teasley, Martell; Canifield, James P.; Archuleta, Adrian J.; Crutchfield, Jandel; Chavis, Annie McCullough

    2012-01-01

    Understanding barriers to practice is a growing area within school social work research. Using a convenience sample of 284 school social workers, this study replicates the efforts of a mixed-method investigation designed to identify barriers and facilitators to school social work practice within different geographic locations. Time constraints and…

  8. Autoethnography as a Method for Reflexive Research and Practice in Vocational Psychology

    ERIC Educational Resources Information Center

    McIlveen, Peter

    2008-01-01

    This paper overviews the qualitative research method of autoethnography and its relevance to research in vocational psychology and practice in career development. Autoethnography is a reflexive means by which the researcher-practitioner consciously embeds himself or herself in theory and practice, and by way of intimate autobiographic account,…

  9. Cross-Continental Reflections on Evaluation Practice: Methods, Use, and Valuing

    ERIC Educational Resources Information Center

    Kallemeyn, Leanne M.; Hall, Jori; Friche, Nanna; McReynolds, Clifton

    2015-01-01

    The evaluation theory tree typology reflects the following three components of evaluation practice: (a) methods, (b) use, and (c) valuing. The purpose of this study was to explore how evaluation practice is conceived as reflected in articles published in the "American Journal of Evaluation" ("AJE") and "Evaluation," a…

  10. Methods for Analysis of Outdoor Performance Data (Presentation)

    SciTech Connect

    Jordan, D.

    2011-02-01

    The ability to accurately predict power delivery over the course of time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted to power and secondly how this relationship develops over time. The accurate knowledge of power decline over time, also known as degradation rates, is essential and important to all stakeholders--utility companies, integrators, investors, and scientists alike. Different methods to determine degradation rates and discrete versus continuous data are presented, and some general best practice methods are outlined. In addition, historical degradation rates and some preliminary analysis with respect to climate are given.
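
    One simple continuous-data approach of the kind alluded to in this record is a linear fit of a normalized performance metric against time, with the slope read as a degradation rate in percent per year; the sketch below uses synthetic data and is not NREL's actual methodology.

        # Degradation-rate estimate from a linear fit of normalized performance vs. time.
        # Synthetic monthly data; field data must first be filtered and normalized.
        import numpy as np

        rng = np.random.default_rng(42)
        years = np.arange(0, 10, 1 / 12)                          # 10 years of monthly points
        performance = 1.0 - 0.008 * years + rng.normal(0, 0.01, years.size)

        slope, intercept = np.polyfit(years, performance, 1)
        rate_pct_per_year = 100 * slope / intercept               # % of initial performance per year
        print(f"Estimated degradation rate: {rate_pct_per_year:.2f} %/yr")   # about -0.8 %/yr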

  11. Low hardness organisms: Culture methods, sensitivities, and practical applications

    SciTech Connect

    DaCruz, A.; DaCruz, N.; Bird, M.

    1995-12-31

    EPA regulations require biomonitoring of permitted effluent and stormwater runoff. Several permit locations were studied in Virginia that have supply water and/or stormwater runoff ranging in hardness from 5--30 mg/L. Ceriodaphnia dubia (dubia) and Pimephales promelas (fathead minnow) were tested in reconstituted water with hardnesses from 5--30 mg/L. Results indicated osmotic stresses present in the acute tests with the fathead minnow as well as chronic tests for the dubia and the fathead minnow. Culture methods were developed for both organism types in soft (30 mg/L) reconstituted freshwater. Reproduction and development for each organism type meets or exceeds EPA testing requirements for moderately hard organisms. Sensitivities were measured over an 18 month interval using cadmium chloride as a reference toxicant. Additionally, sensitivities were charted in contrast with those of organisms cultured in moderately hard water. The comparison proved that the sensitivities of both the dubia and the fathead minnow cultured in 30 mg/L water increased, but were within two standard deviations of the sensitivities of organisms cultured in moderately hard water. Latitude for use of organisms cultured in 30 mg/L water was documented for waters ranging in hardness from 10--100 mg/L with no acclimation period required. The stability of the organism sensitivity was also validated. The application was most helpful in stormwater runoff and in effluents where the hardness was 30 mg/L or less.

  12. A mixed methods exploration of the team and organizational factors that may predict new graduate nurse engagement in collaborative practice.

    PubMed

    Pfaff, Kathryn A; Baxter, Pamela E; Ploeg, Jenny; Jack, Susan M

    2014-03-01

    Although engagement in collaborative practice is reported to support the role transition and retention of new graduate (NG) nurses, it is not known how to promote collaborative practice among these nurses. This mixed methods study explored the team and organizational factors that may predict NG nurse engagement in collaborative practice. A total of 514 NG nurses from Ontario, Canada completed the Collaborative Practice Assessment Tool. Sixteen NG nurses participated in follow-up interviews. The team and organizational predictors of NG engagement in collaborative practice were as follows: satisfaction with the team (β = 0.278; p = 0.000), number of team strategies (β = 0.338; p = 0.000), participation in a mentorship or preceptorship experience (β = 0.137; p = 0.000), accessibility of manager (β = 0.123; p = 0.001), and accessibility and proximity of educator or professional practice leader (β = 0.126; p = 0.001 and β = 0.121; p = 0.002, respectively). Qualitative analysis revealed the team facilitators to be respect, team support and face-to-face interprofessional interactions. Organizational facilitators included supportive leadership, participation in a preceptorship or mentorship experience and time. Interventions designed to facilitate NG engagement in collaborative practice should consider these factors. PMID:24195680

  13. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2002-01-01

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
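
    The core idea in this record, adding a spectral shape that was absent from the original calibration so it no longer biases the analyte estimates, can be illustrated with a purely synthetic classical least squares example; the full hybrid procedure also couples this with an inverse multivariate (e.g., PLS) step that is not reproduced below.

        # Sketch of the augmentation idea: a classical least squares (CLS) spectral model
        # gains an extra column for a spectral shape (e.g., drift or an uncalibrated
        # component) so its contribution no longer biases the analyte estimates.
        # Pure spectra and the mixture are synthetic; the subsequent inverse
        # multivariate step of the hybrid method is omitted.
        import numpy as np

        wavelengths = np.linspace(0, 1, 200)
        gauss = lambda c, w: np.exp(-0.5 * ((wavelengths - c) / w) ** 2)

        S = np.column_stack([gauss(0.3, 0.05), gauss(0.6, 0.05)])   # calibrated pure-component spectra
        drift = gauss(0.7, 0.10)                                    # shape absent from calibration

        true_conc = np.array([0.7, 0.4])
        mixture = S @ true_conc + 0.3 * drift                       # measured spectrum

        cls_only, *_ = np.linalg.lstsq(S, mixture, rcond=None)
        augmented, *_ = np.linalg.lstsq(np.column_stack([S, drift]), mixture, rcond=None)

        print("CLS only:  ", np.round(cls_only, 3))        # biased by the unmodeled shape
        print("Augmented: ", np.round(augmented[:2], 3))   # close to the true 0.7, 0.4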

  14. Measuring solar reflectance - Part II: Review of practical methods

    SciTech Connect

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul

    2010-09-15

    A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R_g,0 can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R_g,0 to within 0.01 for surface slopes up to 5:12 [23°], and to within 0.02 for surface slopes up to 12:12 [45°]. An alternative pyranometer method minimizes shadow errors and can be used to measure R_g,0 of a surface as small as 1 m in diameter. The accuracy with which it can measure R_g,0 is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R*_g,0, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R*_g,0 matches R_g,0 to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R*_g,0 by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R*_g,0 to within about 0.01.

  15. Measuring solar reflectance Part II: Review of practical methods

    SciTech Connect

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul

    2010-05-14

    A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R_g,0 can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R_g,0 to within 0.01 for surface slopes up to 5:12 [23°], and to within 0.02 for surface slopes up to 12:12 [45°]. An alternative pyranometer method minimizes shadow errors and can be used to measure R_g,0 of a surface as small as 1 m in diameter. The accuracy with which it can measure R_g,0 is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R*_g,0, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R*_g,0 matches R_g,0 to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R*_g,0 by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R*_g,0 to within about 0.01.
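
    The spectrophotometer route described in the two records above reduces to an irradiance-weighted average of spectral reflectance, R*_g,0 = ∫R(λ)E(λ)dλ / ∫E(λ)dλ; the sketch below uses a synthetic reflectance curve and a placeholder array standing in for the AM1GH spectral irradiance.

        # Irradiance-weighted solar reflectance on a uniform wavelength grid.
        # Both the reflectance curve and the stand-in AM1GH irradiance are synthetic.
        import numpy as np

        wavelength_nm = np.linspace(300, 2500, 1101)
        reflectance = 0.2 + 0.5 / (1 + np.exp(-(wavelength_nm - 700) / 50))   # synthetic surface
        irradiance = np.exp(-0.5 * ((wavelength_nm - 800) / 500) ** 2)        # placeholder spectrum

        # On a uniform grid the dλ factors cancel, so weighted sums suffice.
        R_star = np.sum(reflectance * irradiance) / np.sum(irradiance)
        print(f"Irradiance-weighted solar reflectance: {R_star:.3f}")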

  16. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: (1) allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; (2) eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; (3) lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and (4) providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; and ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  17. Skill analysis part 3: improving a practice skill.

    PubMed

    Price, Bob

    In this, the third and final article in a series on practice skill analysis, attention is given to imaginative ways of improving a practice skill. Having analysed and evaluated a chosen skill in the previous two articles, it is time to look at new ways to proceed. Creative people are able to be analytical and imaginative. The process of careful reasoning involved in analysing and evaluating a skill will not necessarily be used to improve it. To advance a skill, there is a need to engage in more imaginative, free-thinking processes that allow the nurse to think afresh about his or her chosen skill. Suggestions shared in this article are not exhaustive, but the material presented does illustrate measures that in the author's experience seem to have potential. Consideration is given to how the improved skill might be envisaged (an ideal skill in use). The article is illustrated using the case study of empathetic listening, which has been used throughout this series. PMID:22356066

  18. A practical method of estimating standard error of age in the fission track dating method

    USGS Publications Warehouse

    Johnson, N.M.; McGee, V.E.; Naeser, C.W.

    1979-01-01

    A first-order approximation formula for the propagation of error in the fission-track age equation is given by P_A = C[P_s^2 + P_i^2 + P_Φ^2 - 2rP_sP_i]^(1/2), where P_A, P_s, P_i, and P_Φ are the percentage errors of age, of spontaneous track density, of induced track density, and of neutron dose, respectively, and C is a constant. The correlation, r, between spontaneous and induced track densities is a crucial element in the error analysis, acting generally to improve the standard error of age. In addition, the correlation parameter r is instrumental in specifying the level of neutron dose, a controlled variable, which will minimize the standard error of age. The results from the approximation equation agree closely with the results from an independent statistical model for the propagation of errors in the fission-track dating method. © 1979.
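
    Written out in code, the approximation above is straightforward to evaluate; the numbers below are illustrative only.

        import math

        def fission_track_age_error(P_s, P_i, P_phi, r, C=1.0):
            """First-order percentage error of age:
            P_A = C * sqrt(P_s**2 + P_i**2 + P_phi**2 - 2*r*P_s*P_i).

            P_s, P_i, P_phi are the percentage errors of spontaneous track density,
            induced track density, and neutron dose; r is the correlation between
            spontaneous and induced densities; C is the constant from the age equation.
            """
            return C * math.sqrt(P_s**2 + P_i**2 + P_phi**2 - 2 * r * P_s * P_i)

        # Illustrative values: a positive correlation reduces the combined error.
        print(round(fission_track_age_error(8.0, 6.0, 3.0, r=0.0), 1))   # 10.4
        print(round(fission_track_age_error(8.0, 6.0, 3.0, r=0.6), 1))   # 7.2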

  19. Practice.

    PubMed

    Chambers, David W

    2008-01-01

    Practice refers to a characteristic way professionals use common standards to customize solutions to a range of problems. Practice includes (a) standards for outcomes and processes that are shared with one's colleagues, (b) a rich repertoire of skills grounded in diagnostic acumen, (c) an ability to see the actual and the ideal and work back and forth between them, (d) functional artistry, and (e) learning by doing that transcends scientific rationality. Communities of practice, such as dental offices, are small groups that work together in interlocking roles to achieve these ends. PMID:19413050

  20. Recruitment ad analysis offers new opportunities to attract GPs to short-staffed practices.

    PubMed

    Hemphill, Elizabeth; Kulik, Carol T

    2013-01-01

    As baby-boomer practitioners exit the workforce, physician shortages present new recruitment challenges for practices seeking GPs. This article reports findings from two studies examining GP recruitment practice. GP recruitment ad content analysis (Study 1) demonstrated that both Internet and print ads emphasize job attributes but rarely present family or practice attributes. Contacts at these medical practices reported that their practices offer distinctive family and practice attributes that could be exploited in recruitment advertising (Study 2). Understaffed medical practices seeking to attract GPs may differentiate their job offerings in a crowded market by incorporating family and/or practice attributes into their ads. PMID:23697854

  1. Method and apparatus for ceramic analysis

    DOEpatents

    Jankowiak, Ryszard J.; Schilling, Chris; Small, Gerald J.; Tomasik, Piotr

    2003-04-01

    The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method provides analyzing density of a ceramic comprising exciting a component on a surface/subsurface of the ceramic by exposing the material to excitation energy. The method may further include the step of obtaining a measurement of an emitted energy from the component. The method may additionally include comparing the measurement of the emitted energy from the component with a predetermined reference measurement so as to obtain a density for said ceramic.

  2. Transonic wing analysis using advanced computational methods

    NASA Technical Reports Server (NTRS)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    This paper discusses the application of three-dimensional computational transonic flow methods to several different types of transport wing designs. The purpose of these applications is to evaluate the basic accuracy and limitations associated with such numerical methods. The use of such computational methods for practical engineering problems can only be justified after favorable evaluations are completed. The paper summarizes a study of both the small-disturbance and the full potential technique for computing three-dimensional transonic flows. Computed three-dimensional results are compared to both experimental measurements and theoretical results. Comparisons are made not only of pressure distributions but also of lift and drag forces. Transonic drag rise characteristics are compared. Three-dimensional pressure distributions and aerodynamic forces, computed from the full potential solution, compare reasonably well with experimental results for a wide range of configurations and flow conditions.

  3. [Analysis of an intercultural clinical practice in a judicial setting].

    PubMed

    Govindama, Yolande

    2007-01-01

    This article analyses an intercultural clinical practice in a legal setting from an anthropological and psychoanalytical perspective, demonstrating the reorganizations that the framework necessarily entails. Because the culture of the host country and its founding myth are implicit in the judicial framework, the practitioner introduces psychoanalytical references, particularly totemic principles and the symbolic father, by treating genealogy, a universal object of transmission, as the guarantee of humanity's fundamental taboos. The metacultural perspective of this approach integrates the ethnopsychoanalytical principles put forward by Devereux, as well as his method, although the latter has been adapted to the setting. This approach makes it possible to re-examine Devereux's ethnopsychoanalytical principles by opening the debate from a psychoanalytical as well as a psychiatric perspective. PMID:18253668

  4. Short communication: Practical issues in implementing volatile metabolite analysis for identifying mastitis pathogens.

    PubMed

    Hettinga, Kasper A; de Bok, Frank A M; Lam, Theo J G M

    2015-11-01

    Several parameters for improving the analysis of volatile metabolites by headspace gas chromatography-mass spectrometry (GC-MS) were evaluated in the framework of identifying mastitis-causing pathogens. Previous research showed that the results of such volatile metabolite analysis were comparable with those based on bacteriological culturing. The aim of this study was to evaluate the effect of several method changes on the applicability and potential implementation of this method in practice. The use of a relatively polar column is advantageous, resulting in a faster and less complex chromatographic setup with a higher resolving power yielding higher-quality data. Before volatile metabolite analysis is applied, a minimum incubation of 8 h is advised, as reducing incubation time leads to less reliable pathogen identification. Application of GC-MS remained favorable compared with regular gas chromatography. The complexity and cost of a GC-MS system, however, limit the application of the method in practice for identification of mastitis-causing pathogens. PMID:26342985

  5. Comparison of three evidence-based practice learning assessment methods in dental curricula.

    PubMed

    Al-Ansari, Asim A; El Tantawi, Maha M A

    2015-02-01

    Incorporating evidence-based practice (EBP) training in dental curricula is now an accreditation requirement for dental schools, but questions remain about the most effective ways to assess learning outcomes. The purpose of this study was to evaluate and compare three assessment methods for EBP training and to assess their relation to students' overall course grades. Participants in the study were dental students from two classes who received training in appraising randomized controlled trials (RCTs) and systematic reviews in 2013 at the University of Dammam, Saudi Arabia. Repeated measures analysis of variance was used to compare students' scores on appraisal assignments, scores on multiple-choice question (MCQ) exams in which EBP concepts were applied to clinical scenarios, and scores for self-reported efficacy in appraisal. Regression analysis was used to assess the relationship among the three assessment methods, gender, program level, and overall grade. The instructors had acceptable reliability in scoring the assignments (overall intraclass correlation coefficient=0.60). The MCQ exams had acceptable discrimination indices although their reliability was less satisfactory (Cronbach's alpha=0.46). Statistically significant differences were observed among the three methods, with MCQ exams having the lowest overall scores. Variation in the overall course grades was explained by scores on the appraisal assignment and MCQ exams (partial eta-squared=0.52 and 0.24, respectively), whereas score on the self-efficacy questionnaire was not significantly associated with overall grade. The results suggest that self-reported efficacy is not a valid method to assess dental students' RCT appraisal skills; instructor-graded appraisal assignments explained a greater portion of variation in grade and had inherent validity and acceptable consistency, whereas MCQ exams had good construct validity but low internal consistency. PMID:25640619

  6. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2004-03-23

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.

  7. Advanced reliability method for fatigue analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Wirsching, P. H.

    1984-01-01

    When design factors are considered as random variables and the failure condition cannot be expressed by a closed form algebraic inequality, computations of risk (or probability of failure) may become extremely difficult or very inefficient. This study suggests using a simple and easily constructed second degree polynomial to approximate the complicated limit state in the neighborhood of the design point; a computer analysis relates the design variables at selected points. Then a fast probability integration technique (i.e., the Rackwitz-Fiessler algorithm) can be used to estimate risk. The capability of the proposed method is demonstrated in an example of a low cycle fatigue problem for which a computer analysis is required to perform local strain analysis to relate the design variables. A comparison of the performance of this method is made with a far more costly Monte Carlo solution. Agreement of the proposed method with Monte Carlo is considered to be good.
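
    The fast probability integration step mentioned in this record (the Rackwitz-Fiessler algorithm) can be sketched for a toy limit state in independent standard-normal space; the limit state below is an arbitrary example, and the quadratic response-surface fit and the local strain analysis that the paper couples it to are omitted.

        # Minimal Rackwitz-Fiessler (HL-RF) iteration for a toy limit state g(u) < 0
        # in independent standard-normal space. Real applications first map the design
        # variables to u-space and replace g with a fitted response surface.
        import math
        import numpy as np

        def g(u):                                   # arbitrary example limit state
            return 2.0 - u[0] - u[1] + 0.1 * u[0] * u[1]

        def grad(fun, u, h=1e-6):                   # central-difference gradient
            return np.array([(fun(u + h * e) - fun(u - h * e)) / (2 * h)
                             for e in np.eye(len(u))])

        u = np.zeros(2)
        for _ in range(100):                        # HL-RF fixed-point iteration
            gu, dg = g(u), grad(g, u)
            u_new = (dg @ u - gu) / (dg @ dg) * dg
            if np.linalg.norm(u_new - u) < 1e-8:
                u = u_new
                break
            u = u_new

        beta = np.linalg.norm(u)                    # distance to the design point
        pf = 0.5 * math.erfc(beta / math.sqrt(2.0)) # Pf ~ Phi(-beta)
        print(f"Reliability index beta = {beta:.3f}, Pf ~ {pf:.4f}")   # beta ~ 1.49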

  8. Advanced analysis methods in particle physics

    SciTech Connect

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider, have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  9. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2009-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  10. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2010-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.