Science.gov

Sample records for practical analysis method

  1. A Practical Guide to Modern Methods of Meta-Analysis.

    ERIC Educational Resources Information Center

    Hedges, Larry V.; And Others

    Methods for meta-analysis have evolved dramatically since Gene Glass first proposed the term in 1976. Since that time statistical and nonstatistical aspects of methodology for meta-analysis have been developing at a steady pace. This guide is an attempt to provide a practical introduction to rigorous procedures in the meta-analysis of social…
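The core computation behind the rigorous procedures Hedges describes, combining per-study effect sizes into a pooled estimate, can be sketched with an inverse-variance weighted (fixed-effect) mean. The effect sizes and variances below are invented for illustration:

```python
import numpy as np

# Hypothetical per-study effect sizes (standardized mean differences)
# and their sampling variances -- invented for illustration.
d = np.array([0.30, 0.45, 0.10, 0.60])
v = np.array([0.04, 0.02, 0.05, 0.03])

# Fixed-effect model: weight each study by the inverse of its variance.
w = 1.0 / v
d_bar = np.sum(w * d) / np.sum(w)            # pooled effect size
se = np.sqrt(1.0 / np.sum(w))                # standard error of the pooled effect
ci = (d_bar - 1.96 * se, d_bar + 1.96 * se)  # approximate 95% CI
```

Under a random-effects model the weights would also include a between-study variance component, but the weighting logic is the same.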

  2. A practical method for the analysis of meteor spectra

    NASA Astrophysics Data System (ADS)

    Dubs, Martin; Schlatter, Peter

    2015-08-01

    The analysis of meteor spectra (photographic, CCD or video recording) is complicated by the fact that spectra obtained with objective gratings are curved and have a nonlinear dispersion. In this paper it is shown that with a simple image transformation the spectra can be linearized in such a way that individual spectra over the whole image plane are parallel and have a constant, linear dispersion. This simplifies the identification and measurement of meteor spectral lines. A practical method is given to determine the required image transformation.
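The resampling step that such a transformation enables can be sketched as follows: once the wavelength of each pixel is known, the spectrum is interpolated onto a grid of constant dispersion. The dispersion polynomial and line profile below are invented, not the paper's calibration:

```python
import numpy as np

# One extracted spectrum: pixel positions and a hypothetical nonlinear
# dispersion relation (wavelength as a quadratic in pixel coordinate).
x = np.arange(512)
wavelength = 380.0 + 0.95 * x + 2.0e-4 * x**2            # nm, invented coefficients
intensity = np.exp(-((wavelength - 589.0) / 2.0) ** 2)   # fake Na-line profile

# Resample onto a grid with constant, linear dispersion so that all
# spectra share the same wavelength step.
lin_grid = np.linspace(wavelength[0], wavelength[-1], x.size)
lin_intensity = np.interp(lin_grid, wavelength, intensity)
```

After this step, line identification reduces to reading positions off a single linear wavelength scale.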

  3. A Practical Method of Policy Analysis by Simulating Policy Options

    ERIC Educational Resources Information Center

    Phelps, James L.

    2011-01-01

    This article focuses on a method of policy analysis that has evolved from the previous articles in this issue. The first section, "Toward a Theory of Educational Production," identifies concepts from science and achievement production to be incorporated into this policy analysis method. Building on Kuhn's (1970) discussion regarding paradigms, the…

  4. Practical Use of Computationally Frugal Model Analysis Methods.

    PubMed

    Hill, Mary C; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2016-03-01

    Three challenges compromise the utility of mathematical models of groundwater and other environmental systems: (1) a dizzying array of model analysis methods and metrics makes it difficult to compare evaluations of model adequacy, sensitivity, and uncertainty; (2) the high computational demands of many popular model analysis methods (requiring 1000s, 10,000s, or more model runs) make them difficult to apply to complex models; and (3) many models are plagued by unrealistic nonlinearities arising from the numerical model formulation and implementation. This study proposes a strategy to address these challenges through a careful combination of model analysis and implementation methods. In this strategy, computationally frugal model analysis methods (often requiring a few dozen parallelizable model runs) play a major role, and computationally demanding methods are used for problems where (relatively) inexpensive diagnostics suggest the frugal methods are unreliable. We also argue in favor of detecting and, where possible, eliminating unrealistic model nonlinearities; this increases the realism of the model itself and facilitates the application of frugal methods. Literature examples are used to demonstrate the use of frugal methods and associated diagnostics. We suggest that the strategy proposed in this paper would allow the environmental sciences community to achieve greater transparency and falsifiability of environmental models, and obtain greater scientific insight from ongoing and future modeling efforts. PMID:25810333
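As an illustration of how frugal a sensitivity analysis can be, a one-at-a-time scaled sensitivity needs only n + 1 runs for n parameters. The toy model below stands in for an expensive simulator and is not from the paper:

```python
import numpy as np

def model(p):
    """Toy 'environmental model': a cheap stand-in for an expensive simulator."""
    k, s, r = p
    return k**2 + 0.5 * s + 0.01 * r

def oat_sensitivity(model, p0, rel_step=0.01):
    """One-at-a-time scaled sensitivities: n + 1 model runs for n parameters.
    Assumes all parameters are nonzero (relative perturbation)."""
    p0 = np.asarray(p0, dtype=float)
    base = model(p0)
    sens = []
    for i in range(p0.size):
        p = p0.copy()
        dp = rel_step * p[i]
        p[i] += dp
        # dimensionless (scaled) sensitivity: (dy/dp) * (p / y)
        sens.append((model(p) - base) / dp * p0[i] / base)
    return np.array(sens)

s = oat_sensitivity(model, [2.0, 1.0, 1.0])
```

The scaled values make parameters with different units directly comparable, which is one of the diagnostics frugal methods rely on.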

  5. A Practical Method of Policy Analysis by Estimating Effect Size

    ERIC Educational Resources Information Center

    Phelps, James L.

    2011-01-01

    The previous articles on class size and other productivity research paint a complex and confusing picture of the relationship between policy variables and student achievement. Missing is a conceptual scheme capable of combining the seemingly unrelated research and dissimilar estimates of effect size into a unified structure for policy analysis and…

  7. A Topography Analysis Incorporated Optimization Method for the Selection and Placement of Best Management Practices

    PubMed Central

    Shen, Zhenyao; Chen, Lei; Xu, Liang

    2013-01-01

    Best Management Practices (BMPs) are one of the most effective methods to control nonpoint source (NPS) pollution at a watershed scale. In this paper, the use of a topography analysis incorporated optimization method (TAIOM) was proposed, which integrates topography analysis with cost-effective optimization. The surface status, slope and the type of land use were evaluated as inputs for the optimization engine. A genetic algorithm program was coded to obtain the final optimization. The TAIOM was validated in conjunction with the Soil and Water Assessment Tool (SWAT) in the Yulin watershed in Southwestern China. The results showed that the TAIOM was more cost-effective than traditional optimization methods. The distribution of selected BMPs throughout landscapes comprising relatively flat plains and gentle slopes suggests the need for a more operationally effective scheme, such as the TAIOM, to determine the practicability of BMPs before widespread adoption. The TAIOM developed in this study can easily be extended to other watersheds to help decision makers control NPS pollution. PMID:23349917
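The genetic-algorithm step can be sketched in miniature: encode each candidate BMP as a gene, penalize plans that miss the pollutant-reduction target, and evolve toward the cheapest feasible plan. All costs, reductions, and GA settings below are invented; the real TAIOM also folds topography inputs into the optimization:

```python
import random

random.seed(1)

# Hypothetical BMP candidates, one per sub-catchment: (cost, pollutant
# reduction). All numbers are invented for illustration.
bmps = [(10.0, 4.0), (25.0, 12.0), (15.0, 9.0), (30.0, 10.0), (5.0, 2.0)]
TARGET = 20.0  # required total pollutant reduction

def fitness(genome):
    """Cost to minimise, with a heavy penalty if the reduction target is missed."""
    cost = sum(c for g, (c, _) in zip(genome, bmps) if g)
    reduction = sum(r for g, (_, r) in zip(genome, bmps) if g)
    return cost + 1000.0 * max(0.0, TARGET - reduction)

def evolve(pop_size=30, generations=60, p_mut=0.1):
    """Minimal GA: elitist selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in bmps] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]          # keep the cheaper half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(bmps))  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
```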

  8. Practical method for radioactivity distribution analysis in small-animal PET cancer studies.

    PubMed

    Slavine, Nikolai V; Antich, Peter P

    2008-12-01

    We present a practical method for radioactivity distribution analysis in small-animal tumors and organs using positron emission tomography imaging with a calibrated source of known activity and size in the field of view. We reconstruct the imaged mouse together with a source under the same conditions, using an iterative method, maximum likelihood expectation-maximization with system modeling, capable of delivering high-resolution images. Corrections for the ratios of geometrical efficiencies, radioisotope decay in time and photon attenuation are included in the algorithm. We demonstrate reconstruction results for the amount of radioactivity within the scanned mouse in a sample study of osteolytic and osteoblastic bone metastasis from prostate cancer xenografts. Data acquisition was performed on the small-animal PET system, which was tested with different radioactive sources, phantoms and animals to achieve high sensitivity and spatial resolution. Our method uses high-resolution images to determine the volume of an organ or tumor and the amount of its radioactivity, offering the possibility of saving time and effort and reducing the need to sacrifice animals. This method has utility for prognosis and quantitative analysis in small-animal cancer studies, and will enhance the assessment of tumor growth characteristics, the identification of metastases, and potentially the determination of the effectiveness of cancer treatment. The technique could also be useful for organ radioactivity dosimetry studies. PMID:18667322

  9. Methods and practices used in incident analysis in the Finnish nuclear power industry.

    PubMed

    Suksi, Seija

    2004-07-26

    According to the Finnish Nuclear Energy Act, it is the licensee's responsibility to ensure the safe use of nuclear energy. The Radiation and Nuclear Safety Authority (STUK) is the regulatory body responsible for state supervision of the safe use of nuclear power in Finland. One essential prerequisite for the safe and reliable operation of nuclear power plants is that lessons are learned from operational experience. It is the utility's prime responsibility to assess operational events and implement appropriate corrective actions. STUK controls licensees' operational experience feedback arrangements and implementation as part of its inspection activities. In addition, the Finnish regulatory body performs its own assessment of operational experience. Review and investigation of operational events is part of the regulatory oversight of operational safety. STUK reviews operational events at essentially three levels. The first step is a general review of all operational event, transient and reactor scram reports, which the licensees submit to STUK for information. The second level involves clarification of events at the site and entry of event-specific data into STUK's event register database. This is done for events that meet the criteria requiring the operator to submit a special report to STUK for approval. The safety significance of operational events is determined using probabilistic safety assessment (PSA) techniques. The risk significance of events and the number of safety-significant events are followed by STUK indicators. The final step in operational event assessment performed by STUK is to assign its own investigation team for events deemed to have special importance, especially when the licensee's organisation has not operated as planned. STUK launches its own detailed investigation about once a year on average.
An analysis and evaluation of the event investigation methods applied at STUK and at the two Finnish nuclear power plant operators, Teollisuuden Voima Oy (TVO) and Fortum Power and Heat Oy (Fortum), was carried out by the Technical Research Centre of Finland (VTT) at the request of STUK at the end of the 1990s. The study aimed to provide a broad overview of, and suggestions for improving, the whole organisational framework supporting event investigation practices at the regulatory body and at the utilities. The main objective of the research was to evaluate the adequacy and reliability of event investigation analysis methods and practices in the Finnish nuclear power industry and, based on the results, to develop them further. The results and suggestions of the research are reviewed in the paper, and the corrective actions implemented in event investigation and operating experience procedures both at STUK and at the utilities are discussed as well. STUK has developed its own procedure for the risk-informed analysis of nuclear power plant events. The PSA-based event analysis method is used to assess the safety significance and importance measures associated with the unavailability of components and systems subject to Technical Specifications. Insights from recently performed PSA-based analyses are also briefly discussed in the paper. PMID:15231350
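The kind of PSA-based importance measure mentioned above can be illustrated with a toy two-train redundant system; the failure probabilities are invented, and a real plant PSA model involves thousands of cut sets:

```python
# Toy PSA importance measures for a two-train redundant safety system
# (failure probabilities invented; a plant PSA model is vastly larger).
p_a = 1e-3                 # train A unavailability
p_b = 1e-3                 # train B unavailability
p_top = p_a * p_b          # top event: both redundant trains fail

# Risk Achievement Worth: factor by which the top-event probability rises
# if train A is assumed unavailable (e.g. taken out for maintenance).
raw_a = (1.0 * p_b) / p_top

# Fussell-Vesely importance: fraction of baseline risk from cut sets
# containing train A (here the single cut set {A, B}).
fv_a = (p_a * p_b) / p_top
```

Measures like these are what allow an event's safety significance to be ranked against Technical Specification limits.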

  10. Comparing Different Methods for Implementing Parallel Analysis: A Practical Index of Accuracy.

    ERIC Educational Resources Information Center

    Cota, Albert A.; And Others

    1993-01-01

    Accuracy was compared for three methods of implementing parallel analysis with mean eigenvalues (regression, interpolation, and computation with three samples of random data). The accuracy of parallel analysis with 95th percentile eigenvalues (through regression and interpolation) was also considered. No evidence of differential accuracy emerged…
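Parallel analysis by direct computation, one of the implementation methods compared, can be sketched as: generate random data of the same dimensions, average the eigenvalues over replications, and retain components whose observed eigenvalues exceed the random means. The dataset below is simulated with one strong factor:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dataset: 300 observations, 6 variables, one strong factor.
n, p = 300, 6
factor = rng.normal(size=(n, 1))
data = factor @ np.ones((1, p)) + rng.normal(size=(n, p))

def eigvals_of_corr(x):
    """Eigenvalues of the correlation matrix, sorted descending."""
    return np.sort(np.linalg.eigvalsh(np.corrcoef(x, rowvar=False)))[::-1]

obs = eigvals_of_corr(data)

# Parallel analysis by direct computation: mean eigenvalues of random
# data with the same dimensions (here 50 replications).
rand = np.array([eigvals_of_corr(rng.normal(size=(n, p))) for _ in range(50)])
mean_rand = rand.mean(axis=0)

# Retain components whose observed eigenvalue exceeds the random mean.
n_retained = int(np.sum(obs > mean_rand))
```

The 95th-percentile variant replaces `rand.mean(axis=0)` with `np.percentile(rand, 95, axis=0)`; the regression and interpolation methods approximate these random-data eigenvalues without generating samples.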

  11. [The method of analysis of distribution of erythrocytes by density: practical guidelines].

    PubMed

    Shukrina, E S; Nesterenko, V M; Tsvetaeva, N V; Nikulina, O F; Ataullakhanov, F I

    2014-07-01

    The article describes the phthalate method for analysing the distribution of erythrocytes by density and demonstrates its capabilities. The density distribution of erythrocytes is obtained by centrifuging blood in micro-hematocrit capillaries in the presence of dimethyl and dibutyl phthalate mixtures of known density. The acquisition of clinically reliable parameters of the density distribution, such as the mean erythrocyte density, the width of the distribution, the light and heavy fractions of erythrocytes, and the maximum of the distribution curve, is described. The causes of deviation of the density distribution from standard values under various pathological conditions are considered, and the erythrocyte dehydration syndrome is described in detail. The method for obtaining the density distribution is simple and accessible. It is demonstrated that analysis of the distribution of erythrocytes by density makes it possible to determine the character of the changes occurring in erythrocytes, and that monitoring its parameters allows evaluation of the dynamics of the pathological process and the effectiveness of therapy. PMID:25346987
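The distribution parameters listed above can be computed from a measured density histogram in a few lines. The densities, fractions, and light/heavy cut-offs below are invented for illustration, not the paper's values:

```python
import numpy as np

# Hypothetical distribution of erythrocytes over density (g/mL):
# fraction of cells found at successive phthalate densities.
density = np.array([1.070, 1.080, 1.090, 1.100, 1.110, 1.120])
fraction = np.array([0.05, 0.15, 0.35, 0.30, 0.10, 0.05])  # sums to 1

# Mean density of erythrocytes.
mean_density = np.sum(density * fraction)

# Width of the distribution (standard deviation as a simple width measure).
width = np.sqrt(np.sum(fraction * (density - mean_density) ** 2))

# Light and heavy fractions: cells below/above fixed cut-off densities
# (cut-offs here are illustrative).
light = fraction[density < 1.085].sum()
heavy = fraction[density > 1.105].sum()

# Maximum of the distribution curve.
mode_density = density[np.argmax(fraction)]
```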

  12. Analysis of the upper massif of the craniofacial with the radial method: practical use

    PubMed Central

    Lepich, Tomasz; Dąbek, Józefa; Stompel, Daniel; Gielecki, Jerzy S.

    2011-01-01

    Introduction: The analysis of the upper massif of the craniofacial (UMC) is widely used in many fields of science. The aim of the study was to create a high-resolution computer system, based on a digital information record and on vector graphics, that could enable dimension measurement and evaluation of craniofacial shape using the radial method. Material and methods: The study was carried out on 184 well-preserved skulls from the early Middle Ages. The examined skulls were fixed in Molisson's craniostat in the author's own modification, oriented in space to the Frankfurt plane, and photographed in frontal norm with a digital camera. The parameters describing the planar and dimensional structure of the UMC and orbits were obtained by computer analysis of the recorded functions depicting the craniofacial structures, using software combining raster and vector graphics. Results: Mean values for both orbits were compared separately for the male and female groups. In female skulls, the comparison of the left and right sides showed no statistically significant differences. In the male group, higher values were observed for the right side; only the circularity index presented higher values for the left side. Conclusions: Computer graphics with software for analysing digital pictures of the UMC and orbits increase the precision of measurements as well as the calculation possibilities. Recognition of the face in post mortem examination is crucial for those working on identification in anthropology and criminology laboratories. PMID:22291834

  13. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory: Determination of Trihalomethane Formation Potential, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Fram, Miranda S.; Bush, Noel

    2004-01-01

    An analytical method for the determination of the trihalomethane formation potential of water samples has been developed. The trihalomethane formation potential is measured by dosing samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine, and then analyzing the resulting trihalomethanes by purge and trap/gas chromatography equipped with an electron capture detector. Detailed explanations of the method and quality-control practices are provided. Method validation experiments showed that the trihalomethane formation potential varies as a function of time between sample collection and analysis, residual-free chlorine concentration, method of sample dilution, and the concentration of bromide in the sample.

  14. ISO 14 001 at the farm level: analysis of five methods for evaluating the environmental impact of agricultural practices.

    PubMed

    Galan, M B; Peschard, D; Boizard, H

    2007-02-01

    Faced with society's increasing expectations, the Common Agricultural Policy (CAP) review considers environmental management to be an ever more critical criterion in the allocation of farm subsidies. With the goal of evaluating the environmental friendliness of farm practices, France's agricultural research and extension services have built a range of agricultural/environmental diagnostic tools over recent years. The objective of the present paper is to compare the five tools most frequently used in France: IDEA, DIAGE, DIALECTE, DIALOGUE and INDIGO. All the tools have the same purpose: evaluation of the impact of farm practices on the environment via indicators and monitoring of farm management practices. When tested on a sample of large-scale farms in Picardie, the five tools sometimes produced completely different results: for a given farm, the supposedly most significant environmental impacts depend on the tool used. These results lead to differing environmental management plans and raise the question of the methods' pertinence. An analysis grid of the diagnostic tools, aimed at specifying their fields of validity, limits and relevance, was drawn up. The resulting comparative analysis defines each tool's domain of validity and suggests lines of thought for developing more relevant tools for (i) evaluating a farm's environmental performance and (ii) helping farmers to develop a plan for improving practices within the framework of an environmental management system. PMID:17084504

  15. Animal Disease Import Risk Analysis--a Review of Current Methods and Practice.

    PubMed

    Peeler, E J; Reese, R A; Thrush, M A

    2015-10-01

    The application of risk analysis to the spread of disease with international trade in animals and their products, that is, import risk analysis (IRA), has been largely driven by the Sanitary and Phytosanitary (SPS) agreement of the World Trade Organization (WTO). The degree to which the IRA standard established by the World Organization for Animal Health (OIE), and associated guidance, meets the needs of the SPS agreement is discussed. The use of scenario trees is the core modelling approach used to represent the steps necessary for the hazard to occur. There is scope to elaborate scenario trees for commodity IRA so that the quantity of hazard at each step is assessed, which is crucial to the likelihood of establishment. The dependence between exposure and establishment suggests that they should fall within the same subcomponent. IRA undertaken for trade reasons must include an assessment of consequences to meet SPS criteria, but guidance is sparse. The integration of epidemiological and economic modelling may open a path for better methods. Matrices have been used in qualitative IRA to combine estimates of entry and exposure, and consequences with likelihood, but this approach has flaws and better methods are needed. OIE IRA standards and guidance indicate that the volume of trade should be taken into account, but offer no detail. Some published qualitative IRAs have assumed current levels and patterns of trade without specifying the volume of trade, which constrains the use of IRA to determine mitigation measures (to reduce risk to an acceptable level) and whether the principle of equivalence, fundamental to the SPS agreement, has been observed. It is questionable whether qualitative IRA can meet all the criteria set out in the SPS agreement. Nevertheless, scope exists to elaborate the current standards and guidance, so they better serve the principle of science-based decision-making. PMID:24237667
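The matrix combination of entry and exposure likelihoods mentioned above can be sketched as a simple lookup. This toy version uses a common conservative rule, that the joint likelihood can be no higher than the lower of the two inputs; the categories and rule are invented for illustration, and, as the paper notes, such matrix approaches have recognised flaws:

```python
# Illustrative qualitative risk matrix of the kind used in import risk
# analysis to combine two likelihood estimates (categories invented).
levels = ["negligible", "low", "medium", "high"]
rank = {name: i for i, name in enumerate(levels)}

def combine(entry, exposure):
    """Combine entry and exposure likelihoods conservatively:
    the joint likelihood is capped by the lower of the two."""
    return levels[min(rank[entry], rank[exposure])]

risk = combine("high", "low")
```

One documented flaw of this approach is that repeated application of such rules can drive every multi-step pathway toward "negligible", regardless of trade volume.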

  16. Empowering Discourse: Discourse Analysis as Method and Practice in the Sociology Classroom

    ERIC Educational Resources Information Center

    Hjelm, Titus

    2013-01-01

    Collaborative learning and critical pedagogy are widely recognized as "empowering" pedagogies for higher education. Yet, the practical implementation of both has a mixed record. The question, then, is: How could collaborative and critical pedagogies be empowered themselves? This paper makes a primarily theoretical case for discourse…

  17. Evaluating the clinical appropriateness of nurses' prescribing practice: method development and findings from an expert panel analysis

    PubMed Central

    Latter, Sue; Maben, Jill; Myall, Michelle; Young, Amanda

    2007-01-01

    Background: The number of nurses independently prescribing medicines in England is rising steadily. There had been no attempt systematically to evaluate the clinical appropriateness of nurses' prescribing decisions. Aims: (i) To establish a method of assessing the clinical appropriateness of nurses' prescribing decisions; (ii) to evaluate the prescribing decisions of a sample of nurses, using this method. Method: A modified version of the Medication Appropriateness Index (MAI) was developed, piloted and subsequently used by seven medical prescribing experts to rate transcripts of 12 nurse prescriber consultations selected from a larger database of 118 audio‐recorded consultations collected as part of a national evaluation. Experts were also able to give written qualitative comments on each of the MAI dimensions applied to each of the consultations. Analysis: Experts' ratings were analysed using descriptive statistics. Qualitative comments were subjected to a process of content analysis to identify themes within and across both MAI items and consultations. Results: Experts' application of the modified MAI to transcripts of nurse prescriber consultations demonstrated validity and feasibility as a method of assessing the clinical appropriateness of nurses' prescribing decisions. In the majority of assessments made by the expert panel, nurses' prescribing decisions were rated as clinically appropriate on all nine items in the MAI. Conclusion: A valid and feasible method of assessing the clinical appropriateness of nurses' prescribing practice has been developed using a modified MAI and transcripts of audio‐recorded consultations sent to a panel of prescribing experts. Prescribing nurses in this study were generally considered to be making clinically appropriate prescribing decisions.
This approach to measuring prescribing appropriateness could be used as part of quality assurance in routine practice, as a method of identifying continuing professional development needs, or in future research as the expansion of non‐medical prescribing continues. PMID:18055884

  18. APA's Learning Objectives for Research Methods and Statistics in Practice: A Multimethod Analysis

    ERIC Educational Resources Information Center

    Tomcho, Thomas J.; Rice, Diana; Foels, Rob; Folmsbee, Leah; Vladescu, Jason; Lissman, Rachel; Matulewicz, Ryan; Bopp, Kara

    2009-01-01

    Research methods and statistics courses constitute a core undergraduate psychology requirement. We analyzed course syllabi and faculty self-reported coverage of both research methods and statistics course learning objectives to assess the concordance with APA's learning objectives (American Psychological Association, 2007). We obtained a sample of…

  19. Practical Thermal Evaluation Methods for HAC Fire Analysis in Type B Radioactive Material (RAM) Packages

    SciTech Connect

    Abramczyk, Glenn; Hensel, Stephen J; Gupta, Narendra K.

    2013-03-28

    Title 10 of the United States Code of Federal Regulations Part 71 for the Nuclear Regulatory Commission (10 CFR Part 71.73) requires that Type B radioactive material (RAM) packages satisfy certain Hypothetical Accident Conditions (HAC) thermal design requirements to ensure package safety during accidental fire conditions. Compliance with the thermal design requirements can be met by prototype tests, by analysis only, or by a combination of tests and analyses. Normally it is impractical to meet all the HAC requirements using tests alone, and purely analytical methods are too complex due to the multi-physics, nonlinear nature of the fire event. Therefore a combination of tests and thermal analysis methods using commercial heat-transfer software is used to meet the design requirements. The authors, along with their colleagues at Savannah River National Laboratory in Aiken, SC, USA, have successfully used this 'tests and analyses' approach in the design and certification of several United States DOE/NNSA certified packages, e.g. the 9975, 9977, 9978, 9979, H1700, and Bulk Tritium Shipping Package (BTSP). This paper describes these methods in the hope that RAM Type B package designers and analysts can use them in their applications.
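The regulatory fire condition (a fully engulfing 800 degree C fire for 30 minutes, per 10 CFR 71.73) can be bounded with a lumped-capacitance sketch before committing to detailed heat-transfer software. The package mass, area, and material properties below are invented; a licensing analysis uses detailed modeling and prototype tests:

```python
# Lumped-capacitance bound on package shell heating during the regulatory
# fire (800 deg C, 30 min, fully engulfing, per 10 CFR 71.73). The package
# mass, area, and properties below are invented.
sigma = 5.67e-8            # Stefan-Boltzmann constant, W/m^2/K^4
eps = 0.9                  # effective flame emissivity (assumed)
T_fire = 800.0 + 273.15    # regulatory fire temperature, K
T = 38.0 + 273.15          # initial package temperature, K (assumed ambient)
area = 6.0                 # exposed surface area, m^2 (assumed)
mass = 1500.0              # package thermal mass, kg (assumed)
cp = 500.0                 # specific heat of steel, J/kg/K (approx.)

dt = 1.0                   # s, explicit Euler time step
for _ in range(int(30 * 60 / dt)):                 # 30-minute fire
    q = eps * sigma * area * (T_fire**4 - T**4)    # net radiative heating, W
    T += q * dt / (mass * cp)

peak = T - 273.15          # shell temperature at end of fire, deg C
```

A single thermal mass cannot exceed the fire temperature, so the sketch brackets the shell response; resolving internal gradients and contents temperatures is exactly where the commercial software comes in.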

  20. Genetic counseling practice analysis.

    PubMed

    Hampel, Heather; Grubs, Robin E; Walton, Carol S; Nguyen, Emma; Breidenbach, Daniel H; Nettles, Steve; Callanan, Nancy; Corliss, Meagan; Fox, Stephanie; Hiraki, Susan; Ku, Lisa; Neufeld-Kaiser, Whitney; Riley, Bronson; Taylor, Jamie; Weik, LuAnn

    2009-06-01

    The American Board of Genetic Counseling (ABGC) performed a genetic counseling practice analysis (PA) to determine the content of the certification examination. The ABGC-appointed PA Advisory Committee worked with psychometricians to develop a survey which was distributed to 2,038 genetic counselors in the United States and Canada. The survey was also accessible on the ABGC website. Multiple criteria were used to establish the significance of the tasks included in the survey. A total of 677 responses were used in the analysis, representing a 37.1% corrected response rate. Five major content domains with 143 tasks were identified in the PA. New certification test specifications were developed on the basis of PA results and will be used in developing future examination forms. In keeping with credentialing standards, ABGC plans to conduct a PA on a regular basis so that the content of the examination reflects current practice. PMID:19277852

  1. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: practical guidelines for the crystalline sponge method

    SciTech Connect

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng

    2015-01-01

    This report describes complete practical guidelines and insights for the crystalline sponge method, which have been derived through the first use of synchrotron radiation on these systems, and includes a procedure for faster synthesis of the sponges. These guidelines will be applicable to crystal sponge data collected at synchrotrons or in-house facilities, and will allow researchers to obtain reliable high-quality data and construct chemically and physically sensible models for guest structural determination. A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method, based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation, is reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, two of which proved unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation, with data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation that created a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process was developed through these studies.
In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high-quality data, and will allow construction of chemically and physically sensible models for guest structural determination.

  2. A Critical Analysis of SocINDEX and Sociological Abstracts Using an Evaluation Method for the Practicing Bibliographer

    ERIC Educational Resources Information Center

    Mellone, James T.

    2010-01-01

    This study provides a database evaluation method for the practicing bibliographer that is more than a brief review yet less than a controlled experiment. The author establishes evaluation criteria in the context of the bibliographic instruction provided to meet the research requirements of undergraduate sociology majors at Queens College, City…

  3. Insight into Evaluation Practice: A Content Analysis of Designs and Methods Used in Evaluation Studies Published in North American Evaluation-Focused Journals

    ERIC Educational Resources Information Center

    Christie, Christina A.; Fleischer, Dreolin Nesbitt

    2010-01-01

    To describe the recent practice of evaluation, specifically method and design choices, the authors performed a content analysis on 117 evaluation studies published in eight North American evaluation-focused journals for a 3-year period (2004-2006). The authors chose this time span because it follows the scientifically based research (SBR)…

  4. The gap between legal rules and practice in advertising non-registered pharmaceutical products. A new method of analysis.

    PubMed

    Wieringa, N F; de Meijer, A H; Schutjens, M D; Vos, R

    1992-12-01

    The market of non-registered pharmaceutical products is growing fast in number and overall costs, not only in the Netherlands, but also in other European countries. These products often give the impression that the consumer may expect 'an effect as from a drug'. Legally, there is a clear distinction between 'drugs' and 'commodities' in the Netherlands; the question is whether legislation and practice concur. In an investigation we analysed texts of advertisements for non-registered pharmaceutical products published in a popular magazine. A method was developed, based on the legal definition of a drug and jurisprudence, to determine in a qualitative and quantitative way the application of medicinal claims. It transpired that in 65% of the analysed advertisements explicit or implicit claims were made. These products should therefore be subject to drugs legislation. Thus, in the Netherlands there is a gap between legislation and practice in advertising non-registered pharmaceutical products. PMID:1485197

  5. Qualitative Approaches to Mixed Methods Practice

    ERIC Educational Resources Information Center

    Hesse-Biber, Sharlene

    2010-01-01

    This article discusses how methodological practices can shape and limit how mixed methods is practiced and makes visible the current methodological assumptions embedded in mixed methods practice that can shut down a range of social inquiry. The article argues that there is a "methodological orthodoxy" in how mixed methods is practiced that…

  6. [Practical risk analysis].

    PubMed

    Lisbona, A; Valero, M

    2015-10-01

    Risk analysis is typically considered from two complementary points of view: predictive analysis, performed beforehand, and retrospective analysis, which follows the internal reporting of adverse situations or malfunctions, covering organizational as well as material and human aspects. The purpose of these complementary analyses is to ensure that planned or implemented measures keep risks at a level deemed tolerable or acceptable at a given time and in a given situation. Where a risk is deemed unacceptable, risk reduction measures should be considered (prevention, limitation of consequences, and protection). PMID:26362221

  7. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory-- Determination of Dissolved Organic Carbon in Water by High Temperature Catalytic Oxidation, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Bird, Susan M.; Fram, Miranda S.; Crepeau, Kathryn L.

    2003-01-01

    An analytical method has been developed for the determination of dissolved organic carbon concentration in water samples. This method includes the results of the tests used to validate the method and the quality-control practices used for dissolved organic carbon analysis. Prior to analysis, water samples are filtered to remove suspended particulate matter. A Shimadzu TOC-5000A Total Organic Carbon Analyzer in the nonpurgeable organic carbon mode is used to analyze the samples by high temperature catalytic oxidation. The analysis usually is completed within 48 hours of sample collection. The laboratory reporting level is 0.22 milligrams per liter.

  8. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: practical guidelines for the crystalline sponge method

    PubMed Central

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. 
The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high-quality data, and will allow construction of chemically and physically sensible models for guest structural determination. PMID:25537388

  9. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: practical guidelines for the crystalline sponge method.

    PubMed

    Ramadhar, Timothy R; Zheng, Shao Liang; Chen, Yu Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal-organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. 
The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high-quality data, and will allow construction of chemically and physically sensible models for guest structural determination. PMID:25537388

  10. Systematic Review and Meta-Analysis of Studies Evaluating Diagnostic Test Accuracy: A Practical Review for Clinical Researchers-Part II. Statistical Methods of Meta-Analysis

    PubMed Central

    Lee, Juneyoung; Kim, Kyung Won; Choi, Sang Hyun; Huh, Jimi

    2015-01-01

    Meta-analysis of diagnostic test accuracy studies differs from the usual meta-analysis of therapeutic/interventional studies in that it requires the simultaneous analysis of a pair of outcome measures, such as sensitivity and specificity, instead of a single outcome. Since sensitivity and specificity are generally inversely correlated and can be affected by a threshold effect, more sophisticated statistical methods are required for the meta-analysis of diagnostic test accuracy. Hierarchical models, including the bivariate model and the hierarchical summary receiver operating characteristic model, are increasingly accepted as standard methods for meta-analysis of diagnostic test accuracy studies. We provide a conceptual review of statistical methods currently used and recommended for meta-analysis of diagnostic test accuracy studies. This article could serve as a methodological reference for those who perform systematic review and meta-analysis of diagnostic test accuracy studies. PMID:26576107
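The threshold effect described in this abstract is precisely why naive univariate pooling of each measure is discouraged; still, the inverse-variance pooling on the logit scale that the hierarchical models generalize can be sketched. This is a generic illustration, not code from the article, and the function names and study values are hypothetical:

```python
import math

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(x):
    return 1 / (1 + math.exp(-x))

def pool_proportions(props, ns):
    """Fixed-effect inverse-variance pooling of proportions on the logit
    scale; var(logit(p)) is approximately 1 / (n * p * (1 - p)), so the
    weight of each study is the reciprocal of that variance."""
    weights = [n * p * (1 - p) for p, n in zip(props, ns)]
    pooled = sum(w * logit(p) for w, p in zip(weights, props)) / sum(weights)
    return inv_logit(pooled)

# Pooling the sensitivities of two hypothetical studies (0.80 and 0.90,
# each with n=100). The bivariate model would instead pool sensitivity
# and specificity jointly, modeling their negative correlation.
pooled_sens = pool_proportions([0.80, 0.90], [100, 100])
```

This univariate sketch is exactly what the article argues is insufficient on its own, which motivates the bivariate and HSROC models it reviews.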

  11. Practical method for analysis and design of slender reinforced concrete columns subjected to biaxial bending and axial loads

    NASA Astrophysics Data System (ADS)

    Bouzid, T.; Demagh, K.

    2011-03-01

    Reinforced and concrete-encased composite columns of arbitrarily shaped cross sections subjected to biaxial bending and axial loads are commonly used in many structures. For this purpose, an iterative numerical procedure for the strength analysis and design of short and slender reinforced concrete columns with a square cross section under biaxial bending and an axial load, using an EC2 stress-strain model, is presented in this paper. The computational procedure takes into account the nonlinear behavior of the materials (i.e., concrete and reinforcing bars) and includes the second-order effects due to the additional eccentricity of the applied axial load via the Moment Magnification Method. The proposed method and its formulation have been tested by comparing their results with experimental ones reported by several authors. This comparison shows a good degree of agreement and accuracy between the experimental and theoretical results. An average ratio (proposed to test) of 1.06 with a deviation of 9% is achieved.

  12. Practical state of health estimation of power batteries based on Delphi method and grey relational grade analysis

    NASA Astrophysics Data System (ADS)

    Sun, Bingxiang; Jiang, Jiuchun; Zheng, Fangdan; Zhao, Wei; Liaw, Bor Yann; Ruan, Haijun; Han, Zhiqiang; Zhang, Weige

    2015-05-01

    The state of health (SOH) estimation is critical to the battery management system to ensure the safety and reliability of EV battery operation. Here, we used a unique hybrid approach to enable complex SOH estimations. The approach hybridizes the Delphi method, known for its simplicity and effectiveness in applying weighting factors for complicated decision-making, and grey relational grade analysis (GRGA) for multi-factor optimization. Six critical factors were considered for SOH estimation: peak power at 30% state-of-charge (SOC); capacity; the voltage drop at 30% SOC with a C/3 pulse; the temperature rise at the end of discharge and of charge at 1C, respectively; and the open-circuit voltage at the end of charge after a 1-h rest. The weighting of these factors for SOH estimation was scored by the 'experts' in the Delphi method, indicating the influencing power of each factor on SOH. The parameters of these factors expressing the battery state variations are optimized by GRGA. Eight battery cells were used to illustrate the principle and methodology of estimating SOH by this hybrid approach, and the results were compared with those based on capacity and power capability. The contrast among the different SOH estimations is discussed.
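As a rough illustration of the GRGA step (not the authors' code; the factor values, expert weights, and the distinguishing coefficient rho = 0.5 are all hypothetical), a weighted grey relational grade of a cell's normalized factor vector against a reference vector can be computed as:

```python
def grey_relational_grade(reference, candidate, weights, rho=0.5):
    """Weighted grey relational grade of one cell's factor vector against
    a reference (e.g. fresh-cell) vector; factors are assumed normalized
    to [0, 1]. rho is the customary distinguishing coefficient."""
    deltas = [abs(r - c) for r, c in zip(reference, candidate)]
    d_min, d_max = min(deltas), max(deltas)
    if d_max == 0:  # candidate identical to the reference
        return sum(weights)
    # Grey relational coefficient per factor: closer to the reference
    # means a coefficient closer to 1.
    coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in deltas]
    # Delphi-style expert weights combine the per-factor coefficients.
    return sum(w * x for w, x in zip(weights, coeffs))

# Three hypothetical normalized factors, expert weights summing to 1.
grade = grey_relational_grade([1.0, 1.0, 1.0], [0.9, 0.8, 1.0],
                              [0.5, 0.3, 0.2])
```

A higher grade indicates a state closer to the reference condition, so the grade itself can serve as a multi-factor SOH indicator.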

  13. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: Practical guidelines for the crystalline sponge method

    SciTech Connect

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. 
The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high-quality data, and will allow construction of chemically and physically sensible models for guest structural determination.

  14. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: Practical guidelines for the crystalline sponge method

    DOE PAGES Beta

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed.
The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high-quality data, and will allow construction of chemically and physically sensible models for guest structural determination.

  15. Long alternating codes: 2. Practical search method

    NASA Astrophysics Data System (ADS)

    Markkanen, Markku; Nygrén, Tuomo

    1997-01-01

    This paper is the second one in a series explaining a new search method of long alternating codes for incoherent scatter radars. The first paper explains the general idea of the method in terms of a special game of dominoes. This second paper gives an alternative mathematical formalism suitable for computer search. It consists of three rules and a mathematical analysis leading to a formula which can be used in practical search. Although the rules were originally experimental, a mathematical proof of their sufficiency is also given. The method has been used to make a complete search up to a length of 1,048,576 bits. Even longer codes have been found; the longest one known at the moment contains 4,194,304 bits. For demonstration, complete tables of 8-, 16-, 32-, and 64-bit codes and examples of 128- and 256-bit codes are presented.

  16. [Practice analysis, time for oneself].

    PubMed

    Larrose, Bruno

    2014-10-01

    The emotional strain which comes with caregiving affects the feeling of wellbeing at work. The patient's psychological space resonates with that of the caregivers, often without them realising it. By offering a form of support which enables caregivers to regulate difficult psychological tensions and maintain their motivation for the work, the practice analysis group helps to preserve quality of life at work. PMID:26050403

  17. Methods of Genomic Competency Integration in Practice

    PubMed Central

    Jenkins, Jean; Calzone, Kathleen A.; Caskey, Sarah; Culp, Stacey; Weiner, Marsha; Badzek, Laurie

    2015-01-01

    Purpose Genomics is increasingly relevant to health care, necessitating support for nurses to incorporate genomic competencies into practice. The primary aim of this project was to develop, implement, and evaluate a year-long genomic education intervention that trained, supported, and supervised institutional administrator and educator champion dyads to increase nursing capacity to integrate genomics through assessments of program satisfaction and institutional achieved outcomes. Design Longitudinal study of 23 Magnet Recognition Program® Hospitals (21 intervention, 2 controls) participating in a 1-year new competency integration effort aimed at increasing genomic nursing competency and overcoming barriers to genomics integration in practice. Methods Champion dyads underwent genomic training consisting of one in-person kick-off training meeting followed by monthly education webinars. Champion dyads designed institution-specific action plans detailing objectives, methods or strategies used to engage and educate nursing staff, timeline for implementation, and outcomes achieved. Action plans focused on a minimum of seven genomic priority areas: champion dyad personal development; practice assessment; policy content assessment; staff knowledge needs assessment; staff development; plans for integration; and anticipated obstacles and challenges. Action plans were updated quarterly, outlining progress made as well as inclusion of new methods or strategies. Progress was validated through virtual site visits with the champion dyads and chief nursing officers. Descriptive data were collected on all strategies or methods utilized, and timeline for achievement. Descriptive data were analyzed using content analysis. Findings The complexity of the competency content and the uniqueness of social systems and infrastructure resulted in a significant variation of champion dyad interventions. 
Conclusions Nursing champions can facilitate change in genomic nursing capacity through varied strategies but require substantial training in order to design and implement interventions. Clinical Relevance Genomics is critical to the practice of all nurses. There is a great opportunity and interest to address genomic knowledge deficits in the practicing nurse workforce as a strategy to improve patient outcomes. Exemplars of champion dyad interventions designed to increase nursing capacity focus on improving education, policy, and healthcare services. PMID:25808828

  18. Development and application to clinical practice of a validated HPLC method for the analysis of β-glucocerebrosidase in Gaucher disease.

    PubMed

    Colomer, E Gras; Gómez, M A Martínez; Alvarez, A González; Martí, M Climente; Moreno, P León; Zarzoso, M Fernández; Jiménez-Torres, N V

    2014-03-01

    The main objective of our study is to develop a simple, fast and reliable method for measuring β-glucocerebrosidase activity in the leukocytes of Gaucher patients in clinical practice. This measurement may be a useful marker to drive dose selection and early clinical decision-making in enzyme replacement therapy. We measure the enzyme activity by high-performance liquid chromatography with ultraviolet detection, using 4-nitrophenyl-β-d-glucopyranoside as the substrate. A cohort of eight Gaucher patients treated with enzyme replacement therapy and ten healthy controls were tested; the median enzyme activity was 20.57 mU/ml (interquartile range 19.92-21.53 mU/ml) in patients and the mean was 24.73 mU/ml (24.12-25.34 mU/ml) in the reference group, which allowed the establishment of the normal range of β-glucocerebrosidase activity. The proposed method for measuring leukocyte glucocerebrosidase activity is fast, easy to use, inexpensive and reliable. Furthermore, significant differences between the two populations were observed (p=0.008). This suggests that discerning between patients and healthy individuals, and providing an approach to enzyme dosage optimization, is feasible. The method could be considered a decision-support tool for clinical monitoring. Our study is a first approach to in-depth analysis of enzyme replacement therapy and optimization of dosing therapies. PMID:24447963

  19. A Method for Optimizing Waste Management and Disposal Practices Using a Group-Based Uncertainty Model for the Analysis of Characterization Data - 13191

    SciTech Connect

    Simpson, A.; Clapham, M.; Lucero, R.; West, J.

    2013-07-01

    It is a universal requirement for the characterization of radioactive waste that the consignor calculate and report a Total Measurement Uncertainty (TMU) value associated with each measured quantity, such as nuclide activity. For non-destructive assay systems, the TMU analysis is typically performed on an individual-container basis. However, in many cases the waste consignor treats, transports, stores and disposes of containers in groups, for example by over-packing smaller containers into a larger container or emplacing containers into groups for final disposal. The current standard practice for container-group data analysis is usually to treat each container as independent and uncorrelated and to use a simple summation/averaging method (or, in some cases, summation of TMU in quadrature) to define the overall characteristics and associated uncertainty of the container group. In reality, many groups of containers are assayed on the same system, so there will be a large degree of co-dependence among the individual uncertainty elements. Many uncertainty terms may be significantly reduced when addressing issues such as source position and variability in matrix contents over large populations. The systematic terms encompass both inherently 'two-directional' random effects (e.g., variation of source position) and other terms that are 'one-directional', i.e., designed to account for potential sources of bias. An analysis has been performed with population groups on a variety of non-destructive assay platforms in order to define a quantitative mechanism for waste consignors to determine the overall TMU for batches of containers that have been assayed on the same system. (authors)
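The distinction between independent and correlated terms can be sketched as follows. This is a generic illustration of quadrature versus linear combination of uncertainties, not the paper's actual model, and all container values are hypothetical:

```python
import math

def group_total_and_tmu(activities, random_u, correlated_u):
    """Total activity and TMU for a container group.

    activities:   measured activities per container
    random_u:     independent ('two-directional') uncertainties, which
                  add in quadrature across containers
    correlated_u: systematic uncertainties shared across containers
                  assayed on the same system, which add linearly
    """
    total = sum(activities)
    u_rand = math.sqrt(sum(u ** 2 for u in random_u))
    u_corr = sum(correlated_u)
    # The two classes combine in quadrature with each other.
    tmu = math.sqrt(u_rand ** 2 + u_corr ** 2)
    return total, tmu

# Two hypothetical containers: summing the correlated terms linearly
# yields a larger group TMU than treating every term as independent.
total, tmu = group_total_and_tmu([10.0, 12.0], [1.0, 1.0], [0.5, 0.6])
```

Treating the correlated terms as independent would shrink them by roughly the square root of the group size, which is exactly the overstatement of precision the paper's group-based model is designed to avoid.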

  20. Evaluation of agricultural best-management practices in the Conestoga River headwaters, Pennsylvania; methods of data collection and analysis and description of study areas

    USGS Publications Warehouse

    Chichester, Douglas C.

    1988-01-01

    The U.S. Geological Survey is conducting a water-quality study as part of the nationally implemented Rural Clean Water Program in the headwaters of the Conestoga River, Pennsylvania. The study, which began in 1982, was designed to determine the effect of agricultural best-management practices on surface- and groundwater quality. The study was concentrated in four areas within the intensively farmed carbonate-rock terrane located predominantly in Lancaster County, Pennsylvania. These areas were divided into three monitoring components: (1) a Regional study area (188 sq mi); (2) a Small Watershed study area (5.82 sq mi); and (3) two field-site study areas, Field-Site 1 (22.1 acres) and Field-Site 2 (47.5 acres). The types of water-quality data and the methods of data collection and analysis are presented. The monitoring strategy and a description of the study areas are discussed. The locations and descriptions of all data-collection sites in the four study areas are provided. (USGS)

  1. The Sherlock Holmes method in clinical practice.

    PubMed

    Sopeña, B

    2014-04-01

    This article lists the integral elements of the Sherlock Holmes method, which is based on the intelligent collection of information through detailed observation, careful listening and thorough examination. The information thus obtained is analyzed to develop the main and alternative hypotheses, which are shaped during the deductive process until the key leading to the solution is revealed. The Holmes investigative method applied to clinical practice highlights the advisability of having physicians reason through and seek out the causes of the disease with the data obtained from acute observation, a detailed review of the medical history and careful physical examination. PMID:24457141

  2. A practical and nontarnishing method for the analysis of trace nickel in hydrogenated cottonseed oil by inductively coupled plasma/mass spectrometry with pressurized PTFE vessel acid digestion.

    PubMed

    Zhang, Ni; Ding, Zhiying; Li, Hao; Wang, Xin; Shao, Xiaodong

    2010-01-01

    A practical and nontarnishing method for the determination of trace nickel (Ni) in hydrogenated cottonseed oil by inductively coupled plasma/mass spectrometry (ICP/MS) was developed. In order to avoid tarnishing in the pretreatment of samples, the technology of pressurized PTFE vessel acid digestion was applied. The temperature and acid content in the digestion were optimized. The results showed that hydrogenated cottonseed oil could be digested completely by the proposed method. Compared with the U.S. Pharmacopeia 28 and British Pharmacopoeia 2003 methods, the developed method avoided the risk of using platinum and the tarnish from silica crucibles. In addition, the analytical cycle of the test solution was shortened by the use of ICP/MS instead of graphite furnace atomic absorption spectrophotometry. PMID:20334194

  3. Practical method for balancing airplane moments

    NASA Technical Reports Server (NTRS)

    Hamburger, H

    1924-01-01

    The present contribution is the sequel to a paper written by Messrs. R. Fuchs, L. Hopf, and H. Hamburger, and proposes to show that the methods therein contained can be practically utilized in computations. Furthermore, the calculations leading up to the diagram of moments for three airplanes, whose performance in war service gave reason for complaint, are analyzed. Finally, it is shown what conclusions can be drawn from the diagram of moments with regard to the defects in these planes and what steps may be taken to remedy them.

  4. Practical Considerations for Using Exploratory Factor Analysis in Educational Research

    ERIC Educational Resources Information Center

    Beavers, Amy S.; Lounsbury, John W.; Richards, Jennifer K.; Huck, Schuyler W.; Skolits, Gary J.; Esquivel, Shelley L.

    2013-01-01

    The uses and methodology of factor analysis are widely debated and discussed, especially the issues of rotational use, methods of confirmatory factor analysis, and adequate sample size. The variety of perspectives and often conflicting opinions can lead to confusion among researchers about best practices for using factor analysis. The focus of the…

  5. Comprehensive rotorcraft analysis methods

    NASA Technical Reports Server (NTRS)

    Stephens, Wendell B.; Austin, Edward E.

    1988-01-01

    The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustic, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).

  6. Children's Developing Mathematics in Collective Practices: A Framework for Analysis.

    ERIC Educational Resources Information Center

    Saxe, Geoffrey B.

    2002-01-01

    Presents a cultural-developmental framework for the analysis of children's mathematics in collective practices and illustrates the heuristic value of the framework through the analysis of videotaped episodes drawn from a middle-school classroom. Discusses the promise and limitations of the framework as a method for furthering understanding of the…

  7. System based practice: a concept analysis

    PubMed Central

    YAZDANI, SHAHRAM; HOSSEINI, FAKHROLSADAT; AHMADY, SOLEIMAN

    2016-01-01

    Introduction Systems-Based Practice (SBP) is one of the six competencies introduced by the ACGME for physicians to provide high-quality care, and also the most challenging of them in performance, training, and evaluation of medical students. This concept analysis clarifies the concept of SBP by identifying its components to make it possible to differentiate it from other similar concepts. For proper training of SBP and to ensure these competencies in physicians, it is necessary to have an operational definition, and SBP’s components must be precisely defined in order to provide valid and reliable assessment tools. Methods Walker & Avant’s approach to concept analysis was performed in eight stages: choosing a concept, determining the purpose of analysis, identifying all uses of the concept, defining attributes, identifying a model case, identifying borderline, related, and contrary cases, identifying antecedents and consequences, and defining empirical referents. Results Based on the analysis undertaken, the attributes of SBP include knowledge of the system, balanced decisions between patients’ needs and system goals, effective role-playing in the interprofessional health care team, system-level health advocacy, and acting for system improvement. System thinking and a functional system are antecedents, and system goals are consequences. A model case, as well as borderline and contrary cases of SBP, has been introduced. Conclusion The identification of SBP attributes in this study contributes to the body of knowledge on SBP and reduces the ambiguity of this concept, making it possible to apply it in the training of different medical specialties. Also, it would be possible to develop and use more precise tools to evaluate SBP competency by using the empirical referents of the analysis. PMID:27104198

  8. Diagnostic use of an analysis of urinary proteins by a practicable sodium dodecyl sulfate-electrophoresis method and rapid two-dimensional electrophoresis.

    PubMed

    Lapin, A; Gabl, F; Kopsa, H

    1989-01-01

    Two methods suitable for routine clinical analyses of urinary proteins are presented and compared. The first is a horizontal sodium dodecyl sulfate-polyacrylamide gel electrophoresis technique, suitable for simultaneous analysis of 20 native urinary samples. This method uses polyacrylamide gradient gels, prepared with a laboratory-built gel casting device. The second method is a rapid two-dimensional electrophoresis procedure, combining cellulose acetate electrophoresis and sodium dodecyl sulfate-electrophoresis. The first step uses a routine system (Chemetron), the second separation step followed by staining with Coomassie Brilliant Blue R is performed on the PhastSystem. The resulting two-dimensional patterns reveal urinary proteins distributed according to the 5-zone pattern of native proteins (albumin, alpha-1, alpha-2, beta, gamma-globulin) as well as to the logarithm of their molecular weights. Examples of (routine) diagnoses with a special interest in the monitoring of kidney transplant patients are shown. PMID:2806208

  9. An Online Forum As a Qualitative Research Method: Practical Issues

    PubMed Central

    Im, Eun-Ok; Chee, Wonshik

    2008-01-01

    Background Despite positive aspects of online forums as a qualitative research method, very little is known about the practical issues involved in using them for data collection, especially in a qualitative research project. Objectives The purpose of this paper is to describe the practical issues that the researchers encountered in implementing an online forum as the qualitative component of a larger study on the cancer pain experience. Method Throughout the study, the research staff recorded issues, ranging from minor technical problems to serious ethical dilemmas, as they arose and wrote memos about them. The memos and written records of discussions were reviewed and analyzed using the content analysis suggested by Weber. Results Two practical issues related to credibility were identified: a high response and retention rate, and automatic transcripts. An issue related to dependability was the participants’ forgetfulness. The issues related to confirmability were difficulties in reaching theoretical saturation and unstandardized computer and Internet jargon. A security issue related to hacking attempts was noted as well. Discussion The analysis of these issues suggests several implications for future researchers who want to use online forums as a qualitative data collection method. PMID:16849979

  10. A collection of research reporting, theoretical analysis, and practical applications in science education: Examining qualitative research methods, action research, educator-researcher partnerships, and constructivist learning theory

    NASA Astrophysics Data System (ADS)

    Hartle, R. Todd

    2007-12-01

    Educator-researcher partnerships are increasingly being used to improve the teaching of science. Chapter 1 provides a summary of the literature concerning partnerships, and examines the justification of qualitative methods in studying these relationships. It also justifies the use of Participatory Action Research (PAR). Empirically-based studies of educator-researcher partnership relationships are rare despite investments in their implementation by the National Science Foundation (NSF) and others. Chapter 2 describes a qualitative research project in which participants in an NSF GK-12 fellowship program were studied using informal observations, focus groups, personal interviews, and journals to identify and characterize the cultural factors that influenced the relationships between the educators and researchers. These factors were organized into ten critical axes encompassing a range of attitudes, behaviors, or values defined by two stereotypical extremes. These axes were: (1) Task Dictates Context vs. Context Dictates Task; (2) Introspection vs. Extroversion; (3) Internal vs. External Source of Success; (4) Prior Planning vs. Implementation Flexibility; (5) Flexible vs. Rigid Time Sense; (6) Focused Time vs. Multi-tasking; (7) Specific Details vs. General Ideas; (8) Critical Feedback vs. Encouragement; (9) Short Procedural vs. Long Content Repetition; and (10) Methods vs. Outcomes are Well Defined. Another ten important stereotypical characteristics, which did not fit the structure of an axis, were identified and characterized. The educator stereotypes were: (1) Rapport/Empathy; (2) Like Kids; (3) People Management; (4) Communication Skills; and (5) Entertaining. The researcher stereotypes were: (1) Community Collaboration; (2) Focus Intensity; (3) Persistent; (4) Pattern Seekers; and (5) Curiosity/Skeptical. Chapter 3 summarizes the research presented in chapter 2 into a practical guide for participants and administrators of educator-researcher partnerships. 
Understanding how to identify and evaluate constructivist lessons is the first step in promoting and improving constructivism in teaching. Chapter 4 summarizes a theoretically-generated series of practical criteria that define constructivism: (1) Eliciting Prior Knowledge, (2) Creating Cognitive Dissonance, (3) Application of New Knowledge with Feedback, and (4) Reflection on Learning, or Metacognition. These criteria can be used by any practitioner to evaluate the level of constructivism used in a given lesson or activity.

  11. Achieving Integration in Mixed Methods Designs—Principles and Practices

    PubMed Central

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-01-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

  13. Method of analysis at the U.S. Geological Survey California Water Science Center, Sacramento Laboratory - determination of haloacetic acid formation potential, method validation, and quality-control practices

    USGS Publications Warehouse

    Zazzi, Barbara C.; Crepeau, Kathryn L.; Fram, Miranda S.; Bergamaschi, Brian A.

    2005-01-01

    An analytical method for the determination of haloacetic acid formation potential of water samples has been developed by the U.S. Geological Survey California Water Science Center Sacramento Laboratory. The haloacetic acid formation potential is measured by dosing water samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine. The haloacetic acids formed are bromochloroacetic acid, bromodichloroacetic acid, dibromochloroacetic acid, dibromoacetic acid, dichloroacetic acid, monobromoacetic acid, monochloroacetic acid, tribromoacetic acid, and trichloroacetic acid. They are extracted, methylated, and then analyzed using a gas chromatograph equipped with an electron capture detector. Method validation experiments were performed to determine the method accuracy, precision, and detection limit for each of the compounds. Method detection limits for these nine haloacetic acids ranged from 0.11 to 0.45 microgram per liter. Quality-control practices include the use of blanks, quality-control samples, calibration verification standards, surrogate recovery, internal standard, matrix spikes, and duplicates.
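
    Detection limits like those quoted above are typically derived from replicate low-level spiked samples. The sketch below shows the common EPA-style calculation, MDL = t(n-1, 0.99) x s; the replicate values are invented for illustration and are not data from the USGS study.

```python
import statistics

def method_detection_limit(replicates, t_value):
    """EPA-style MDL: t * s, where s is the standard deviation of n
    replicate low-level spike measurements and t is the one-tailed
    Student's t at the 99% level with n-1 degrees of freedom."""
    return t_value * statistics.stdev(replicates)

# Seven illustrative replicate results for one analyte, in micrograms per liter.
replicates = [0.42, 0.38, 0.45, 0.40, 0.36, 0.44, 0.41]
T_99_DF6 = 3.143  # Student's t, 99% one-tailed, 6 degrees of freedom
mdl = method_detection_limit(replicates, T_99_DF6)
print(f"MDL = {mdl:.2f} microgram per liter")
```

    With these illustrative replicates the MDL works out to about 0.10 microgram per liter, the same order as the 0.11 to 0.45 range reported in the abstract.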

  14. Method of analysis and quality-assurance practices by the U. S. Geological Survey Organic Geochemistry Research Group; determination of four selected mosquito insecticides and a synergist in water using liquid-liquid extraction and gas chrom

    USGS Publications Warehouse

    Zimmerman, L.R.; Strahan, A.P.; Thurman, E.M.

    2001-01-01

    A method of analysis and quality-assurance practices were developed for the determination of four mosquito insecticides (malathion, methoprene, phenothrin, and resmethrin) and one synergist (piperonyl butoxide) in water. The analytical method uses liquid-liquid extraction (LLE) and gas chromatography/mass spectrometry (GC/MS). Good precision and accuracy were demonstrated in reagent water, urban surface water, and ground water. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 50 nanograms per liter ranged from 68 to 171 percent, with standard deviations in concentrations of 27 nanograms per liter or less. The method detection limit for all compounds was 5.9 nanograms per liter or less for 247-milliliter samples. This method is valuable for acquiring information about the fate and transport of these mosquito insecticides and one synergist in water.

  15. Science Teaching Methods: A Rationale for Practices

    ERIC Educational Resources Information Center

    Osborne, Jonathan

    2011-01-01

    This article is a version of the talk given by Jonathan Osborne as the Association for Science Education (ASE) invited lecturer at the National Science Teachers' Association Annual Convention in San Francisco, USA, in April 2011. The article provides an explanatory justification for teaching about the practices of science in school science that…

  16. [The analysis of the medication error, in practice].

    PubMed

    Didelot, Nicolas; Cistio, Céline

    2016-01-01

    By performing a systemic analysis of medication errors that occur in practice, multidisciplinary teams can prevent their recurrence with the aid of an improvement action plan. The methods used must take into account all the factors that might have contributed to or favoured the occurrence of a medication incident or accident. PMID:27177485

  17. Exploratory and Confirmatory Analysis of the Trauma Practices Questionnaire

    ERIC Educational Resources Information Center

    Craig, Carlton D.; Sprang, Ginny

    2009-01-01

    Objective: The present study provides psychometric data for the Trauma Practices Questionnaire (TPQ). Method: A nationally randomized sample of 2,400 surveys was sent to self-identified trauma treatment specialists, and 711 (29.6%) were returned. Results: An exploratory factor analysis (N = 319) conducted on a randomly split sample (RSS) revealed…

  18. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents was prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents, organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee, consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee, therefore, was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  19. Council on Certification Professional Practice Analysis.

    PubMed

    Zaglaniczny, K L

    1993-06-01

    The CCNA has completed a PPA and will begin implementing its recommendations with the December 1993 certification examination. The results of the PPA provide content validation for the CCNA certification examination. The certification examination is reflective of the knowledge and skill required for entry-level practice. Assessment of this knowledge is accomplished through the use of questions that are based on the areas represented in the content outline. Analysis of the PPA has resulted in changes in the examination content outline and percentages of questions in each area to reflect current entry-level nurse anesthesia practice. The new outline is based on the major domains of knowledge required for nurse anesthesia practice. These changes are justified by the consistency in the responses of the practitioners surveyed. There was overall agreement as to the knowledge and skills related to patient conditions, procedures, agents, techniques, and equipment that an entry-level CRNA must have to practice. Members of the CCNA and Examination Committee will use the revised outline to develop questions for the certification examination. The questions will be focused on the areas identified as requiring high levels of expertise and those that appeared higher in frequency. The PPA survey will be used as a basis for subsequent content validation studies. It will be revised to reflect new knowledge, technology, and techniques related to nurse anesthesia practice. The CCNA has demonstrated its commitment to the certification process through completion of the PPA and implementation of changes in the structure of the examination. PMID:8291387

  20. The 5-Step Method: Principles and Practice

    ERIC Educational Resources Information Center

    Copello, Alex; Templeton, Lorna; Orford, Jim; Velleman, Richard

    2010-01-01

    This article includes a description of the 5-Step Method. First, the origins and theoretical basis of the method are briefly described. This is followed by a discussion of the general principles that guide the delivery of the method. Each step is then described in more detail, including the content and focus of each of the five steps that include:…

  1. Airphoto analysis of erosion control practices

    NASA Technical Reports Server (NTRS)

    Morgan, K. M.; Morris-Jones, D. R.; Lee, G. B.; Kiefer, R. W.

    1980-01-01

    The Universal Soil Loss Equation (USLE) is a widely accepted tool for erosion prediction and conservation planning. In this study, airphoto analysis of color and color infrared 70 mm photography at a scale of 1:60,000 was used to determine the erosion control practice factor in the USLE. Information about contour tillage, contour strip cropping, and grass waterways was obtained from aerial photography for Pheasant Branch Creek watershed in Dane County, Wisconsin.
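
    The practice factor estimated from the airphotos plugs directly into the USLE, A = R x K x LS x C x P. The following minimal sketch uses factor values invented for illustration; none are from the Dane County study.

```python
def usle_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation: A = R * K * LS * C * P,
    average annual soil loss in tons per acre per year."""
    return R * K * LS * C * P

# All factor values below are illustrative placeholders.
R, K, LS, C = 150, 0.28, 1.2, 0.25
a_contour = usle_soil_loss(R, K, LS, C, P=0.5)  # contouring identified on airphotos
a_none = usle_soil_loss(R, K, LS, C, P=1.0)     # no erosion control practice
print(a_contour, a_none)
```

    Because P enters multiplicatively, halving the practice factor halves the predicted soil loss, which is why mapping erosion control practices from airphotos matters for conservation planning.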

  2. The "Anchor" Method: Principle and Practice.

    ERIC Educational Resources Information Center

    Selgin, Paul

    This report discusses the "anchor" language learning method that is based upon derivation rather than construction, using Italian as an example of a language to be learned. This method borrows from the natural process of language learning as it asks the student to remember whole expressions that serve as vehicles for learning both words and rules,…

  3. Scenistic Methods for Training: Applications and Practice

    ERIC Educational Resources Information Center

    Lyons, Paul R.

    2011-01-01

    Purpose: This paper aims to complement an earlier article (2010) in "Journal of European Industrial Training" in which the description and theory bases of scenistic methods were presented. This paper also offers a description of scenistic methods and information on theory bases. However, the main thrust of this paper is to describe, give suggested…

  4. A practical method for sensor absolute calibration.

    PubMed

    Meisenholder, G W

    1966-04-01

    This paper describes a method of performing sensor calibrations using an NBS standard of spectral irradiance. The method shown, among others, was used for calibration of the Mariner IV Canopus sensor. Agreement of inflight response to preflight calibrations performed by this technique has been found to be well within 10%. PMID:20048890

  5. Practice-Focused Ethnographies of Higher Education: Method/ological Corollaries of a Social Practice Perspective

    ERIC Educational Resources Information Center

    Trowler, Paul Richard

    2014-01-01

    Social practice theory addresses both theoretical and method/ological agendas. To date priority has been given to the former, with writing on the latter tending often to be an afterthought to theoretical expositions or fieldwork accounts. This article gives sustained attention to the method/ological corollaries of a social practice perspective. It…

  6. A Practical Guide to Immunoassay Method Validation

    PubMed Central

    Andreasson, Ulf; Perret-Liaudet, Armand; van Waalwijk van Doorn, Linda J. C.; Blennow, Kaj; Chiasserini, Davide; Engelborghs, Sebastiaan; Fladby, Tormod; Genc, Sermin; Kruse, Niels; Kuiperij, H. Bea; Kulic, Luka; Lewczuk, Piotr; Mollenhauer, Brit; Mroczko, Barbara; Parnetti, Lucilla; Vanmechelen, Eugeen; Verbeek, Marcel M.; Winblad, Bengt; Zetterberg, Henrik; Koel-Simmelink, Marleen; Teunissen, Charlotte E.

    2015-01-01

    Biochemical markers have a central position in the diagnosis and management of patients in clinical medicine, as well as in clinical research and drug development, including for brain disorders such as Alzheimer’s disease. The enzyme-linked immunosorbent assay (ELISA) is frequently used for the measurement of low-abundance biomarkers. However, the quality of ELISA methods varies, which may introduce both systematic and random errors. This urges the need for more rigorous control of assay performance, regardless of whether an assay is used in a research setting, in clinical routine, or in drug development. The aim of a method validation is to present objective evidence that a method fulfills the requirements for its intended use. Although much has been published on which parameters to investigate in a method validation, less is available at a detailed level on how to perform the corresponding experiments. To remedy this, standard operating procedures (SOPs) with step-by-step instructions for a number of different validation parameters are included in the present work, together with a validation report template, which allows for a well-ordered presentation of the results. Even though the SOPs were developed with the intended use for immunochemical methods and for multicenter evaluations, most of them are generic and can be used for other technologies as well. PMID:26347708
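
    As a small illustration of the kind of computation such validation SOPs standardize, intra-assay precision (repeatability) is commonly reported as the percent coefficient of variation of replicate measurements. The replicate values below are invented.

```python
import statistics

def percent_cv(replicates):
    """Intra-assay precision: %CV = 100 * sd / mean."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

replicates = [102.0, 98.0, 105.0, 101.0, 99.0]  # one sample measured 5 times, pg/mL
cv = percent_cv(replicates)
print(f"intra-assay CV = {cv:.1f}%")
```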

  7. [Theater of Life: theory, method and practice].

    PubMed

    Santiago, L E

    2000-03-01

    Theater of Life is an educational model that integrates theater theories and techniques as a strategy for health education. Theater as an educational technique has become a useful tool in the health context. In this work the author discusses the role of social change as an important element in health education and suggests the use of theatrical techniques for its promotion. The author also describes the different approaches to popular theater and popular education incorporated in this model (Theatre of the Oppressed by A. Boal, Popular Education by P. Freire, Poor Theater by J. Grotowski, and Education for Peace by C. Beristain and P. Cascón) and explains the basic principles of each. In the methodology section the author explains the different steps in implementing the strategy: solidarity and connection games, the storytelling technique and script development, presentation, and forum. In the practice section the author shares the process of model development and the significant events that have contributed to its elaboration. PMID:10761208

  8. Practice and Evaluation of Blended Learning with Cross-Cultural Distance Learning in a Foreign Language Class: Using Mix Methods Data Analysis

    ERIC Educational Resources Information Center

    Sugie, Satoko; Mitsugi, Makoto

    2014-01-01

    The Information and Communication Technology (ICT) utilization in Chinese as a "second" foreign language has mainly been focused on Learning Management System (LMS), digital material development, and quantitative analysis of learners' grammatical knowledge. There has been little research that has analyzed the effectiveness of…

  10. Complementing Gender Analysis Methods.

    PubMed

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start with the premise that men and women are equal and should be treated equally. These frameworks emphasize equal distribution of resources between men and women and assume that this will bring equality, which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives, such as the power to control resources within social relationships, and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks hold that patriarchy as an institution plays an important role in women's oppression and exploitation, and that it is a barrier to their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on a proposed equality principle that puts men and women in competing roles; thus, real equality will never be achieved. In contrast to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach to gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but also incorporates the concepts of social capital, equity, and doing gender. It is based on a perceived equity principle that puts men and women in complementing roles, which may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued on the basis of the existing literature and an anecdote from the author's observations. While criticizing equality theory, the author offers equity theory to resolve the gender conflict by using the concepts of social and psychological capital. PMID:25941756

  11. Practice-Near and Practice-Distant Methods in Human Services Research

    ERIC Educational Resources Information Center

    Froggett, Lynn; Briggs, Stephen

    2012-01-01

    This article discusses practice-near research in human services, a cluster of methodologies that may include thick description, intensive reflexivity, and the study of emotional and relational processes. Such methods aim to get as near as possible to experiences at the relational interface between institutions and the practice field.…

  12. Reflections on Experiential Teaching Methods: Linking the Classroom to Practice

    ERIC Educational Resources Information Center

    Wehbi, Samantha

    2011-01-01

    This article explores the use of experiential teaching methods in social work education. The literature demonstrates that relying on experiential teaching methods in the classroom can have overwhelmingly positive learning outcomes; however, not much is known about the possible effect of these classroom methods on practice. On the basis of…

  13. Methods of analysis and quality-assurance practices of the U.S. Geological Survey organic laboratory, Sacramento, California; determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Domagalski, Joseph L.; Kuivila, Kathryn M.

    1994-01-01

    An analytical method and quality-assurance practices were developed for a study of the fate and transport of pesticides in the Sacramento-San Joaquin Delta and the Sacramento and San Joaquin Rivers. Water samples were filtered to remove suspended particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide, and the pesticides were eluted with three 2-milliliter aliquots of hexane:diethyl ether (1:1). The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for analytes determined in 1,500-milliliter samples ranged from 0.006 to 0.047 microgram per liter. Recoveries ranged from 47 to 89 percent for 12 pesticides in organic-free, Sacramento River, and San Joaquin River water samples fortified at 0.05 and 0.26 microgram per liter. The method was modified to improve pesticide recovery by reducing the sample volume to 1,000 milliliters. Internal standards were added to improve quantitative precision and accuracy. The analysis also was expanded to include a total of 21 pesticides. The method detection limits for 1,000-milliliter samples ranged from 0.022 to 0.129 microgram per liter. Recoveries ranged from 38 to 128 percent for 21 pesticides in organic-free, Sacramento River, and San Joaquin River water samples fortified at 0.10 and 0.75 microgram per liter.
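
    The recovery percentages reported above follow from a simple spike-recovery calculation. A minimal sketch, with the measured and spiked concentrations invented rather than taken from the study:

```python
def percent_recovery(measured, spiked, background=0.0):
    """Spike recovery: 100 * (measured - background) / spiked."""
    return 100.0 * (measured - background) / spiked

# A sample fortified at 0.10 microgram/L that measures 0.082 microgram/L:
rec = percent_recovery(measured=0.082, spiked=0.10)
print(f"recovery = {rec:.0f} percent")
```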

  14. Optimizing Distributed Practice: Theoretical Analysis and Practical Implications

    ERIC Educational Resources Information Center

    Cepeda, Nicholas J.; Coburn, Noriko; Rohrer, Doug; Wixted, John T.; Mozer, Michael C.; Pashler, Harold

    2009-01-01

    More than a century of research shows that increasing the gap between study episodes using the same material can enhance retention, yet little is known about how this so-called distributed practice effect unfolds over nontrivial periods. In two three-session laboratory studies, we examined the effects of gap on retention of foreign vocabulary,…

  15. Development of a method to analyze orthopaedic practice expenses.

    PubMed

    Brinker, M R; Pierce, P; Siegel, G

    2000-03-01

    The purpose of the current investigation was to present a standard method by which an orthopaedic practice can analyze its practice expenses. To accomplish this, a five-step process was developed to analyze practice expenses using a modified version of activity-based costing. In this method, general ledger expenses were assigned to 17 activities that encompass all the tasks and processes typically performed in an orthopaedic practice. These 17 activities were identified in a practice expense study conducted for the American Academy of Orthopaedic Surgeons. To calculate the cost of each activity, financial data were used from a group of 19 orthopaedic surgeons in Houston, Texas. The activities that consumed the largest portion of the employee work force (person hours) were service patients in office (25.0% of all person hours), maintain medical records (13.6% of all person hours), and resolve collection disputes and rebill charges (12.3% of all person hours). The activities that comprised the largest portion of the total expenses were maintain facility (21.4%), service patients in office (16.0%), and sustain business by managing and coordinating practice (13.8%). The five-step process of analyzing practice expenses was relatively easy to perform and it may be used reliably by most orthopaedic practices. PMID:10738440
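
    The allocation step of this activity-based costing approach can be sketched as follows. For simplicity the sketch uses a single person-hours driver for all expenses (the study itself also tracked facility and other cost pools), and the dollar total is invented; only the person-hour percentages echo the abstract.

```python
# Hypothetical annual practice expenses to be allocated.
ledger_total = 1_000_000.0

person_hours = {  # percent of all person hours, from the abstract plus a remainder
    "service patients in office": 25.0,
    "maintain medical records": 13.6,
    "resolve collection disputes and rebill charges": 12.3,
    "all other activities": 49.1,
}

total_hours = sum(person_hours.values())
activity_cost = {
    activity: ledger_total * hours / total_hours
    for activity, hours in person_hours.items()
}
for activity, cost in sorted(activity_cost.items(), key=lambda kv: -kv[1]):
    print(f"{activity}: ${cost:,.0f}")
```

    A real implementation would repeat this allocation per cost pool with different drivers (square footage for facility costs, claim counts for billing, and so on).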

  16. Methods of Cosmochemical Analysis

    NASA Astrophysics Data System (ADS)

    Lahiri, S.; Maiti, M.

    Some radionuclides, like 10Be (T 1/2 = 1.5 Ma), 14C (T 1/2 = 5,730 years), 26Al (T 1/2 = 0.716 Ma), 53Mn (T 1/2 = 3.7 Ma), 60Fe (T 1/2 = 1.5 Ma), 146Sm (T 1/2 = 103 Ma), 182Hf (T 1/2 = 9 Ma), and 244Pu (T 1/2 = 80 Ma), are either being produced continuously by the interaction of cosmic rays (CR) or might have been produced in supernovae millions of years ago. Analysis of these radionuclides at the ultratrace scale has a strong influence on almost all branches of science, from archaeology to biology and from nuclear physics to astrophysics. However, measuring these radionuclides through their decay properties is a borderline problem because of their scarcity in natural archives and their long half-lives. The only viable approach appears to be mass measurement, and accelerator mass spectrometry (AMS) is best suited for this purpose. Apart from AMS, other mass measurement techniques, such as inductively coupled plasma-mass spectrometry (ICP-MS), thermal ionization mass spectrometry (TIMS), resonant laser ionization mass spectrometry (RIMS), and secondary ionization mass spectrometry (SIMS), have also been used, with more limited sensitivity.

  17. Practical Issues in Component Aging Analysis

    SciTech Connect

    Dana L. Kelly; Andrei Rodionov; Jens Uwe-Klugel

    2008-09-01

    This paper examines practical issues in the statistical analysis of component aging data. These issues center on the stochastic process chosen to model component failures. The two stochastic processes examined are repair same-as-new, leading to a renewal process, and repair same-as-old, leading to a nonhomogeneous Poisson process. Under the first assumption, times between failures can be treated as statistically independent observations from a stationary process; the common distribution of the times between failures is called the renewal distribution. Under the second assumption, the times between failures will not be independently and identically distributed, and one cannot simply fit a renewal distribution to the cumulative failure times or the times between failures. The paper illustrates how the assumption made regarding the repair process is crucial to the analysis. Besides the choice of stochastic process, other issues discussed include qualitative graphical analysis and simple nonparametric hypothesis tests to help judge which process appears more appropriate. Numerical examples are presented to illustrate the issues discussed in the paper.
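
    The two repair assumptions can be contrasted with a short simulation. This is a sketch under illustrative parameter values, not code from the paper: the same-as-new case draws i.i.d. exponential gaps (a renewal process), while the same-as-old case uses a power-law nonhomogeneous Poisson process whose failures bunch up over time.

```python
import random

random.seed(1)

def renewal_failure_times(n, scale=100.0):
    """Repair same-as-new: cumulative sums of i.i.d. exponential gaps."""
    t, times = 0.0, []
    for _ in range(n):
        t += random.expovariate(1.0 / scale)
        times.append(t)
    return times

def nhpp_failure_times(n, beta=2.0, theta=100.0):
    """Repair same-as-old: power-law NHPP, simulated by generating
    unit-rate arrivals in cumulative-intensity space and inverting
    L(t) = (t / theta) ** beta."""
    times, L = [], 0.0
    for _ in range(n):
        L += random.expovariate(1.0)
        times.append(theta * L ** (1.0 / beta))
    return times

ren = renewal_failure_times(10)
aged = nhpp_failure_times(10)
# With beta > 1 the NHPP inter-failure gaps tend to shrink (aging),
# while renewal gaps keep the same distribution after every repair.
```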

  18. Traditional Methods for Mineral Analysis

    NASA Astrophysics Data System (ADS)

    Ward, Robert E.; Carpenter, Charles E.

    This chapter describes traditional methods for the analysis of minerals involving titrimetric and colorimetric procedures and the use of ion-selective electrodes. Other traditional methods of mineral analysis include gravimetric titration (i.e., insoluble forms of minerals are precipitated, rinsed, dried, and weighed) and redox reactions (i.e., the mineral takes part in an oxidation-reduction reaction, and the product is quantitated). However, these latter two methods will not be covered because they are currently little used in the food industry. The traditional methods described here have maintained widespread usage in the food industry despite the development of more modern instrumentation such as atomic absorption spectroscopy and inductively coupled plasma-atomic emission spectroscopy (Chap. 24). Traditional methods generally require chemicals and equipment that are routinely available in an analytical laboratory and are within the experience of most laboratory technicians. Additionally, traditional methods often form the basis for rapid analysis kits (e.g., Quantab® for salt determination) that are increasingly in demand. Procedures for the analysis of minerals of major nutritional or food-processing concern are used for illustrative purposes. For additional examples of traditional methods, refer to references (1-6). Slight modifications of these traditional methods are often needed for specific foodstuffs to minimize interferences or to keep results within the range of analytical performance. For analytical requirements for specific foods, see the Official Methods of Analysis of AOAC International (5) and related official methods (6).
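
    The arithmetic behind a titrimetric determination is a simple stoichiometric conversion. As a hedged example (all values invented), chloride measured by Mohr titration with silver nitrate can be expressed as percent NaCl:

```python
def percent_nacl(ml_titrant, normality, sample_mg):
    """Mohr chloride titration: mg NaCl = mL AgNO3 * N * 58.44
    (the milliequivalent weight of NaCl, mg/meq), expressed as a
    percent of the sample mass."""
    mg_nacl = ml_titrant * normality * 58.44
    return 100.0 * mg_nacl / sample_mg

# 15.0 mL of 0.1 N AgNO3 to reach the endpoint for a 5.000-g sample:
salt = percent_nacl(ml_titrant=15.0, normality=0.1, sample_mg=5000.0)
print(f"{salt:.2f}% NaCl")
```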

  19. Multi-criteria decision analysis: Limitations, pitfalls, and practical difficulties

    SciTech Connect

    Kujawski, Edouard

    2003-02-01

    The 2002 Winter Olympics women's figure skating competition is used as a case study to illustrate some of the limitations, pitfalls, and practical difficulties of Multi-Criteria Decision Analysis (MCDA). The paper compares several widely used models for synthesizing the multiple attributes into a single aggregate value. The various MCDA models can provide conflicting rankings of the alternatives for a common set of information even under states of certainty. Analysts involved in MCDA need to deal with the following challenging tasks: (1) selecting an appropriate analysis method, and (2) properly interpreting the results. An additional trap is the availability of software tools that implement specific MCDA models that can beguile the user with quantitative scores. These conclusions are independent of the decision domain and they should help foster better MCDA practices in many fields including systems engineering trade studies.
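
    The conflicting rankings noted above are easy to reproduce: additive and multiplicative aggregation rules, applied to the same scores and weights, can invert the order of two alternatives. A minimal illustration with hypothetical scores (not the Olympics data):

```python
def weighted_sum(scores, weights):
    """Additive aggregation of normalized attribute scores."""
    return sum(w * s for w, s in zip(weights, scores))

def weighted_product(scores, weights):
    """Multiplicative aggregation of the same scores."""
    value = 1.0
    for w, s in zip(weights, scores):
        value *= s ** w
    return value

# Two alternatives scored on two equally weighted criteria.
weights = [0.5, 0.5]
a = [0.90, 0.10]   # strong on one criterion, weak on the other
b = [0.45, 0.45]   # balanced on both

# The additive model ranks a first; the multiplicative model ranks b first.
```

    The additive model rewards the unbalanced alternative, while the multiplicative model penalizes its weak criterion, so the two widely used rules disagree on the same data.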

  20. Breath analysis: translation into clinical practice.

    PubMed

    Brodrick, Emma; Davies, Antony; Neill, Paul; Hanna, Louise; Williams, E Mark

    2015-06-01

    Breath analysis in respiratory disease is a non-invasive technique which has the potential to complement or replace current screening and diagnostic techniques without inconvenience or harm to the patient. Recent advances in ion mobility spectrometry (IMS) have allowed exhaled breath to be analysed rapidly, reliably and robustly thereby facilitating larger studies of exhaled breath profiles in clinical environments. Preliminary studies have demonstrated that volatile organic compound (VOC) breath profiles of people with respiratory disease can be distinguished from healthy control groups but there is a need to validate, standardise and ensure comparability between laboratories before real-time breath analysis becomes a clinical reality. It is also important that breath sampling procedures and methodologies are developed in conjunction with clinicians and the practicalities of working within the clinical setting are considered to allow the full diagnostic potential of these techniques to be realised. A protocol is presented, which has been developed over three years and successfully deployed for quickly and accurately collecting breath samples from 323 respiratory patients recruited from 10 different secondary health care clinics. PMID:25971863

  1. Farmers' Preferences for Methods of Receiving Information on New or Innovative Farming Practices.

    ERIC Educational Resources Information Center

    Riesenberg, Lou E.; Gor, Christopher Obel

    1989-01-01

    Survey of 386 Idaho farmers (response rate 58 percent) identified preferred methods of receiving information on new or innovative farming practices. Analysis revealed preference for interpersonal methods (demonstrations, tours, and field trips) over mass media such as computer-assisted instruction (CAI) and home study, although younger farmers,…

  2. Toward a practical approach for ergodicity analysis

    NASA Astrophysics Data System (ADS)

    Wang, H.; Wang, C.; Zhao, Y.; Lin, X.; Yu, C.

    2015-09-01

    It is of importance to perform hydrological forecasts using a finite hydrological time series. Most time series analysis approaches presume a data series to be ergodic without justifying this assumption. This paper presents a practical approach to analyzing the mean ergodic property of hydrological processes by means of autocorrelation function evaluation and the Augmented Dickey-Fuller test, a radial basis function neural network, and the definition of mean ergodicity. The mean ergodicity of precipitation processes at the Lanzhou Rain Gauge Station in the Yellow River basin and the Ankang Rain Gauge Station in the Han River basin, both in China, and at Newberry, MI, USA, is analyzed using the proposed approach. The results indicate that the precipitation of March, July, and August in Lanzhou, and of May, June, and August in Ankang, has mean ergodicity, whereas the precipitation of the other calendar months at these two rain gauge stations does not. The precipitation of February, May, July, and December in Newberry shows the ergodic property, although the precipitation of each month shows a clear increasing or decreasing trend.
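
    Of the tools in the proposed approach, the autocorrelation function is the simplest to sketch: a necessary condition for mean ergodicity is that the autocorrelation decays toward zero at large lags. A minimal sample estimator (illustrative only, not the paper's implementation):

```python
def autocorr(x, lag):
    """Sample autocorrelation of series x at the given lag.
    autocorr(x, 0) is 1 by construction; for a mean-ergodic process the
    values must decay toward zero as the lag grows."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    cov = sum((x[t] - mean) * (x[t + lag] - mean)
              for t in range(n - lag)) / n
    return cov / var
```

    A strongly alternating series, for instance, has a lag-1 autocorrelation near -1, while white noise yields values near zero at every nonzero lag.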

  3. Practical challenges in the method of controlled Lagrangians

    NASA Astrophysics Data System (ADS)

    Chevva, Konda Reddy

    The method of controlled Lagrangians is an energy shaping control technique for underactuated Lagrangian systems. Energy shaping control design methods are appealing as they retain the underlying nonlinear dynamics and can provide stability results that hold over a larger domain than can be obtained using linear design and analysis. The objective of this dissertation is to identify the control challenges in applying the method of controlled Lagrangians to practical engineering problems and to suggest ways to enhance the closed-loop performance of the controller. This dissertation describes a procedure for incorporating artificial gyroscopic forces in the method of controlled Lagrangians. Allowing these energy-conserving forces in the closed-loop system provides greater freedom in tuning closed-loop system performance and expands the class of eligible systems. In energy shaping control methods, physical dissipation terms that are neglected in the control design may enter the system in a way that can compromise stability. This is well illustrated through the "ball on a beam" example. The effect of physical dissipation on the closed-loop dynamics is studied in detail and conditions for stability in the presence of natural damping are discussed. The control technique is applied to the classic "inverted pendulum on a cart" system. A nonlinear controller is developed which asymptotically stabilizes the inverted equilibrium at a specific cart position for the conservative dynamic model. The region of attraction contains all states for which the pendulum is elevated above the horizontal plane. Conditions for asymptotic stability in the presence of linear damping are developed. The nonlinear controller is validated through experiments. Experimental cart damping is best modeled using static and Coulomb friction. Experiments show that static and Coulomb friction degrade the closed-loop performance and induce limit cycles.
A Lyapunov-based switching controller is proposed and successfully implemented to suppress the limit cycle oscillations. The Lyapunov-based controller switches between the energy shaping nonlinear controller, for states away from the equilibrium, and a well-tuned linear controller, for states close to the equilibrium. The method of controlled Lagrangians is applied to vehicle systems with internal moving point mass actuators. Applications of moving mass actuators include certain spacecraft, atmospheric re-entry vehicles, and underwater vehicles. Control design using moving mass actuators is challenging; the system is often underactuated and multibody dynamic models are higher dimensional. We consider two examples to illustrate the application of controlled Lagrangian formulation. The first example is a spinning disk, a simplified, planar version of a spacecraft spin stabilization problem. The second example is a planar, streamlined underwater vehicle.

  4. Practical guidelines for B-cell receptor repertoire sequencing analysis.

    PubMed

    Yaari, Gur; Kleinstein, Steven H

    2015-01-01

    High-throughput sequencing of B-cell immunoglobulin repertoires is increasingly being applied to gain insights into the adaptive immune response in healthy individuals and in those with a wide range of diseases. Recent applications include the study of autoimmunity, infection, allergy, cancer and aging. As sequencing technologies continue to improve, these repertoire sequencing experiments are producing ever larger datasets, with tens- to hundreds-of-millions of sequences. These data require specialized bioinformatics pipelines to be analyzed effectively. Numerous methods and tools have been developed to handle different steps of the analysis, and integrated software suites have recently been made available. However, the field has yet to converge on a standard pipeline for data processing and analysis. Common file formats for data sharing are also lacking. Here we provide a set of practical guidelines for B-cell receptor repertoire sequencing analysis, starting from raw sequencing reads and proceeding through pre-processing, determination of population structure, and analysis of repertoire properties. These include methods for unique molecular identifiers and sequencing error correction, V(D)J assignment and detection of novel alleles, clonal assignment, lineage tree construction, somatic hypermutation modeling, selection analysis, and analysis of stereotyped or convergent responses. The guidelines presented here highlight the major steps involved in the analysis of B-cell repertoire sequencing data, along with recommendations on how to avoid common pitfalls. PMID:26589402
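
    As one example of the pre-processing steps listed above, consensus building over unique molecular identifiers (UMIs) can be sketched as a per-position majority vote across reads sharing a UMI. This toy version assumes equal-length reads per UMI and is not the pipeline from the guidelines:

```python
from collections import Counter, defaultdict

def umi_consensus(reads):
    """Collapse reads sharing a unique molecular identifier (UMI) into a
    consensus sequence by per-position majority vote, a simple form of
    sequencing error correction. reads: iterable of (umi, sequence);
    assumes reads with the same UMI have equal length."""
    groups = defaultdict(list)
    for umi, seq in reads:
        groups[umi].append(seq)
    return {umi: "".join(Counter(column).most_common(1)[0][0]
                         for column in zip(*seqs))
            for umi, seqs in groups.items()}
```

    Real pipelines additionally handle UMI collisions, quality scores, and length variation, but the majority-vote core is the same idea.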

  5. Item-Analysis Methods and Their Implications for the ILTA Guidelines for Practice: A Comparison of the Effects of Classical Test Theory and Item Response Theory Models on the Outcome of a High-Stakes Entrance Exam

    ERIC Educational Resources Information Center

    Ellis, David P.

    2011-01-01

    The current version of the International Language Testing Association (ILTA) Guidelines for Practice requires language testers to pretest items before including them on an exam, or when pretesting is not possible, to conduct post-hoc item analysis to ensure any malfunctioning items are excluded from scoring. However, the guidelines are devoid of…

  6. Genre Analysis, ESP and Professional Practice

    ERIC Educational Resources Information Center

    Bhatia, Vijay K.

    2008-01-01

    Studies of professional genres and professional practices are invariably seen as complementing each other, in that they not only influence each other but are often co-constructed in specific professional contexts. However, professional genres have often been analyzed in isolation, leaving the study of professional practice almost completely out,…

  7. Practical Teaching Methods K-6: Sparking the Flame of Learning.

    ERIC Educational Resources Information Center

    Wilkinson, Pamela Fannin.; McNutt, Margaret A.; Friedman, Esther S.

    This book provides state-of-the-art teaching practices and methods, discussing the elements of good teaching in the content areas and including examples from real classrooms and library media centers. Chapters offer reflection exercises, assessment tips specific to each curriculum, and resource lists. Nine chapters examine: (1) "The Premise"…

  8. Compassion fatigue within nursing practice: a concept analysis.

    PubMed

    Coetzee, Siedine Knobloch; Klopper, Hester C

    2010-06-01

    "Compassion fatigue" was first introduced in relation to the study of burnout among nurses, but it was never defined within this context; it has since been adopted as a synonym for secondary traumatic stress disorder, which is far removed from the original meaning of the term. The aim of the study was to define compassion fatigue within nursing practice. The method that was used in this article was concept analysis. The findings revealed several categories of compassion fatigue: risk factors, causes, process, and manifestations. The characteristics of each of these categories are specified and a connotative (theoretical) definition, model case, additional cases, empirical indicators, and a denotative (operational) definition are provided. Compassion fatigue progresses from a state of compassion discomfort to compassion stress and, finally, to compassion fatigue, which if not effaced in its early stages of compassion discomfort or compassion stress, can permanently alter the compassionate ability of the nurse. Recommendations for nursing practice, education, and research are discussed. PMID:20602697

  9. Model-Based Practice Analysis and Test Specifications.

    ERIC Educational Resources Information Center

    Kane, Michael

    1997-01-01

    Licensure and certification decisions are usually based on a chain of inference from results of a practice analysis to test specifications, the test, examinee performance, and a pass-fail decision. This article focuses on the design of practice analyses and translation of practice analyses results into test specifications. (SLD)

  10. Fourier methods for biosequence analysis.

    PubMed Central

    Benson, D C

    1990-01-01

    Novel methods are discussed for using fast Fourier transforms for DNA or protein sequence comparison. These methods are also intended as a contribution to the more general computer science problem of text search. They extend the capabilities of previous FFT methods and show that such methods are capable of considerable refinement. In particular, novel methods are given which (1) enable the detection of clusters of matching letters, (2) facilitate the insertion of gaps to enhance sequence similarity, and (3) accommodate varying densities of letters in the input sequences. These methods use Fourier analysis in two distinct ways: (1) fast Fourier transforms are used to facilitate rapid computation, and (2) Fourier expansions are used to form an 'image' of the sequence comparison. PMID:2243777
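
    The first use of Fourier analysis described above, fast computation of letter-match counts at every alignment offset, can be sketched as one FFT correlation per alphabet symbol. The following is a generic illustration of that idea, not the paper's algorithm:

```python
import cmath

def fft(a, invert=False):
    """Recursive radix-2 Cooley-Tukey FFT; len(a) must be a power of two.
    The inverse here omits the 1/n factor (applied once by the caller)."""
    n = len(a)
    if n == 1:
        return a[:]
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = 1 if invert else -1
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n)
        out[k] = even[k] + w * odd[k]
        out[k + n // 2] = even[k] - w * odd[k]
    return out

def match_counts(s, t):
    """Count letter matches between s and t at every alignment offset via
    one FFT correlation per shared alphabet symbol. A shift of t by d
    relative to s lands at index len(t) - 1 + d of the result."""
    n = 1
    while n < len(s) + len(t):
        n *= 2
    total = [0.0] * n
    for ch in set(s) & set(t):
        u = [1.0 if c == ch else 0.0 for c in s] + [0.0] * (n - len(s))
        # Reversing t turns circular convolution into correlation.
        v = ([1.0 if c == ch else 0.0 for c in reversed(t)]
             + [0.0] * (n - len(t)))
        U = fft([complex(x) for x in u])
        V = fft([complex(x) for x in v])
        conv = fft([a * b for a, b in zip(U, V)], invert=True)
        for i, x in enumerate(conv):
            total[i] += x.real / n
    return total
```

    Each symbol's indicator vectors are correlated in O(n log n), so the total cost is proportional to the alphabet size times n log n, rather than the quadratic cost of comparing all offsets directly.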

  11. A Practice-Based Analysis of an Online Strategy Game

    NASA Astrophysics Data System (ADS)

    Milolidakis, Giannis; Kimble, Chris; Akoumianakis, Demosthenes

    In this paper, we will analyze a massively multiplayer online game in an attempt to identify the elements of practice that enable social interaction and cooperation within the game's virtual world. Communities of Practice and Activity Theory offer the theoretical lens for identifying and understanding what constitutes practice within the community and how such practice is manifested and transmitted during game play. Our analysis suggests that, in contrast to prevalent perceptions of practice as being textually mediated, in virtual settings practice is framed as much in social interactions as in the processes, artifacts and tools constituting the linguistic domain of the game or the practice the gaming community is about.

  12. Practicing the practice: Learning to guide elementary science discussions in a practice-oriented science methods course

    NASA Astrophysics Data System (ADS)

    Shah, Ashima Mathur

    University methods courses are often criticized for telling pre-service teachers, or interns, about the theories behind teaching instead of preparing them to actually enact teaching. Shifting teacher education to be more "practice-oriented," or to focus more explicitly on the work of teaching, is a current trend for re-designing the way we prepare teachers. This dissertation addresses the current need for research that unpacks the shift to more practice-oriented approaches by studying the content and pedagogical approaches in a practice-oriented, masters-level elementary science methods course (n=42 interns). The course focused on preparing interns to guide science classroom discussions. Qualitative data, such as video records of course activities and interns' written reflections, were collected across eight course sessions. Codes were applied at the sentence and paragraph level and then grouped into themes. Five content themes were identified: foregrounding student ideas and questions, steering discussion toward intended learning goals, supporting students to do the cognitive work, enacting teacher role of facilitator, and creating a classroom culture for science discussions. Three pedagogical approach themes were identified. First, the teacher educators created images of science discussions by modeling and showing videos of this practice. They also provided focused teaching experiences by helping interns practice the interactive aspects of teaching both in the methods classroom and with smaller groups of elementary students in schools. Finally, they structured the planning and debriefing phases of teaching so interns could learn from their teaching experiences and prepare well for future experiences. The findings were analyzed through the lens of Grossman and colleagues' framework for teaching practice (2009) to reveal how the pedagogical approaches decomposed, represented, and approximated practice throughout course activities. 
Also, the teacher educators' purposeful use of both pedagogies of investigation (to study teaching) and pedagogies of enactment (to practice enacting teaching) was uncovered. This work provides insights for the design of courses that prepare interns to translate theories about teaching into the interactive work teachers actually do. Also, it contributes to building a common language for talking about the content of practice-oriented courses and for comparing the affordances and limitations of pedagogical approaches across teacher education settings.

  13. A new practice analysis of hand therapy.

    PubMed

    Muenzen, Patricia M; Kasch, Mary C; Greenberg, Sandra; Fullenwider, Lynnlee; Taylor, Patricia A; Dimick, Mary P

    2002-01-01

    The Hand Therapy Certification Commission, Inc. (HTCC) conducted a role delineation in 2001 to characterize current practice in the profession of hand therapy. Building upon previous HTCC studies of practice (i.e., Chai, Dimick & Kasch, 1987; Roth, Dimick, Kasch, Fullenwider & Taylor, 1996), subject matter experts identified the clinical behaviors, knowledge, and technical skills needed by hand therapists. A large scale survey was conducted with therapists across the United States and Canada who rated the clinical behaviors, knowledge, and technical skills in terms of their relevance to practice, and provided information about their own patient populations. A high survey return rate (72%) was indicative of the professional commitment of CHTs to their profession. Results of the survey are discussed and practice trends are identified. A new test outline for the Hand Therapy Certification Examination was created based on the results of the survey, and the 1987 Definition and Scope of Hand Therapy was revised. PMID:12206324

  14. Practical postcalibration uncertainty analysis: Yucca Mountain, Nevada.

    PubMed

    James, Scott C; Doherty, John E; Eddebbarh, Al-Aziz

    2009-01-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is "calibrated." Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the United States' proposed site for disposal of high-level radioactive waste. Linear and nonlinear uncertainty analyses are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction's uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This article applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. PMID:19744249
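
    Linear uncertainty analysis of this kind rests on first-order propagation of the parameter covariance through prediction sensitivities. A minimal generic sketch (not the Yucca Mountain implementation):

```python
def linear_prediction_variance(g, C):
    """First-order (linear) uncertainty propagation: with prediction
    sensitivity vector g and parameter covariance matrix C, the
    prediction variance is g^T C g."""
    n = len(g)
    return sum(g[i] * C[i][j] * g[j] for i in range(n) for j in range(n))

def parameter_contributions(g, C):
    """Diagonal-term share of the prediction variance per parameter, a
    simple way to rank which parameters drive predictive uncertainty."""
    return [g[i] ** 2 * C[i][i] for i in range(len(g))]
```

    Ranking the per-parameter contributions mirrors the article's use of linear analysis to identify which parameters, and hence which observations, matter most for reducing predictive uncertainty.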

  15. Will genomic selection be a practical method for plant breeding?

    PubMed Central

    Nakaya, Akihiro; Isobe, Sachiko N.

    2012-01-01

    Background Genomic selection or genome-wide selection (GS) has been highlighted as a new approach for marker-assisted selection (MAS) in recent years. GS is a form of MAS that selects favourable individuals based on genomic estimated breeding values. Previous studies have suggested the utility of GS, especially for capturing small-effect quantitative trait loci, but GS has not become a popular methodology in the field of plant breeding, possibly because there is insufficient information available on GS for practical use. Scope In this review, GS is discussed from a practical breeding viewpoint. Statistical approaches employed in GS are briefly described, before the recent progress in GS studies is surveyed. GS practices in plant breeding are then reviewed before future prospects are discussed. Conclusions Statistical concepts used in GS are discussed in terms of genetic models and variance decomposition, heritability, breeding values and linear models. Recent progress in GS studies is reviewed with a focus on empirical studies. For the practice of GS in plant breeding, several specific points are discussed, including linkage disequilibrium, features of the populations and genotyped markers, and breeding schemes. Currently, GS is not perfect, but it is a potent, attractive and valuable approach for plant breeding. This method will be integrated into many practical breeding programmes in the near future with further advances and the maturing of its theory. PMID:22645117
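
    The linear-model machinery behind genomic estimated breeding values can be sketched, for the two-marker case, as ridge regression in closed form. This is an illustrative toy, not a production RR-BLUP implementation:

```python
def rr_blup_2marker(X, y, lam):
    """Ridge-regression marker effects for exactly two markers, solved in
    closed form: beta = (X'X + lam*I)^{-1} X'y. X is a list of genotype
    rows [x1, x2]; y the phenotypes; lam the shrinkage penalty. The
    genomic estimated breeding value of a new genotype x is x . beta."""
    xtx = [[sum(r[i] * r[j] for r in X) + (lam if i == j else 0.0)
            for j in range(2)] for i in range(2)]
    xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(2)]
    det = xtx[0][0] * xtx[1][1] - xtx[0][1] * xtx[1][0]
    return [(xtx[1][1] * xty[0] - xtx[0][1] * xty[1]) / det,
            (xtx[0][0] * xty[1] - xtx[1][0] * xty[0]) / det]
```

    With a vanishing penalty and noiseless data, the recovered effects approach the generating values; real GS fits thousands of markers with substantial shrinkage, but the estimator has the same structure.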

  16. A practical method of estimating energy expenditure during tennis play.

    PubMed

    Novas, A M P; Rowbottom, D G; Jenkins, D G

    2003-03-01

    This study aimed to develop a practical method of estimating energy expenditure (EE) during tennis. Twenty-four elite female tennis players first completed a tennis-specific graded test in which five different intensity levels were applied randomly. Each intensity level was intended to simulate a "game" of singles tennis and comprised six 14 s periods of activity alternated with 20 s of active rest. Oxygen consumption (VO2) and heart rate (HR) were measured continuously and each player's rating of perceived exertion (RPE) was recorded at the end of each intensity level. The rate of energy expenditure (EE(VO2)) during the test was calculated using the sum of VO2 during play and the 'O2 debt' during recovery, divided by the duration of the activity. There were significant individual linear relationships between EE(VO2) and RPE, and between EE(VO2) and HR (r ≥ 0.89 and r ≥ 0.93; p < 0.05). On a second occasion, six players completed a 60-min singles tennis match during which VO2, HR and RPE were recorded; EE(VO2) was compared with EE predicted from the previously derived RPE and HR regression equations. Analysis found that EE(VO2) was overestimated by EE(RPE) (92 ± 76 kJ·h⁻¹) and EE(HR) (435 ± 678 kJ·h⁻¹), but the error of estimation for EE(RPE) (t = -3.01; p = 0.03) was less than 5%, whereas for EE(HR) it was 20.7%. The results of the study show that RPE can be used to estimate the energetic cost of playing tennis. PMID:12801209
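
    The individual EE-RPE regression equations used in the study are ordinary least-squares fits; a minimal generic sketch (illustrative data, not the study's measurements):

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = a + b*x, the form used to relate
    energy expenditure to perceived exertion (or heart rate) for each
    player. Returns the intercept a and slope b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b
```

    Once fitted per player, the line predicts EE from an RPE recorded during match play, which is what the validation match compared against the measured EE(VO2).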

  17. Method of multivariate spectral analysis

    DOEpatents

    Keenan, Michael R.; Kotula, Paul G.

    2004-01-06

    A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and Sᵀ, by performing a constrained alternating least-squares analysis of D = CSᵀ, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
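
    The constrained alternating least-squares factorization D = CSᵀ can be sketched in its simplest rank-one, nonnegativity-constrained form. This toy version alternates closed-form updates for the concentration and spectral-shape vectors and only illustrates the structure of the patented method:

```python
def als_rank1(D, iters=20):
    """Rank-one alternating least squares with a nonnegativity constraint:
    approximates D ≈ c sᵀ, alternating the least-squares update for c
    (fixed s) with the update for s (fixed c), clipping negatives."""
    m, n = len(D), len(D[0])
    s = [1.0] * n
    for _ in range(iters):
        ss = sum(v * v for v in s)
        c = [max(0.0, sum(D[i][j] * s[j] for j in range(n)) / ss)
             for i in range(m)]
        cc = sum(v * v for v in c)
        s = [max(0.0, sum(D[i][j] * c[i] for i in range(m)) / cc)
             for j in range(n)]
    return c, s
```

    For exactly rank-one nonnegative data the iteration recovers the factors up to scaling; the full method factors many components at once and applies the weighting and unweighting steps described in the abstract.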

  18. A systematic approach to initial data analysis is good research practice.

    PubMed

    Huebner, Marianne; Vach, Werner; le Cessie, Saskia

    2016-01-01

    Initial data analysis is conducted independently of the analysis needed to address the research questions. Shortcomings in these first steps may result in inappropriate statistical methods or incorrect conclusions. We outline a framework for initial data analysis and illustrate the impact of initial data analysis on research studies. Examples of reporting of initial data analysis in publications are given. A systematic and careful approach to initial data analysis is needed as good research practice. PMID:26602896

  19. [Complaint analysis derived from surgical practice].

    PubMed

    Fajardo-Dolci, Germán; Rodríguez-Suárez, Francisco Javier; Campos-Castolo, Esther Mahuina; Carrillo-Jaimes, Arturo; Zavala-Suárez, Etelvina; Aguirre-Gas, Héctor Gerardo

    2009-01-01

    This study reports on the analysis of medical complaints presented to the National Commission on Medical Arbitration (Comisión Nacional de Arbitraje Médico, CONAMED) between June 1996 and December 2007 to determine their magnitude and to identify the causes of safety problems in medical care. Out of 182,407 complaints presented to CONAMED, 87% were resolved by the Office of Orientation and Management. The remaining 18,443 complaints were presented to the Council Directorate. Of those cases, 48% were resolved by an agreement between the complainants and the physicians, 31% were not resolved by this method, and 3% were irresolvable complaints. The highest frequency of complaints was registered in the Federal District (Distrito Federal) and the State of Mexico (Estado de México), mainly corresponding to social security institutions and private hospitals. Among the nine most frequently involved specialties, six were surgical specialties. Malpractice was identified in 25% of all cases. The principal demands of those making complaints were the refunding of expenses in patient medical care (51%) and indemnification (40%); in the latter, the average amount of payments was 4.6 times greater. Given the incidence of medical complaints, it was reasonable to investigate the causes and to take the preventive and corrective actions required to decrease them. It was proposed to the Mexican Academy of Surgery that this organization should use its educational leadership and assume the vanguard in the dissemination and promotion of the WHO plan "Safe Surgery Saves Lives" and the implementation in Mexico of the "Surgical Safety Checklist." PMID:19671273

  20. [Good Practice of Secondary Data Analysis (GPS): guidelines and recommendations].

    PubMed

    Swart, E; Gothe, H; Geyer, S; Jaunzeme, J; Maier, B; Grobe, T G; Ihle, P

    2015-02-01

    In 2005, the Working Group for the Survey and Utilisation of Secondary Data (AGENS) of the German Society for Social Medicine and Prevention (DGSMP) and the German Society for Epidemiology (DGEpi) first published "Good Practice in Secondary Data Analysis (GPS)", formulating a standard for conducting secondary data analyses. GPS is intended as a guide for planning and conducting analyses and can provide a basis for contracts between data owners. The domain of these guidelines includes not only data routinely gathered by statutory health insurance funds and other statutory social insurance funds, but all forms of secondary data. The 11 guidelines range from ethical principles and study planning through quality assurance measures and data preparation to data privacy, contractual conditions and responsible communication of analytical results. They are complemented by explanations and practical assistance in the form of recommendations. GPS targets all persons who approach secondary data, their analysis and interpretation from a scientific point of view and with scientific methods. This includes data owners. Furthermore, GPS is suitable for authors, referees and readers to assess the quality of scientific publications. In 2008, the first version of GPS was evaluated and revised by members of AGENS and the Epidemiological Methods Working Group of DGEpi, DGSMP and GMDS, including other epidemiological experts, and was then accredited as implementation regulations of Good Epidemiological Practice (GEP). Since 2012, this third version of GPS has been available for download from the DGEpi website at no charge. The current revision chiefly integrates linguistic clarifications and increases the guidelines' internal consistency. With regard to content, further recommendations concerning the guideline on data privacy have been added.
On the basis of future developments in science and data privacy, further revisions will follow. PMID:25622207

  1. Practical Nursing. Ohio's Competency Analysis Profile.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Vocational Instructional Materials Lab.

    Developed through a modified DACUM (Developing a Curriculum) process involving business, industry, labor, and community agency representatives in Ohio, this document is a comprehensive and verified employer competency profile for practical nursing. The list contains units (with and without subunits), competencies, and competency builders that…

  2. Encouraging Gender Analysis in Research Practice

    ERIC Educational Resources Information Center

    Thien, Deborah

    2009-01-01

    Few resources for practical teaching or fieldwork exercises exist which address gender in geographical contexts. This paper adds to teaching and fieldwork resources by describing an experience with designing and implementing a "gender intervention" for a large-scale, multi-university, bilingual research project that brought together a group of…

  3. [Embryo vitrification: French clinical practice analysis for BLEFCO].

    PubMed

    Hesters, L; Achour-Frydman, N; Mandelbaum, J; Levy, R

    2013-09-01

    Frozen thawed embryo transfer is currently an important part of present-day assisted reproductive technology (ART), aiming at increasing the clinical pregnancy rate per oocyte retrieval. Although the slow freezing method was the reference for two decades, recent years have witnessed an expansion of the ultrarapid cryopreservation method named vitrification. Recently in France, vitrification has been authorized for cryopreserving human embryos. The BLEFCO consortium therefore decided to perform a descriptive study through questionnaires to evaluate the state of vitrification in French clinical practice. Questionnaires were addressed to the 105 French centres of reproductive biology, and 60 were fully completed. Data analysis revealed that the embryo survival rate, as well as the clinical pregnancy rate, was increased with vitrification compared to the slow freezing procedure. Overall, these preliminary data suggest that vitrification may improve ART outcomes by increasing the cumulative pregnancy rate per oocyte retrieval. PMID:23962680

  4. Method of photon spectral analysis

    DOEpatents

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) as well as high-energy γ rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.

  5. Method of photon spectral analysis

    DOEpatents

    Gehrke, Robert J.; Putnam, Marie H.; Killian, E. Wayne; Helmer, Richard G.; Kynaston, Ronnie L.; Goodwin, Scott G.; Johnson, Larry O.

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and .gamma.-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV) and high-energy .gamma. rays (>1 MeV). An 8192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The .gamma.-ray portion of each spectrum is analyzed by a standard Ge .gamma.-ray analysis program. This method can be applied to any analysis involving x- and .gamma.-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the .gamma.-ray analysis and accommodated during the x-ray analysis.

  6. Practical analysis of welding processes using finite element analysis.

    SciTech Connect

    Cowles, J. H.; Dave, V. R.; Hartman, D. A.

    2001-01-01

    With advances in commercially available finite element software and computational capability, engineers can now model large-scale problems in mechanics, heat transfer, fluid flow, and electromagnetics as never before. With these enhancements in capability, it is increasingly tempting to include the fundamental process physics to help achieve greater accuracy (Refs. 1-7). While this goal is laudable, it adds complication and drives up cost and computational requirements. Practical analysis of welding relies on simplified user inputs to derive important relative trends in desired outputs such as residual stress or distortion due to changes in inputs like voltage, current, and travel speed. Welding is a complex three-dimensional phenomenon. The question becomes: how much modeling detail is needed to accurately predict relative trends in distortion, residual stress, or weld cracking? In this work, a HAZ (Heat Affected Zone) weld-cracking problem was analyzed to rank two different welding cycles (weld speed varied) in terms of crack susceptibility. Figure 1 shows an aerospace casting GTA welded to a wrought skirt. The essentials of part geometry, welding process, and tooling were suitably captured to model the strain excursion in the HAZ over a crack-susceptible temperature range, and the weld cycles were suitably ranked. The main contribution of this work is the demonstration of a practical methodology by which engineering solutions to engineering problems may be obtained through weld modeling when time and resources are extremely limited. Typically, welding analysis suffers from the following unknowns: material properties over the entire temperature range, the heat-input source term, and environmental effects. Material properties of interest are conductivity, specific heat, latent heat, modulus, Poisson's ratio, yield strength, ultimate strength, and possible rate dependencies. 
Boundary conditions are conduction into fixturing, radiation and convection to the environment, and any mechanical constraint. If conductivity, for example, is only known at a few temperatures, it can be linearly extrapolated from the highest known temperature to the solidus temperature. Over the solidus-to-liquidus temperature range, the conductivity is linearly increased by a factor of three to account for the enhanced heat transfer due to convection in the weld pool. Above the liquidus it is kept constant. Figure 2 shows an example of this type of approximation. Other thermal and mechanical properties and boundary conditions can be similarly approximated, using known physical material characteristics when possible. Sensitivity analysis can show that many assumptions have a small effect on the final outcome of the analysis. In the example presented in this work, simplified analysis procedures were used to model this process to understand why one set of parameters is superior to the other. From Lin (Ref. 8), mechanical strain is expected to drive HAZ cracking. Figure 3 shows a plot of principal tensile mechanical strain versus temperature during the welding process. By looking at the magnitudes of the tensile mechanical strain in the material's Brittle Temperature Region (BTR), it can be seen that, on a relative basis, the faster travel speed process that causes cracking results in about three times the strain in the temperature range of the BTR. In this work, a series of simplifying assumptions were used in order to quickly and accurately model a real welding process to respond to an immediate manufacturing need. The analysis showed that the driver for HAZ cracking, the mechanical strain in the BTR, was significantly higher in the process that caused cracking versus the process that did not. 
The main emphasis of the analysis was to determine whether there was a mechanical reason why the improved weld parameters would consistently produce an acceptable weld. The prediction of the mechanical strain magnitudes confirms the better process.
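    The property approximation described above (linear extrapolation of sparse conductivity data, a three-fold ramp across the mushy zone, constant above the liquidus) can be sketched as a piecewise function. The temperatures and conductivity values below are hypothetical, and the solidus/liquidus handling is one reading of the abstract, not the authors' exact model.

```python
import bisect

def conductivity(T, known_T, known_k, T_sol, T_liq):
    """Piecewise k(T): interpolate measured data, linearly extrapolate
    beyond the last data point up to the solidus, ramp by a factor of three
    across the solidus-liquidus (mushy) zone to mimic weld-pool convection,
    and hold constant above the liquidus."""
    slope = (known_k[-1] - known_k[-2]) / (known_T[-1] - known_T[-2])
    k_sol = known_k[-1] + slope * (T_sol - known_T[-1])  # extrapolated to solidus
    if T >= T_liq:
        return 3.0 * k_sol                       # constant above the liquidus
    if T >= T_sol:
        frac = (T - T_sol) / (T_liq - T_sol)     # linear 1x -> 3x ramp
        return k_sol * (1.0 + 2.0 * frac)
    if T >= known_T[-1]:
        return known_k[-1] + slope * (T - known_T[-1])   # linear extrapolation
    i = max(1, bisect.bisect_left(known_T, T))   # interpolate measured data
    t0, t1 = known_T[i - 1], known_T[i]
    k0, k1 = known_k[i - 1], known_k[i]
    return k0 + (k1 - k0) * (T - t0) / (t1 - t0)

# Hypothetical data: conductivity measured at only two temperatures.
kT, kk = [300.0, 600.0], [20.0, 30.0]
```

    A sensitivity study, as the abstract notes, can then show how little the final strain prediction depends on the details of such approximations.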

  7. Practical methods to improve the development of computational software

    SciTech Connect

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-07-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best-practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)
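    One of the simplest practices of the kind the paper advocates is pinning a computational kernel to known analytic answers with a regression test, so later changes cannot silently break it. A minimal sketch (the trapezoid-rule kernel is a generic stand-in, not an example from the paper):

```python
def trapezoid(f, a, b, n):
    """Composite trapezoid rule -- a stand-in for a typical research kernel."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

def test_trapezoid():
    # Pin the kernel to analytic answers with explicit tolerances.
    assert abs(trapezoid(lambda x: 2.0 * x, 0.0, 1.0, 4) - 1.0) < 1e-12  # exact for linear f
    assert abs(trapezoid(lambda x: x * x, 0.0, 1.0, 1000) - 1.0 / 3.0) < 1e-5

test_trapezoid()
```

    In a real project such checks would live in a test suite run automatically on every change rather than inline.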

  8. Effectiveness of practices to reduce blood culture contamination: A Laboratory Medicine Best Practices systematic review and meta-analysis

    PubMed Central

    Snyder, Susan R.; Favoretto, Alessandra M.; Baetz, Rich Ann; Derzon, James H.; Madison, Bereneice M.; Mass, Diana; Shaw, Colleen S.; Layfield, Christopher D.; Christenson, Robert H.; Liebow, Edward B.

    2015-01-01

    Objectives This article is a systematic review of the effectiveness of three practices for reducing blood culture contamination rates: venipuncture, phlebotomy teams, and prepackaged preparation/collection (prep) kits. Design and methods The CDC-funded Laboratory Medicine Best Practices Initiative systematic review methods for quality improvement practices were used. Results Studies included as evidence were: 9 venipuncture (versus intravenous catheter), 5 phlebotomy team, and 7 prep kit. All studies for venipuncture and phlebotomy teams favored these practices, with meta-analysis mean odds ratios of 2.69 for venipuncture and 2.58 for phlebotomy teams. For prep kits, six studies' effect sizes were not statistically significantly different from no effect (meta-analysis mean odds ratio 1.12). Conclusions Venipuncture and the use of phlebotomy teams are effective practices for reducing blood culture contamination rates in diverse hospital settings and are recommended as evidence-based best practices with high overall strength of evidence and substantial effect size ratings. No recommendation is made for or against prep kits based on uncertain improvement. PMID:22709932
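    The pooled odds ratios quoted above come from meta-analysis. A minimal sketch of standard fixed-effect (inverse-variance) pooling of 2x2 tables on the log-odds scale is shown below; the study counts are invented for illustration, and the review's exact pooling model may differ.

```python
import math

def pooled_odds_ratio(tables):
    """Fixed-effect (inverse-variance) pooled odds ratio from 2x2 tables.
    Each table is (a, b, c, d): events/non-events in the two arms."""
    num = den = 0.0
    for a, b, c, d in tables:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # Woolf variance of the log OR
        w = 1.0 / var                          # inverse-variance weight
        num += w * log_or
        den += w
    return math.exp(num / den)

# Hypothetical (contaminated, clean) counts from three studies,
# catheter draws vs venipuncture.
studies = [(30, 470, 12, 488), (25, 975, 10, 990), (18, 282, 8, 292)]
pooled = pooled_odds_ratio(studies)
```

    The pooled estimate is a weighted mean of the per-study log odds ratios, so it always lies between the smallest and largest study estimate.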

  9. Landscape analysis: Theoretical considerations and practical needs

    USGS Publications Warehouse

    Godfrey, A.E.; Cleaves, E.T.

    1991-01-01

    Numerous systems of land classification have been proposed. Most have led directly to or have been driven by an author's philosophy of earth-forming processes. However, the practical need of classifying land for planning and management purposes requires that a system lead to predictions of the results of management activities. We propose a landscape classification system composed of 11 units, from realm (a continental mass) to feature (a splash impression). The classification concerns physical aspects rather than economic or social factors, and aims to merge land inventory with dynamic processes. Landscape units are organized using a hierarchical system so that information may be assembled and communicated at different levels of scale and abstraction. Our classification uses a geomorphic systems approach that emphasizes the geologic-geomorphic attributes of the units. Realm, major division, province, and section are formulated by subdividing large units into smaller ones. For the larger units we have followed Fenneman's delineations, which are well established in the North American literature. Areas and districts are aggregated into regions and regions into sections. Units smaller than areas have, in practice, been subdivided into zones and smaller units if required. We developed the theoretical framework embodied in this classification from practical applications aimed at land use planning and land management in Maryland (eastern Piedmont Province near Baltimore) and Utah (eastern Uinta Mountains). © 1991 Springer-Verlag New York Inc.

  10. Methods for genetic linkage analysis using trisomies

    SciTech Connect

    Feingold, E.; Lamb, N.E.; Sherman, S.L.

    1994-09-01

    Certain genetic disorders (e.g. congenital cataracts, duodenal atresia) are rare in the general population, but more common in people with Down's syndrome. We present a method for using individuals with trisomy 21 to map genes for such traits. Our methods are analogous to methods for mapping autosomal dominant traits using affected relative pairs by looking for markers with greater than expected identity-by-descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected reduction to homozygosity in the chromosomes inherited from the non-disjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the gene, a confidence interval for that distance, and methods for computing power and sample sizes. The methods are described in the context of a gene-dosage model for the etiology of the disorder, but can be extended to other models. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers, how to test candidate genes, and how to handle the effect of reduced recombination associated with maternal meiosis I non-disjunction.
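    The core idea, comparing the observed number of trisomic cases showing reduction to homozygosity at a marker against the expectation under no linkage, can be sketched as an exact one-sided binomial tail. The null proportion and counts below are hypothetical, and the authors' actual statistics handle complications (partially informative markers, reduced recombination) that this sketch ignores.

```python
from math import comb

def binom_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): exact one-sided tail probability."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical: under no linkage, a fraction p0 of trisomic cases is expected
# to show reduction to homozygosity at a fully informative marker; we observe
# k of n cases reduced and ask how surprising that excess is.
p0 = 0.5
n, k = 40, 28
p_value = binom_tail(n, k, p0)
```

    A small p-value indicates more reduction to homozygosity than chance predicts, i.e. evidence of linkage between the marker and the trait locus.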

  11. Situational Analysis: A Framework for Evidence-Based Practice

    ERIC Educational Resources Information Center

    Annan, Jean

    2005-01-01

    Situational analysis is a framework for professional practice and research in educational psychology. The process is guided by a set of practice principles requiring that psychologists' work is evidence-based, ecological, collaborative and constructive. The framework is designed to provide direction for psychologists who wish to tailor their

  12. A practical method to evaluate radiofrequency exposure of mast workers.

    PubMed

    Alanko, Tommi; Hietanen, Maila

    2008-01-01

    Assessment of occupational exposure to radiofrequency (RF) fields in telecommunication transmitter masts is a challenging task. For conventional field strength measurements using manually operated instruments, it is difficult to document the locations of measurements while climbing up a mast. Logging RF dosemeters worn by the workers, on the other hand, do not give any information about the location of the exposure. In this study, a practical method was developed and applied to assess mast workers' exposure to RF fields and the corresponding location. This method uses a logging dosemeter for personal RF exposure evaluation and two logging barometers to determine the corresponding height of the worker's position on the mast. The procedure is not intended to be used for compliance assessments, but to indicate locations where stricter assessments are needed. The applicability of the method is demonstrated by making measurements in a TV and radio transmitting mast. PMID:19054796
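    The height-determination step can be sketched with the hypsometric equation: the pressure difference between the worn logging barometer and a reference barometer at the mast base gives the worker's height above ground, which is then paired with the dosemeter log by timestamp. The constants, isothermal approximation, and readings below are illustrative; the paper's exact conversion may differ.

```python
import math

def height_from_pressure(p_worker, p_ground, temp_c=15.0):
    """Height above ground from two barometer readings (same units),
    using the isothermal hypsometric equation."""
    R = 8.314        # gas constant, J/(mol K)
    g = 9.81         # gravitational acceleration, m/s^2
    M = 0.0289644    # molar mass of dry air, kg/mol
    T = temp_c + 273.15
    return (R * T) / (M * g) * math.log(p_ground / p_worker)
```

    For example, a worn-logger reading of 1003.2 hPa against a ground reading of 1013.25 hPa corresponds to roughly 84 m up the mast at 15 degrees C.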

  13. Development of a practical costing method for hospitals.

    PubMed

    Cao, Pengyu; Toyabe, Shin-Ichi; Akazawa, Kouhei

    2006-03-01

    To realize effective cost control, a practical and accurate cost accounting system is indispensable in hospitals. Among traditional cost accounting systems, volume-based costing (VBC) is the most popular method. In this method, the indirect costs are allocated to each cost object (services or units of a hospital) using a single indicator named a cost driver (e.g., labor hours, revenues or the number of patients). However, this method often produces rough and inaccurate results. The activity-based costing (ABC) method introduced in the mid-1990s can provide more accurate results. With the ABC method, all events or transactions that cause costs are recognized as "activities", and a specific cost driver is prepared for each activity. Finally, the costs of activities are allocated to cost objects by the corresponding cost driver. However, it is much more complex and costly than other traditional cost accounting methods because the data collection for cost drivers is not always easy. In this study, we developed a simplified ABC (S-ABC) costing method to reduce the workload of ABC costing by reducing the number of cost drivers used in the ABC method. Using the S-ABC method, we estimated the cost of laboratory tests, and similarly accurate results were obtained as with the ABC method (the largest difference was 2.64%). Simultaneously, this new method reduces the seven cost drivers used in the ABC method to four. Moreover, we performed an evaluation using other sample data from the physiological laboratory department to certify the effectiveness of this new method. In conclusion, the S-ABC method provides two advantages in comparison to the VBC and ABC methods: (1) it can obtain accurate results, and (2) it is simpler to perform. Once we reduce the number of cost drivers by applying the proposed S-ABC method to the data for the ABC method, we can easily perform the cost accounting using few cost drivers after the second round of costing. 
PMID:16498229
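    The core ABC allocation step, which S-ABC retains while using fewer drivers, can be sketched as follows: each activity's cost pool is spread over cost objects in proportion to their consumption of that activity's driver. The activities, drivers, and amounts below are invented for illustration, not the paper's data.

```python
def abc_allocate(activity_costs, driver_usage):
    """Allocate activity cost pools to cost objects by driver consumption.
    activity_costs: {activity: pool cost}
    driver_usage:   {activity: {cost_object: driver units consumed}}"""
    totals = {}
    for activity, pool in activity_costs.items():
        usage = driver_usage[activity]
        rate = pool / sum(usage.values())   # cost per driver unit
        for obj, units in usage.items():
            totals[obj] = totals.get(obj, 0.0) + rate * units
    return totals

# Hypothetical lab example: two activities, two tests.
costs = {"setup": 1200.0, "analysis": 3000.0}
usage = {"setup": {"test_A": 2, "test_B": 4},
         "analysis": {"test_A": 30, "test_B": 70}}
allocated = abc_allocate(costs, usage)
```

    S-ABC's simplification amounts to shrinking the `activity_costs`/`driver_usage` dictionaries (seven drivers down to four) while keeping this allocation logic unchanged.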

  14. The practical aspects of clinical trials of contraceptive methods.

    PubMed

    Weisberg, E

    1986-04-01

    Although setting up a clinical trial to test the efficacy of a method of contraception may appear to be a simple exercise, in practice, unless the aim is well delineated, the trial carefully designed, and the staff participating in the trial carefully briefed, problems will arise which prevent a successful outcome. Possible areas of bias such as selection of participants and staff prejudices must be eliminated, otherwise the value of the results may be diminished. Ethical considerations must be addressed regarding voluntary participation, information for participants, delineation of risk to participants and informed consent. PMID:3527399

  15. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, Amy C.

    1993-01-01

    An apparatus and method for electrochemical analysis of elements in solution. An auxiliary electrode 14, a reference electrode 18, and five working electrodes 20, 22, 26, 28, and 30 are positioned in a container 12 containing a sample solution 34. The working electrodes are spaced apart evenly from each other and auxiliary electrode 14 to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between auxiliary electrode 14 and each of the working electrodes 20, 22, 26, 28, and 30. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in sample solution 34 and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.
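    The identification step, matching peaks in the current-versus-potential data against known element signatures, might be sketched as below. The naive peak-finding rule, the synthetic sweep, and the reference potentials are all illustrative assumptions, not taken from the patent.

```python
import math

def identify_elements(potentials, currents, reference, tol=0.05):
    """Find local current maxima in a voltammetric sweep and match their
    peak potentials against a reference table {element: peak potential (V)}."""
    peaks = [potentials[i] for i in range(1, len(currents) - 1)
             if currents[i] > currents[i - 1] and currents[i] > currents[i + 1]]
    matches = {}
    for peak in peaks:
        for element, e_ref in reference.items():
            if abs(peak - e_ref) <= tol:
                matches[element] = peak
    return matches

# Synthetic sweep: two Gaussian current peaks at -0.6 V and -0.2 V.
potentials = [-1.0 + 0.05 * i for i in range(21)]
currents = [math.exp(-((v + 0.6) / 0.05) ** 2)
            + 0.5 * math.exp(-((v + 0.2) / 0.05) ** 2) for v in potentials]
reference = {"Cd": -0.60, "Pb": -0.20}   # illustrative reference values
hits = identify_elements(potentials, currents, reference)
```

    With multiple working electrodes, as in the patent, the same matching would be run per electrode and agreement across electrodes used to strengthen the identification.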

  16. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, A.C.

    1993-06-08

    An apparatus and method is described for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced apart evenly from each other and the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  17. Correlation method of electrocardiogram analysis

    NASA Astrophysics Data System (ADS)

    Strinadko, Marina M.; Timochko, Katerina B.

    2002-02-01

    The electrocardiographic method is an informational source for characterizing the functional state of the heart. The parameters of an electrocardiogram form an integrated map of many component characteristics of the cardiac system and depend on the disturbances acting on each component. In this work, we attempt to construct a block diagram of perturbations of the cardiac system by describing its basic components and the connections between them through transfer functions, written as first- and second-order differential equations, with the aim of building up and analyzing the electrocardiogram. Noting the vector character of the perturbation and the varying position of the heart in each organism, we propose a coordinate system attached to the heart. A comparative analysis of electrocardiograms was conducted using the correlation method.
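    The basic quantity behind a correlation-based comparison of electrocardiograms is the normalized (Pearson) correlation between two traces. A minimal sketch with synthetic signals (the abstract does not specify the exact correlation statistic used):

```python
import math

def normalized_correlation(x, y):
    """Pearson correlation coefficient between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x)
                    * sum((b - my) ** 2 for b in y))
    return num / den

# Two synthetic 'leads': the second is a scaled, offset copy of the first,
# so their correlation is 1 by construction.
t = [i / 100 for i in range(200)]
lead1 = [math.sin(2 * math.pi * 1.2 * s) for s in t]
lead2 = [0.5 * v + 0.1 for v in lead1]
```

    Because the coefficient is invariant to amplitude scaling and baseline offset, it compares waveform shape rather than absolute signal level, which is what makes it useful for comparing recordings across subjects.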

  18. Effective methods for disseminating research findings to nurses in practice.

    PubMed

    Cronenwett, L R

    1995-09-01

    Professionals in all disciplines are challenged by the proliferation of new knowledge. Nurses, too, must find cost-effective ways of ensuring that their patients are benefiting from the most current knowledge about health and illness. The methods of research dissemination to clinicians described in this article are presumed to be effective because of anecdotal reports, conference evaluations, or clinician surveys. The profession needs more sophisticated evaluations of the effectiveness of various dissemination methods. In the meantime, whether you are a researcher, an administrator, an educator, or a clinician, you have a role to play in improving research dissemination. Implement just one strategy from this article and evaluate the results. Each contribution moves nursing toward research-based practice. PMID:7567569

  19. Practical Aspects of Krylov Subspace Iterative Methods in CFD

    NASA Technical Reports Server (NTRS)

    Pulliam, Thomas H.; Rogers, Stuart; Barth, Timothy

    1996-01-01

    Implementation issues associated with the application of Krylov subspace iterative methods, such as Newton-GMRES, are presented within the framework of practical computational fluid dynamic (CFD) applications. This paper categorizes, evaluates, and contrasts the major ingredients (function evaluations, matrix-vector products, and preconditioners) of Newton-GMRES Krylov subspace methods in terms of their effect on the local linear and global nonlinear convergence, memory requirements, and accuracy. The discussion focuses on Newton-GMRES in both a structured multi-zone incompressible Navier-Stokes solver and an unstructured mesh finite-volume Navier-Stokes solver. Approximate versus exact matrix-vector products, effective preconditioners, and other pertinent issues are addressed.
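    The ingredients discussed (function evaluations, approximate matrix-vector products, an inner GMRES solve) can be sketched in a minimal Jacobian-free Newton-GMRES iteration, where the Jacobian is never formed and each matvec is a finite-difference of the residual function. This is a bare, unpreconditioned sketch on a toy system, not the flow solvers described in the paper.

```python
import numpy as np

def fd_jacvec(F, u, v, eps=1e-7):
    """Approximate Jacobian-vector product J(u) v by a forward difference."""
    return (F(u + eps * v) - F(u)) / eps

def gmres(matvec, b, tol=1e-10, maxiter=20):
    """Minimal full (unrestarted) GMRES built on the Arnoldi process."""
    n = b.size
    beta = np.linalg.norm(b)
    if beta == 0.0:
        return np.zeros(n)
    Q = np.zeros((n, maxiter + 1))
    H = np.zeros((maxiter + 1, maxiter))
    Q[:, 0] = b / beta
    for k in range(maxiter):
        w = matvec(Q[:, k])
        for j in range(k + 1):                    # modified Gram-Schmidt
            H[j, k] = Q[:, j] @ w
            w = w - H[j, k] * Q[:, j]
        H[k + 1, k] = np.linalg.norm(w)
        e1 = np.zeros(k + 2)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:k + 2, :k + 1], e1, rcond=None)
        resid = np.linalg.norm(H[:k + 2, :k + 1] @ y - e1)
        if H[k + 1, k] < 1e-14 or resid < tol * beta:
            break                                  # breakdown or converged
        Q[:, k + 1] = w / H[k + 1, k]
    return Q[:, :k + 1] @ y

def newton_gmres(F, u0, tol=1e-8, maxit=20):
    """Newton iteration with each linear step J du = -F(u) solved by GMRES
    using only finite-difference matvecs (Jacobian-free Newton-Krylov)."""
    u = np.asarray(u0, dtype=float)
    for _ in range(maxit):
        r = F(u)
        if np.linalg.norm(r) < tol:
            break
        u = u + gmres(lambda v: fd_jacvec(F, u, v), -r)
    return u

# Toy nonlinear system with known root (1, 2).
def F(u):
    return np.array([u[0] ** 2 + u[1] - 3.0, u[0] + u[1] ** 2 - 5.0])

sol = newton_gmres(F, np.array([1.5, 1.5]))
```

    In a CFD setting, `F` would be the discrete residual of the governing equations, and a preconditioner applied inside `matvec` would be essential for acceptable inner-iteration counts, as the paper emphasizes.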

  20. International child care practices study: methods and study population.

    PubMed

    Nelson, E A; Taylor, B J

    1999-06-01

    The study set out to document child care practices in as many different countries and cultures as possible with the aim of providing baseline child care data and stimulating new hypotheses to explain persisting differences in sudden infant death syndrome (SIDS) rates between countries. The protocol, piloted in four countries in 1992, was distributed to 80 potential centres in 1995. Data from 19 centres were received. This paper describes the demographic characteristics of the data from the different centres. Comparison showed significant differences for a number of variables including mean age of completion of the study, response rate, mean gestation, mean birth weight, method of delivery and incidence of admission to neonatal intensive care units. High caesarean section rates identified in the Chinese samples (44 and 40%) were unexpected and have important public health implications. This finding warrants further study but may be related to China's one child policy. We consider that international comparison of child care practice is possible using standardised data collection methods that also allow some individual variation according to local circumstances. However, in view of the heterogeneity of the samples, it will be important to avoid over-interpreting differences identified and to view any differences within the qualitative context of each individual sample. Provided there is acknowledgement of limitations, such ecological studies have potential to produce useful information especially for hypothesis generation. PMID:10390090

  1. Comparative Lifecycle Energy Analysis: Theory and Practice.

    ERIC Educational Resources Information Center

    Morris, Jeffrey; Canzoneri, Diana

    1992-01-01

    Explores the position that more energy is conserved through recycling secondary materials than is generated from municipal solid waste incineration. Discusses one component of a lifecycle analysis--a comparison of energy requirements for manufacturing competing products. Includes methodological issues, energy cost estimates, and difficulties

  3. Translational Behavior Analysis and Practical Benefits

    ERIC Educational Resources Information Center

    Pilgrim, Carol

    2011-01-01

    In his article, Critchfield ("Translational Contributions of the Experimental Analysis of Behavior," "The Behavior Analyst," v34, p3-17, 2011) summarizes a previous call (Mace & Critchfield, 2010) for basic scientists to reexamine the inspiration for their research and turn increasingly to translational approaches. Interestingly, rather than

  5. Progress testing: critical analysis and suggested practices.

    PubMed

    Albanese, Mark; Case, Susan M

    2016-03-01

    Educators have long lamented the tendency of students to engage in rote memorization in preparation for tests rather than engaging in deep learning where they attempt to gain meaning from their studies. Rote memorization driven by objective exams has been termed a steering effect. Progress testing (PT), in which a comprehensive examination sampling all of medicine is administered repeatedly throughout the entire curriculum, was developed with the stated aim of breaking the steering effect of examinations and of promoting deep learning. PT is an approach historically linked to problem-based learning (PBL) although there is a growing recognition of its applicability more broadly. The purpose of this article is to summarize the salient features of PT drawn from the literature, provide a critical review of these features based upon the same literature and psychometric considerations drawn from the Standards for Educational and Psychological Testing and provide considerations of what should be part of best practices in applying PT from an evidence-based and a psychometric perspective. PMID:25662873

  6. Behavioral analysis of tiger night housing practices.

    PubMed

    Miller, Angela; Leighty, Katherine A; Bettinger, Tamara L

    2013-03-01

    The systematic evaluation of changes in animal management practices is critical to ensuring the best possible welfare. Here, we examined the behavioral impacts of intermittently housing our six adult female tigers, who have been housed socially for much of their lives, individually overnight to allow for specialized care required by their advancing age. We looked for behavioral changes indicative of both positive and negative changes in welfare by monitoring time spent asleep, sleeping position, body position while awake, as well as pacing, door pounding, self-grooming, roaring, and chuffing while housed socially as compared to individually overnight. Housing condition did not affect time spent asleep, sleeping positions assumed or the more preferred body positions while awake. Further, pacing, door-pounding, and roaring were infrequent and not altered by housing condition. Self-grooming did increase when housed individually but no evidence of over-grooming was present and chuffing, a close proximity social vocalization, was more likely to occur when socially housed. Taken together, these findings support the notion that transitioning to individual housing as needed is a viable option for managing cats accustomed to being maintained in a social group. PMID:23322596

  7. Methods for Cancer Epigenome Analysis

    PubMed Central

    Nagarajan, Raman P.; Fouse, Shaun D.; Bell, Robert J.A.; Costello, Joseph F.

    2014-01-01

    Accurate detection of epimutations in tumor cells is crucial for understanding the molecular pathogenesis of cancer. Alterations in DNA methylation in cancer are functionally important and clinically relevant, but even this well-studied area is continually re-evaluated in light of unanticipated results, including a strong connection between aberrant DNA methylation in adult tumors and polycomb group profiles in embryonic stem cells, cancer-associated genetic mutations in epigenetic regulators such as DNMT3A and TET family genes, and the discovery of abundant 5-hydroxymethylcytosine, a product of TET proteins acting on 5-methylcytosine, in human tissues. The abundance and distribution of covalent histone modifications in primary cancer tissues relative to normal cells is a largely uncharted area, although there is good evidence for a mechanistic role of cancer-specific alterations in epigenetic marks in tumor etiology, drug response and tumor progression. Meanwhile, the discovery of new epigenetic marks continues, and there are many useful methods for epigenome analysis applicable to primary tumor samples, in addition to cancer cell lines. For DNA methylation and hydroxymethylation, next-generation sequencing allows increasingly inexpensive and quantitative whole-genome profiling. Similarly, the refinement and maturation of chromatin immunoprecipitation with next-generation sequencing (ChIP-seq) has made possible genome-wide mapping of histone modifications, open chromatin and transcription factor binding sites. Computational tools have been developed apace with these epigenome methods to better enable the accuracy and interpretation of the data from the profiling methods. PMID:22956508

  8. Practical Analysis of Genome Contact Interaction Experiments.

    PubMed

    Carty, Mark A; Elemento, Olivier

    2016-01-01

    The three-dimensional (3D) architecture of chromosomes is not random but instead tightly organized due to chromatin folding and chromatin interactions between genomically distant loci. By bringing genomically distant functional elements such as enhancers and promoters into close proximity, these interactions play a key role in regulating gene expression. Some of these interactions are dynamic, that is, they differ between cell types and conditions and can be induced by specific stimuli or differentiation events. Other interactions are more structural and stable, that is, they are constitutively present across several cell types. Genome contact interactions can occur via recruitment and physical interaction between chromatin-binding proteins and correlate with epigenetic marks such as histone modifications. Absence of a contact can occur due to the presence of insulators, that is, chromatin-bound complexes that physically separate genomic loci. Understanding which contacts occur or do not occur in a given cell type is important since it can help explain how genes are regulated and which functional elements are involved in such regulation. The analysis of genome contact interactions has been greatly facilitated by the relatively recent development of chromosome conformation capture (3C). In an even more recent development, 3C was combined with next-generation sequencing and led to Hi-C, a technique that in theory queries all possible pairwise interactions both within the same chromosome (intra) and between chromosomes (inter). Hi-C has now been used to study genome contact interactions in several human and mouse cell types as well as in model organisms such as Drosophila and yeast. While it is fair to say that Hi-C has revolutionized the study of chromatin interactions, the computational analysis of Hi-C data is extremely challenging due to the presence of biases, artifacts, random polymer ligation and the huge number of potential pairwise interactions. 
In this chapter, we outline a strategy for analysis of genome contact experiments based on Hi-C using R and Bioconductor. PMID:27008015
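    The chapter's pipeline uses R and Bioconductor; as a language-agnostic illustration, one standard bias-removal step, iteratively balancing the symmetric contact matrix so every bin ends up with equal total coverage (ICE-style correction), can be sketched as follows. The toy matrix is invented, and real pipelines also filter low-coverage bins first.

```python
import numpy as np

def ice_balance(C, n_iter=200):
    """ICE-style matrix balancing of a symmetric Hi-C contact matrix:
    repeatedly divide by (normalized) row/column coverage until all bins
    have approximately equal total contacts. Returns the balanced matrix
    and the accumulated per-bin bias vector."""
    M = C.astype(float)
    bias = np.ones(M.shape[0])
    for _ in range(n_iter):
        cov = M.sum(axis=1)
        cov = cov / cov.mean()        # keep the overall scale fixed
        M = M / np.outer(cov, cov)    # symmetric correction
        bias *= cov
    return M, bias

# Toy symmetric contact map with one over-covered bin (index 0).
C = np.array([[10.0, 8.0, 6.0],
              [8.0, 4.0, 3.0],
              [6.0, 3.0, 2.0]])
balanced, bias = ice_balance(C)
```

    After balancing, differences between entries reflect genuine contact enrichment rather than per-bin coverage biases such as mappability or restriction-site density.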

  9. Comparison of four teaching methods on Evidence-based Practice skills of postgraduate nursing students.

    PubMed

    Fernandez, Ritin S; Tran, Duong Thuy; Ramjan, Lucie; Ho, Carey; Gill, Betty

    2014-01-01

    The aim of this study was to compare four teaching methods on the evidence-based practice knowledge and skills of postgraduate nursing students. Students enrolled in the Evidence-based Nursing (EBN) unit in Australia and Hong Kong in 2010 and 2011 received education via either the standard distance teaching method, computer laboratory teaching method, Evidence-based Practice-Digital Video Disc (EBP-DVD) teaching method or the didactic classroom teaching method. Evidence-based Practice (EBP) knowledge and skills were evaluated using student assignments that comprised validated instruments. One-way analysis of covariance was implemented to assess group differences on outcomes after controlling for the effects of age and grade point average (GPA). Data were obtained from 187 students. The crude mean score among students receiving the standard+DVD method of instruction was higher for developing a precise clinical question (8.1±0.8) and identifying the level of evidence (4.6±0.7) compared to those receiving other teaching methods. These differences were statistically significant after controlling for age and grade point average. Significant improvement in cognitive and technical EBP skills can be achieved for postgraduate nursing students by integrating a DVD as part of the EBP teaching resources. The EBP-DVD is an easy teaching method to improve student learning outcomes and ensure that external students receive equivalent and quality learning experiences. PMID:23107585

  10. Flow analysis system and method

    NASA Technical Reports Server (NTRS)

    Hill, Wayne S. (Inventor); Barck, Bruce N. (Inventor)

    1998-01-01

    A non-invasive flow analysis system and method wherein a sensor, such as an acoustic sensor, is coupled to a conduit for transmitting a signal which varies depending on the characteristics of the flow in the conduit. The signal is amplified and there is a filter, responsive to the sensor signal, and tuned to pass a narrow band of frequencies proximate the resonant frequency of the sensor. A demodulator generates an amplitude envelope of the filtered signal and a number of flow indicator quantities are calculated based on variations in amplitude of the amplitude envelope. A neural network, or its equivalent, is then used to determine the flow rate of the flow in the conduit based on the flow indicator quantities.
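    The demodulation step described in the record above can be sketched in a few lines. The following is a simplified illustration, not the patented implementation: it uses an FFT-based band-pass filter and the analytic signal to recover the amplitude envelope, and the band limits, sampling rate, and test signal are all invented for the example.

    ```python
    import numpy as np

    def amplitude_envelope(signal, fs, band):
        """Band-pass the sensor signal around an assumed resonant frequency in
        the frequency domain, then recover the amplitude envelope from the
        analytic signal (FFT-based Hilbert transform)."""
        n = len(signal)
        freqs = np.fft.fftfreq(n, d=1.0 / fs)
        spectrum = np.fft.fft(signal)
        spectrum[(np.abs(freqs) < band[0]) | (np.abs(freqs) > band[1])] = 0.0
        # Analytic signal: zero the negative frequencies, double the positive ones
        h = np.zeros(n)
        h[0] = 1.0
        h[1:(n + 1) // 2] = 2.0
        if n % 2 == 0:
            h[n // 2] = 1.0
        return np.abs(np.fft.ifft(spectrum * h))

    # Synthetic sensor signal: a 5 kHz "resonance" whose amplitude is modulated at 40 Hz
    fs = 50_000
    t = np.arange(0, 0.2, 1.0 / fs)
    modulation = 1.0 + 0.3 * np.sin(2 * np.pi * 40 * t)
    signal = modulation * np.sin(2 * np.pi * 5000 * t)
    env = amplitude_envelope(signal, fs, band=(4000, 6000))

    # "Flow indicator quantities" are then statistics of the envelope variation,
    # which a downstream model (e.g. a neural network) maps to a flow rate:
    indicators = {"mean": env.mean(), "std": env.std(), "range": env.max() - env.min()}
    ```

    For this synthetic signal the recovered envelope tracks the 40 Hz modulation closely; in the patented system the mapping from such indicator quantities to flow rate is learned rather than computed analytically.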

  11. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  12. Reliability and cost analysis methods

    NASA Technical Reports Server (NTRS)

    Suich, Ronald C.

    1991-01-01

    In the design phase of a system, how does a design engineer or manager choose between a subsystem with .990 reliability and a more costly subsystem with .995 reliability? When is the increased cost justified? High reliability is not necessarily an end in itself but may be desirable in order to reduce the expected cost due to subsystem failure. However, this may not be the wisest use of funds since the expected cost due to subsystem failure is not the only cost involved. The subsystem itself may be very costly. We should not consider either the cost of the subsystem or the expected cost due to subsystem failure separately but should minimize the total of the two costs, i.e., the total of the cost of the subsystem plus the expected cost due to subsystem failure. This final report discusses the Combined Analysis of Reliability, Redundancy, and Cost (CARRAC) methods which were developed under Grant Number NAG 3-1100 from the NASA Lewis Research Center. CARRAC methods and a CARRAC computer program employ five models which can be used to cover a wide range of problems. The models contain an option which can include repair of failed modules.
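    The decision rule described above can be made concrete with a small worked example; all of the cost figures below are hypothetical.

    ```python
    # Choose the subsystem that minimizes total expected cost
    # = subsystem cost + expected cost due to subsystem failure,
    # rather than maximizing reliability alone.
    def total_expected_cost(subsystem_cost, reliability, failure_cost):
        return subsystem_cost + (1.0 - reliability) * failure_cost

    failure_cost = 2_000_000  # assumed cost incurred if the subsystem fails
    option_a = total_expected_cost(50_000, 0.990, failure_cost)  # cheaper, 0.990
    option_b = total_expected_cost(90_000, 0.995, failure_cost)  # costlier, 0.995

    # option_a = 50_000 + 0.010 * 2_000_000 = 70_000
    # option_b = 90_000 + 0.005 * 2_000_000 = 100_000
    # Here the cheaper, less reliable subsystem wins. With failure_cost =
    # 10_000_000 the ranking reverses (150_000 vs 140_000), so the answer to
    # "when is the increased cost justified?" depends on the cost of failure.
    ```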

  13. Analysis Methods of Magnesium Chips

    NASA Astrophysics Data System (ADS)

    Ohmann, Sven; Ditze, André; Scharf, Christiane

    2015-11-01

    The quality of recycled magnesium from chips depends strongly on their exposure to inorganic and organic impurities added during the production processes. Different kinds of magnesium chips from these processes were analyzed by several methods, and the accuracy and effectiveness of the methods are discussed. The results show that the chips belong to the AZ91, AZ31, AM50/60, or AJ62 alloy, although some kinds of chips deviate from these alloy specifications. Impurities result mainly from transition metals and lime. The water and oil content does not exceed 25%, and the chip size is not more than 4 mm in diameter. The sieve analysis shows good results for oily and wet chips. For the determination of oil and water, Soxhlet extraction gives better results than the addition of lime or vacuum distillation. The most accurate values for water and oil are obtained by drying at 110°C (for water) and by washing with acetone by hand (for oil).

  14. Meta-analysis of family-centered helpgiving practices research.

    PubMed

    Dunst, Carl J; Trivette, Carol M; Hamby, Deborah W

    2007-01-01

    A meta-analysis of 47 studies investigating the relationship between family-centered helpgiving practices and parent, family, and child behavior and functioning is reported. The studies included more than 11,000 participants from seven different countries. Data analysis was guided by a practice-based theory of family-centered helpgiving that hypothesized direct effects of relational and participatory helpgiving practices on self-efficacy beliefs and parent, family, and child outcomes. Results showed that the large majority of outcomes were related to helpgiving practices, with the strongest influences on outcomes most proximal and contextual to help giver/help receiver exchanges. Findings are placed in the context of a broader-based social systems framework of early childhood intervention and family support. PMID:17979208

  15. SAR/QSAR methods in public health practice

    SciTech Connect

    Demchuk, Eugene Ruiz, Patricia; Chou, Selene; Fowler, Bruce A.

    2011-07-15

    Methods of (Quantitative) Structure-Activity Relationship ((Q)SAR) modeling play an important and active role in ATSDR programs in support of the Agency mission to protect human populations from exposure to environmental contaminants. They are used for cross-chemical extrapolation to complement the traditional toxicological approach when chemical-specific information is unavailable. SAR and QSAR methods are used to investigate adverse health effects and exposure levels, bioavailability, and pharmacokinetic properties of hazardous chemical compounds. They are applied as a part of an integrated systematic approach in the development of Health Guidance Values (HGVs), such as ATSDR Minimal Risk Levels, which are used to protect populations exposed to toxic chemicals at hazardous waste sites. (Q)SAR analyses are incorporated into ATSDR documents (such as the toxicological profiles and chemical-specific health consultations) to support environmental health assessments, prioritization of environmental chemical hazards, and to improve study design, when filling the priority data needs (PDNs) as mandated by Congress, in instances when experimental information is insufficient. These cases are illustrated by several examples, which explain how ATSDR applies (Q)SAR methods in public health practice.

  16. Methods of prescribing relative exercise intensity: physiological and practical considerations.

    PubMed

    Mann, Theresa; Lamberts, Robert Patrick; Lambert, Michael Ian

    2013-07-01

    Exercise prescribed according to relative intensity is a routine feature in the exercise science literature and is intended to produce an approximately equivalent exercise stress in individuals with different absolute exercise capacities. The traditional approach has been to prescribe exercise intensity as a percentage of maximal oxygen uptake (VO2max) or maximum heart rate (HRmax), and these methods remain common in the literature. However, exercise intensity prescribed at a %VO2max or %HRmax does not necessarily place individuals at an equivalent intensity above resting levels. Furthermore, some individuals may be above and others below metabolic thresholds such as the aerobic threshold (AerT) or anaerobic threshold (AnT) at the same %VO2max or %HRmax. For these reasons, some authors have recommended that exercise intensity be prescribed relative to oxygen consumption reserve (VO2R), heart rate reserve (HRR), the AerT, or the AnT rather than relative to VO2max or HRmax. The aim of this review was to compare the physiological and practical implications of using each of these methods of relative exercise intensity prescription for research trials or training sessions. It is well established that an exercise bout at a fixed %VO2max or %HRmax may produce interindividual variation in blood lactate accumulation, and a similar effect has been shown when relating exercise intensity to VO2R or HRR. Although individual variation in other markers of metabolic stress has seldom been reported, it is assumed that these responses would be similarly heterogeneous at a %VO2max, %HRmax, %VO2R, or %HRR of moderate-to-high intensity. In contrast, exercise prescribed relative to the AerT or AnT would be expected to produce less individual variation in metabolic responses and less individual variation in time to exhaustion at a constant exercise intensity. 
Furthermore, it would be expected that training prescribed relative to the AerT or AnT would provide a more homogeneous training stimulus than training prescribed as a %VO2max. However, many of these theoretical advantages of threshold-related exercise prescription have yet to be directly demonstrated. On a practical level, the use of threshold-related exercise prescription has distinct disadvantages compared to the use of %VO2max or %HRmax. Thresholds determined from single incremental tests cannot be assumed to be accurate in all individuals without verification trials. Verification trials would involve two or three additional laboratory visits and would add considerably to the testing burden on both the participant and researcher. Threshold determination and verification would also involve blood lactate sampling, which is aversive to some participants and has a number of intrinsic and extrinsic sources of variation. Threshold measurements also tend to show higher day-to-day variation than VO2max or HRmax. In summary, each method of prescribing relative exercise intensity has both advantages and disadvantages when theoretical and practical considerations are taken into account. It follows that the most appropriate method of relative exercise intensity prescription may vary with factors such as exercise intensity, number of participants, and participant characteristics. Considering a method's limitations as well as its advantages, together with increased reporting of individual exercise responses, will facilitate accurate interpretation of findings and help to identify areas for further study. PMID:23620244
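The %HRR method discussed above corresponds to the well-known Karvonen formula. A minimal sketch showing why a fixed %HRmax and a fixed %HRR prescribe different absolute heart rates for individuals with different resting heart rates (the heart rates below are illustrative):

```python
def target_hr_percent_hrmax(hr_max, fraction):
    """Intensity as a simple percentage of maximum heart rate."""
    return fraction * hr_max

def target_hr_percent_hrr(hr_max, hr_rest, fraction):
    """Heart rate reserve (Karvonen) method: the fraction is applied to the
    reserve above resting, so two people at '70%' sit at the same relative
    elevation above their own resting heart rate."""
    return hr_rest + fraction * (hr_max - hr_rest)

# Two individuals with the same HRmax but different resting heart rates:
athlete = target_hr_percent_hrr(hr_max=190, hr_rest=45, fraction=0.70)    # 146.5 bpm
sedentary = target_hr_percent_hrr(hr_max=190, hr_rest=75, fraction=0.70)  # 155.5 bpm
same_pct_max = target_hr_percent_hrmax(hr_max=190, fraction=0.70)         # 133.0 bpm
```

At 70% HRR the two individuals receive different absolute targets (146.5 vs 155.5 bpm), both above the single 133 bpm target that 70% HRmax would give; this is exactly the "equivalent intensity above resting levels" issue the review discusses.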

  17. Practical methods for retrace error correction in nonnull aspheric testing.

    PubMed

    Liu, Dong; Yang, Yongying; Tian, Chao; Luo, Yongjie; Wang, Lin

    2009-04-27

    Non-null tests are often adopted for aspheric testing. However, because the null condition is violated, the test rays follow different paths from the reference rays and aberrations from the interferometer do not cancel out, leading to a large difference between the measured surface figure and the real one; this difference is called retrace error. In this paper, retrace error in non-null aspheric testing is analyzed in detail, with the conclusion that it depends strongly on the aperture, F-number, and surface shape error of the aspheric under test. Correction methods are proposed according to the behavior of the retrace errors. Both computer simulations and experimental results show that the proposed methods correct retrace error effectively. The analysis and proposed correction methods contribute substantially to the application of non-null aspheric testing. PMID:19399077

  18. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, applied to the skin of a human foot and face. The full source code of the developed application is provided as an attachment. PMID:26556680

  19. Reducing alcohol consumption. Comparing three brief methods in family practice.

    PubMed Central

    McIntosh, M. C.; Leigh, G.; Baldwin, N. J.; Marmulak, J.

    1997-01-01

    OBJECTIVE: To compare the effects of three brief methods of reducing alcohol consumption among family practice patients. DESIGN: Patients randomly assigned to one of three interventions were assessed initially and at 3-, 6-, and 12-month follow-up appointments. SETTING: Family practice clinic composed of 12 primary care physicians seeing approximately 6000 adults monthly in a small urban community, population 40,000. PARTICIPANTS: Through a screening questionnaire, 134 men and 131 women were identified as hazardous drinkers (five or more drinks at least once monthly) during an 11-month screening of 1420 patients. Of 265 patients approached, 180 agreed to participate and 159 (83 men and 76 women) actually participated in the study. INTERVENTIONS: Three interventions were studied: brief physician advice (5 minutes), two 30-minute sessions with a physician using cognitive behavioural strategies or two 30-minute sessions with a nurse practitioner using identical strategies. MAIN OUTCOME MEASURES: Quantity and frequency (QF) of drinking were used to assess reduction in hazardous drinking and problems related to drinking over 12 months of follow up. RESULTS: No statistical difference between groups was found. The QF of monthly drinking was reduced overall by 66% (among men) and 74% (among women) for those reporting at least one hazardous drinking day weekly at assessment (N = 96). Men reported drinking significantly more than women. CONCLUSIONS: These results indicated that offering brief, specific advice can motivate patients to reduce their alcohol intake. There was no difference in effect between brief advice from their own physician or brief intervention by a physician or a nurse. PMID:9386883

  20. Systems analysis and design methodologies: practicalities and use in today's information systems development efforts.

    PubMed

    Jerva, M

    2001-05-01

    Historically, systems analysis and design methodologies have been used as a guide in software development. Such methods provide structure to software engineers in their efforts to create quality solutions in the real world of information systems. This article looks at the elements that constitute a systems analysis methodology and examines the historical development of systems analysis in software development. It concludes with observations on the strengths and weaknesses of four methodologies and the state of the art of practice today. PMID:11378979

  1. A practical method for determining organ dose during CT examination.

    PubMed

    Cheung, Tsang; Cheng, Qijun; Feng, Dinghua

    2007-02-01

    A practical method, based on depth dose, for determining organ dose during computed tomography (CT) examination is presented. For 4-slice spiral CT scans, depth doses were measured at radii of 0, 37.5, 75.0, 112.5, and 150.0 mm using thermoluminescent dosimeters (TLDs) inserted into a modified International Electrotechnical Commission (IEC) standard dosimetry phantom, with additional TLDs placed on the surface of the phantom. A regression equation linking dose with distance from the center of the phantom has been formulated, from which the dose at a point of interest relative to the surface dose can be calculated. The approximation reflects the attenuation properties of X-rays in the phantom. Using the equation, an estimate of organ dose can be obtained for a CT examination, assuming water equivalence of human tissue and a known organ position and volume. Using the 4-slice spiral scanner, relative doses to a patient's lung were calculated, the location and size of the lung in vivo being found from the CT scan image, and the lung being divided into 38 segments to calculate the relative dose. Results from our test case show the dose to the lung to have been 69+/-13% of the surface dose. PMID:16979343
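    The regression step described above can be sketched as follows. This is an illustration, not the authors' actual equation: it assumes a simple exponential fall-off of dose with depth below the phantom surface, and the TLD readings are invented for the example.

    ```python
    import numpy as np

    # Hypothetical TLD readings (mGy) at the five measurement radii (mm)
    # mentioned in the abstract; the dose values are invented.
    radius = np.array([0.0, 37.5, 75.0, 112.5, 150.0])  # 150 mm = phantom surface
    dose = np.array([8.2, 9.0, 10.4, 12.1, 14.3])

    # Assume dose falls off exponentially with depth below the surface:
    #   D(depth) = D_surface * exp(-mu_eff * depth)
    depth = radius.max() - radius
    slope, intercept = np.polyfit(depth, np.log(dose), 1)
    mu_eff = -slope  # effective attenuation coefficient (1/mm)

    def relative_dose(r):
        """Dose at radius r, as a fraction of the surface dose."""
        return float(np.exp(-mu_eff * (radius.max() - r)))

    center_fraction = relative_dose(0.0)  # fraction of surface dose at the center
    ```

    Summing `relative_dose` over the segments of an organ (38 lung segments in the paper's test case) then yields an organ dose relative to the measured surface dose.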

  2. Practical Framework: Implementing OEE Method in Manufacturing Process Environment

    NASA Astrophysics Data System (ADS)

    Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.

    2016-02-01

    A manufacturing process environment requires reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to operate and produce a quality product at its maximum designed capability. In practice, however, machines are often unable to achieve the desired performance. Since this performance affects the productivity of the system, a measurement technique should be applied, and Overall Equipment Effectiveness (OEE) is a good method for measuring machine performance. The reliable results produced by OEE can then be used to propose suitable corrective actions. Many published papers discuss the purpose and benefits of OEE, covering the 'what' and 'why' factors; the 'how' factor, namely the implementation of OEE in a manufacturing process environment, has received far less attention. This paper therefore presents a practical framework for implementing OEE, and a case study is discussed to explain each of the proposed steps in detail. The proposed framework is beneficial to engineers, especially beginners, who want to start measuring machine performance and then improve it.
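    The paper itself focuses on the implementation framework, but the underlying OEE calculation is standard and can be sketched as follows (the shift figures below are invented for illustration):

    ```python
    def oee(planned_time, downtime, ideal_cycle_time, total_count, good_count):
        """Textbook OEE decomposition (not specific to the paper's framework):
        OEE = availability x performance x quality."""
        run_time = planned_time - downtime
        availability = run_time / planned_time           # uptime losses
        performance = (ideal_cycle_time * total_count) / run_time  # speed losses
        quality = good_count / total_count               # defect losses
        return availability * performance * quality

    # Example shift: 480 min planned, 47 min of downtime, 1.0 min ideal cycle
    # time, 400 parts produced, of which 380 were good.
    score = oee(planned_time=480, downtime=47, ideal_cycle_time=1.0,
                total_count=400, good_count=380)
    # availability ~ 0.902, performance ~ 0.924, quality = 0.95, OEE ~ 0.792
    ```

    Decomposing the score this way is what makes OEE actionable: it shows whether availability, speed, or quality losses dominate, and hence where corrective action should focus.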

  3. A Model of Practice in Special Education: Dynamic Ecological Analysis

    ERIC Educational Resources Information Center

    Hannant, Barbara; Lim, Eng Leong; McAllum, Ruth

    2010-01-01

    Dynamic Ecological Analysis (DEA) is a model of practice which increases a team's efficacy by enabling the development of more effective interventions through collaboration and collective reflection. This process has proved to be useful in: a) clarifying thinking and problem-solving, b) transferring knowledge and thinking to significant parties,…

  4. [Practice analysis: culture shock and adaptation at work].

    PubMed

    Philippe, Séverine; Didry, Pascale

    2015-12-01

    Constructed as a practice analysis, this personal account presents the reflection undertaken by a student on placement in Ireland thanks to the Erasmus programme. She describes in detail the stages of her adaptation in a hospital setting which is considerably different to her usual environment. PMID:26654501

  5. Procedural Fidelity: An Analysis of Measurement and Reporting Practices

    ERIC Educational Resources Information Center

    Ledford, Jennifer R.; Wolery, Mark

    2013-01-01

    A systematic analysis was conducted of measurement and reporting practices related to procedural fidelity in single-case research for the past 30 years. Previous reviews of fidelity primarily reported whether fidelity data were collected by authors; these reviews reported that collection was variable, but low across journals and over time. Results…

  7. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    ERIC Educational Resources Information Center

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we…

  8. Developing patient reference groups within general practice: a mixed-methods study

    PubMed Central

    Smiddy, Jane; Reay, Joanne; Peckham, Stephen; Williams, Lorraine; Wilson, Patricia

    2015-01-01

    Background Clinical commissioning groups (CCGs) are required to demonstrate meaningful patient and public engagement and involvement (PPEI). Recent health service reforms have included financial incentives for general practices to develop patient reference groups (PRGs). Aim To explore the impact of the patient participation direct enhanced service (DES) on the development of PRGs, the influence of PRGs on decision making within general practice, and their interface with CCGs. Design and setting A mixed-methods approach within three case study sites in England. Method Three case study sites were tracked for 18 months as part of an evaluation of PPEI in commissioning. A sub-study focused on PRGs using documentary and web-based analysis; results were mapped against the findings of the main study. Results The evidence highlighted variations in the establishment of PRGs, with the proportion of active PRGs visible via practice websites ranging from 27% to 93%. Such groups were given a number of descriptions, such as patient reference groups, patient participation groups, and patient forums. Data analysis highlighted that the mode of operation varied between virtual and tangible groups and between GP-led and patient-led groups; this analysis enabled the construction of a typology of PRGs. The evidence reviewed suggested that groups functioned within the parameters of the DES, with activities limited to practice level. Data analysis also highlighted a lack of strategic vision in relation to such groups, particularly their role within an overall PPEI framework. Conclusion The findings identified diversity in the operationalisation of PRGs. Their development does not appear linked to a strategic vision or overall PPEI framework. Although local pragmatic issues are important to patients, GPs must ensure that PRGs develop strategic direction if health reforms are to be addressed. PMID:25733439

  9. Methods of stability analysis in nonlinear mechanics

    SciTech Connect

    Warnock, R.L.; Ruth, R.D.; Gabella, W.; Ecklund, K.

    1989-01-01

    We review our recent work on methods to study stability in nonlinear mechanics, especially for the problems of particle accelerators, and compare our ideas to those of other authors. We emphasize methods that (1) show promise as practical design tools, (2) are effective when the nonlinearity is large, and (3) have a strong theoretical basis. 24 refs., 2 figs., 2 tabs.

  10. Perceptions of Weight and Health Practices in Hispanic Children: A Mixed-Methods Study

    PubMed Central

    Foster, Byron Alexander; Hale, Daniel

    2015-01-01

    Background. Perception of weight by parents of obese children may be associated with willingness to engage in behavior change. The relationship between parents' perception of their child's weight and their health beliefs and practices is poorly understood, especially among the Hispanic population which experiences disparities in childhood obesity. This study sought to explore the relationship between perceptions of weight and health beliefs and practices in a Hispanic population. Methods. A cross-sectional, mixed-methods approach was used with semistructured interviews conducted with parent-child (2–5 years old) dyads in a primarily Hispanic, low-income population. Parents were queried on their perceptions of their child's health, health practices, activities, behaviors, and beliefs. A grounded theory approach was used to analyze participants' discussion of health practices and behaviors. Results. Forty parent-child dyads completed the interview. Most (58%) of the parents of overweight and obese children misclassified their child's weight status. The qualitative analysis showed that accurate perception of weight was associated with internal motivation and more concrete ideas of what healthy meant for their child. Conclusions. The qualitative data suggest there may be populations at different stages of readiness for change among parents of overweight and obese children, incorporating this understanding should be considered for interventions. PMID:26379715

  11. Failure analysis methods for capacitors

    NASA Technical Reports Server (NTRS)

    Hildenbrand, R. L.

    1981-01-01

    The basic steps in the failure analysis of discrete capacitors used in electronic circuit boards and hybrid assemblies are described. These steps include: visual examination; functional test; disassembly; isolation of the failure site; and documentation.

  12. Prognostic Analysis System and Methods of Operation

    NASA Technical Reports Server (NTRS)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical system health applicable to mechanical, electrical, chemical and optical systems and methods of operating the system are described herein.

  13. Convergence analysis of combinations of different methods

    SciTech Connect

    Kang, Y.

    1994-12-31

    This paper provides a convergence analysis for combinations of different numerical methods for solving systems of differential equations. The author proves that combinations of two convergent linear multistep methods or Runge-Kutta methods produce a new convergent method of which the order is equal to the smaller order of the two original methods.

  14. Adapting the Six Category Intervention Analysis To Promote Facilitative Type Supervisory Feedback in Teaching Practice.

    ERIC Educational Resources Information Center

    Hamid, Bahiyah Abdul; Azman, Hazita

    A discussion of the supervision preservice language teacher trainees focuses on supervisory methods designed to facilitate clear, useful, enabling feedback to the trainee. Specifically, it looks at use of the Six Category Intervention Analysis, a model for interpersonal skills training, for supervision of teaching practice. The model is seen here…

  15. A Meta-Analysis of Published School Social Work Practice Studies: 1980-2007

    ERIC Educational Resources Information Center

    Franklin, Cynthia; Kim, Johnny S.; Tripodi, Stephen J.

    2009-01-01

    Objective: This systematic review examined the effectiveness of school social work practices using meta-analytic techniques. Method: Hierarchical linear modeling software was used to calculate overall effect size estimates as well as test for between-study variability. Results: A total of 21 studies were included in the final analysis.…

  16. Adapting Job Analysis Methodology to Improve Evaluation Practice

    ERIC Educational Resources Information Center

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  18. Nursing documentation and nursing practice: a discourse analysis.

    PubMed

    Heartfield, M

    1996-07-01

    Nursing documentation exists as a daily reality of nurses' work. It is interpreted by some as the evidence of nursing actions and dismissed by others as a misrepresentation of nursing care. This paper reports on a study of nursing documentation as nursing practice. The work of Foucault and discourse analysis provide a research design for examination of how written descriptions of patient events taken from patient case notes result from hegemonic influences that construct a knowledge and therefore a practice of nursing. Discourses as ways of understanding knowledge as language, social practices and power relations are used to identify how nursing documentation functions as a manifestation and ritual of power relations. A focus on body work and fragmented bodies provided details of nursing's participation in the discursive construction of the object patient and invisible nurse. It is through resistances to documentation that alternative knowledge of nursing exists. PMID:8807383

  19. Interprofessional collaborative practice within cancer teams: Translating evidence into action. A mixed methods study protocol

    PubMed Central

    2010-01-01

    Background A regional integrated cancer network has implemented a program (educational workshops, reflective and mentoring activities) designed to support the uptake of evidence-informed interprofessional collaborative practices (referred to in this text as EIPCP) within cancer teams. This research project, which relates to the Registered Nurses' Association of Ontario (RNAO) Best Practice Guidelines and other sources of research evidence, represents a unique opportunity to learn more about the factors and processes involved in the translation of evidence-based recommendations into professional practices. The planned study seeks to address context-specific challenges and the concerns of nurses and other stakeholders regarding the uptake of evidence-based recommendations to effectively promote and support interprofessional collaborative practices. Aim This study aims to examine the uptake of evidence-based recommendations from best practice guidelines intended to enhance interprofessional collaborative practices within cancer teams. Design The planned study constitutes a practical trial, defined as a trial designed to provide comprehensive information that is grounded in real-world healthcare dynamics. An exploratory mixed methods study design will be used. It will involve collecting quantitative data to assess professionals' knowledge and attitudes, as well as practice environment factors associated with effective uptake of evidence-based recommendations. Semi-structured interviews will be conducted concurrently with care providers to gather qualitative data for describing the processes involved in the translation of evidence into action from both the users' (n = 12) and providers' (n = 24) perspectives. The Graham et al. Ottawa Model of Research Use will serve to construct operational definitions of concepts, and to establish the initial coding labels to be used in the thematic analysis of the qualitative data. 
Quantitative and qualitative results will be merged during interpretation to provide complementary perspectives of interrelated contextual factors that enhance the uptake of EIPCP and changes in professional practices. Discussion The information obtained from the study will produce new knowledge on the interventions and sources of support most conducive to the uptake of evidence and building of capacity to sustain new interprofessional collaborative practice patterns. It will provide new information on strategies for overcoming barriers to evidence-informed interventions. The findings will also pinpoint critical determinants of 'what works and why' taking into account the interplay between evidence, operational, relational micro-processes of care, uniqueness of patients' needs and preferences, and the local context. PMID:20626858

  20. Root Cause Analysis: Methods and Mindsets.

    ERIC Educational Resources Information Center

    Kluch, Jacob H.

    This instructional unit is intended for use in training operations personnel and others involved in scram analysis at nuclear power plants in the techniques of root cause analysis. Four lessons are included. The first lesson provides an overview of the goals and benefits of the root cause analysis method. Root cause analysis techniques are covered…

  1. Methods of DNA methylation analysis.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The purpose of this review was to provide guidance for investigators who are new to the field of DNA methylation analysis. Epigenetics is the study of mitotically heritable alterations in gene expression potential that are not mediated by changes in DNA sequence. Recently, it has become clear that n...

  2. Polydispersity analysis of Taylor dispersion data: the cumulant method.

    PubMed

    Cipelletti, Luca; Biron, Jean-Philippe; Martin, Michel; Cottet, Hervé

    2014-07-01

    Taylor dispersion analysis is an increasingly popular characterization method that measures the diffusion coefficient, and hence the hydrodynamic radius, of (bio)polymers, nanoparticles, or even small molecules. In this work, we describe an extension to current data analysis schemes that allows size polydispersity to be quantified for an arbitrary sample, thereby significantly enhancing the potentiality of Taylor dispersion analysis. The method is based on a cumulant development similar to that used for the analysis of dynamic light scattering data. Specific challenges posed by the cumulant analysis of Taylor dispersion data are discussed, and practical ways to address them are proposed. We successfully test this new method by analyzing both simulated and experimental data for solutions of moderately polydisperse polymers and polymer mixtures. PMID:24937011
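The cumulant extension itself is developed in the paper; but the first two cumulants (mean and variance) of a taylorgram peak already yield the basic monodisperse estimate via Taylor's relation D = r²·t₀/(24·σ²). A minimal sketch, with an illustrative capillary radius and a simulated peak (none of these values are from the paper):

```python
import numpy as np

def taylor_dispersion_D(t, signal, r_capillary):
    """Estimate the diffusion coefficient from a taylorgram peak.

    Uses the classic Taylor relation D = r^2 * t0 / (24 * sigma^2),
    where t0 is the mean elution time and sigma^2 the temporal
    variance of the peak (valid for a monodisperse sample).
    """
    w = signal / signal.sum()            # normalize the peak to a weight distribution
    t0 = np.sum(w * t)                   # first cumulant: mean elution time
    var = np.sum(w * (t - t0) ** 2)      # second cumulant: peak variance
    return r_capillary**2 * t0 / (24.0 * var)

# simulated monodisperse taylorgram (assumed, illustrative parameters)
r = 50e-6                    # capillary radius, m
D_true = 1e-10               # m^2/s
t0 = 200.0                   # mean elution time, s
sigma = np.sqrt(r**2 * t0 / (24 * D_true))
t = np.linspace(0, 400, 4001)
signal = np.exp(-(t - t0) ** 2 / (2 * sigma**2))
print(taylor_dispersion_D(t, signal, r))  # recovers ~1e-10
```

Quantifying polydispersity requires going beyond the second cumulant, which is where the analysis described in the abstract comes in.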

  3. The practical implementation of integrated safety management for nuclear safety analysis and fire hazards analysis documentation

    SciTech Connect

    COLLOPY, M.T.

    1999-05-04

In 1995 Mr. Joseph DiNunno of the Defense Nuclear Facilities Safety Board issued an approach to describe the concept of an integrated safety management program which incorporates hazard and safety analysis to address a multitude of hazards affecting the public, worker, property, and the environment. Since then, the U.S. Department of Energy (DOE) has adopted a policy to systematically integrate safety into management and work practices at all levels so that missions can be completed while protecting the public, worker, and the environment. While the DOE and its contractors possessed a variety of processes for analyzing fire hazards at a facility, activity, and job, the outcomes and assumptions of these processes have not always been consistent for similar types of hazards within the safety analysis and the fire hazard analysis. Although the safety analysis and the fire hazard analysis are driven by different DOE Orders and requirements, these analyses should not be entirely independent, and their preparation should be integrated to ensure consistency of assumptions, consequences, design considerations, and other controls. Under the DOE policy to implement an integrated safety management system, identified hazards must be evaluated and agreed upon to ensure that the public, the workers, and the environment are protected from adverse consequences. The DOE program and contractor management need a uniform, up-to-date reference with which to plan, budget, and manage nuclear programs. It is crucial that DOE understand the hazards and risks necessary to authorize the work to be performed. If integrated safety management is not incorporated into the preparation of the safety analysis and the fire hazard analysis, inconsistencies between assumptions, consequences, design considerations, and controls may occur that affect safety. Furthermore, such inconsistencies may create confusion in the DOE process to grant authorization of the work. 
In accordance with the integrated safety management system approach of having a uniform and consistent process, a method has been suggested by the U.S. Department of Energy at Richland and in the Project Hanford Procedures for cases when fire hazard analyses and safety analyses are required. This process provides a common-basis approach to the development of the fire hazard analysis and the safety analysis, and permits the preparers of both documents to jointly participate in the hazard analysis process. This paper presents this method of implementing the integrated safety management approach in the development of the fire hazard analysis and safety analysis, providing the consistency of assumptions, consequences, design considerations, and other controls necessary to protect workers, the public, and the environment.

  4. Microparticle analysis system and method

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    2007-01-01

    A device for analyzing microparticles is provided which includes a chamber with an inlet and an outlet for respectively introducing and dispensing a flowing fluid comprising microparticles, a light source for providing light through the chamber and a photometer for measuring the intensity of light transmitted through individual microparticles. The device further includes an imaging system for acquiring images of the fluid. In some cases, the device may be configured to identify and determine a quantity of the microparticles within the fluid. Consequently, a method for identifying and tracking microparticles in motion is contemplated herein. The method involves flowing a fluid comprising microparticles in laminar motion through a chamber, transmitting light through the fluid, measuring the intensities of the light transmitted through the microparticles, imaging the fluid a plurality of times and comparing at least some of the intensities of light between different images of the fluid.

  5. Current status of methods for shielding analysis

    SciTech Connect

    Engle, W.W.

    1980-01-01

    Current methods used in shielding analysis and recent improvements in those methods are discussed. The status of methods development is discussed based on needs cited at the 1977 International Conference on Reactor Shielding. Additional areas where methods development is needed are discussed.

  6. Practical Aspects of the Equation-Error Method for Aircraft Parameter Estimation

    NASA Technical Reports Server (NTRS)

Morelli, Eugene A.

    2006-01-01

    Various practical aspects of the equation-error approach to aircraft parameter estimation were examined. The analysis was based on simulated flight data from an F-16 nonlinear simulation, with realistic noise sequences added to the computed aircraft responses. This approach exposes issues related to the parameter estimation techniques and results, because the true parameter values are known for simulation data. The issues studied include differentiating noisy time series, maximum likelihood parameter estimation, biases in equation-error parameter estimates, accurate computation of estimated parameter error bounds, comparisons of equation-error parameter estimates with output-error parameter estimates, analyzing data from multiple maneuvers, data collinearity, and frequency-domain methods.
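At its core, the equation-error approach is ordinary least squares on the state equations: the measured state derivative is regressed on the states and controls. A toy sketch with an invented one-degree-of-freedom pitch model (the coefficients and noise level are illustrative assumptions; the paper's F-16 simulation is far richer):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy pitch dynamics: q_dot = M_alpha*alpha + M_q*q + M_de*delta_e
M_true = np.array([-4.0, -1.2, -7.5])           # assumed "true" derivatives
n = 2000
X = rng.normal(size=(n, 3))                      # alpha, q, delta_e time histories
q_dot = X @ M_true + 0.1 * rng.normal(size=n)    # "measured" derivative with noise

# Equation-error estimation: ordinary least squares on the state equation
M_hat, *_ = np.linalg.lstsq(X, q_dot, rcond=None)
print(M_hat)   # close to M_true
```

Note that when the regressors themselves are noisy (as with differentiated flight data), the estimates are biased, which is one of the practical issues the paper examines.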

  7. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1931-01-01

The revised report includes the chart for the analysis of aircraft accidents, combining consideration of the immediate causes, underlying causes, and results of accidents, as prepared by the special committee, with a number of the definitions clarified. Also included are a brief statement of the organization and work of the special committee and of the Committee on Aircraft Accidents, and statistical tables comparing the types and causes of accidents in the military services with those in civil aviation, together with explanations of some of the important differences noted in these tables.

  8. Methods used by Dental Practice-Based Research Network (DPBRN) dentists to diagnose dental caries

    PubMed Central

    Gordan, Valeria V.; Riley, Joseph L; Carvalho, Ricardo M.; Snyder, John; Sanderson, James L; Anderson, Mary; Gilbert, Gregg H.

    2010-01-01

    Objectives To (1) identify the methods that dentists in The Dental Practice-Based Research Network (DPBRN) use to diagnose dental caries; (2) quantify their frequency of use; and (3) test the hypothesis that certain dentist and dental practice characteristics are significantly associated with their use. Methods A questionnaire about methods used for caries diagnosis was sent to DPBRN dentists who reported doing at least some restorative dentistry; 522 dentists participated. Questions included use of dental radiographs, dental explorer, laser fluorescence, air-drying, fiber optic devices, and magnification, as used when diagnosing primary, secondary/recurrent, or non-specific caries lesions. Variations on the frequency of their use were tested using multivariate analysis and Bonferroni tests. Results Overall, the dental explorer was the instrument most commonly used to detect primary occlusal caries as well as to detect caries at the margins of existing restorations. In contrast, laser fluorescence was rarely used to help diagnose occlusal primary caries. For proximal caries, radiographs were used to help diagnose 75-100% of lesions by 96% of the DPBRN dentists. Dentists who use radiographs most often to assess proximal surfaces of posterior teeth, were significantly more likely to also report providing a higher percentage of patients with individualized caries prevention (p = .040) and seeing a higher percentage of pediatric patients (p = .001). Conclusion Use of specific diagnostic methods varied substantially. The dental explorer and radiographs are still the most commonly used diagnostic methods. PMID:21488724

  9. Coal Field Fire Fighting - Practiced methods, strategies and tactics

    NASA Astrophysics Data System (ADS)

    Wündrich, T.; Korten, A. A.; Barth, U. H.

    2009-04-01

Subsurface coal fires destroy millions of tons of coal each year, have an immense impact on the ecological surroundings, and threaten further coal reserves. Due to the enormous dimensions a coal seam fire can develop, extinguishing it involves high operational expense. As part of the Sino-German coal fire research initiative "Innovative technologies for exploration, extinction and monitoring of coal fires in Northern China", the research team of the University of Wuppertal (BUW) focuses on fire extinction strategies and tactics as well as aspects of environmental and health safety. Besides the choice and correct application of different extinction techniques, further factors are essential for successful extinction. Appropriate tactics, well-trained and protected personnel, and the choice of the best-fitting extinguishing agents are necessary for the successful extinction of a coal seam fire. The chosen strategy for an extinction campaign is generally determined by urgency and importance. It may depend on national objectives and concepts of coal conservation; on environmental protection (e.g. commitments to greenhouse gas (GHG) reductions); on national funding and resources for fire fighting (e.g. personnel, infrastructure, vehicles, water pipelines); and on computer-aided models and simulations of coal fire development from self-ignition to extinction. In order to devise an optimal fire fighting strategy, "aims of protection" have to be defined in a first step. These may be: - directly affected coal seams; - neighboring seams and coalfields; - GHG emissions into the atmosphere; - returns on investment (costs of fire fighting compared to the value of saved coal). In a further step, it is imperative to decide whether the budget shall define the results, or the results define the budget; i.e. 
whether there are fixed objectives for the mission that will dictate the overall budget, or whether the limited resources available shall set the scope within which the best possible results are to be achieved. For effective and efficient fire fighting, optimal tactics are required; these can be divided into four fundamental tactics to control fire hazards: - Defense (digging away the coal so that it cannot begin to burn, or forming a barrier so that the fire cannot reach the unburnt coal); - Rescue the coal (mining of a seam that is not burning); - Attack (active and direct cooling of the burning seam); - Retreat (monitoring only, until self-extinction of a burning seam). The last is used when a fire exceeds the organizational and/or technical scope of a mission. In other words, "to control a coal fire" does not automatically and in all situations mean "to extinguish a coal fire". Best-practice tactics, or a combination of them, can be selected for control of a particular coal fire. For the extinguishing work, different extinguishing agents are available. They can be applied by different application techniques, with varying operating expenses. One application method may be the drilling of boreholes from the surface, or covering the surface with low-permeability soils. The extinguishing agents mainly used for coal field fires are as follows: water (with or without additives), slurry, foaming mud/slurry, inert gases, dry chemicals and materials, and cryogenic agents. Because of its tremendous dimensions and complexity, the worldwide challenge of coal fires is absolutely unique - it can only be solved with functional application methods, best-fitting strategies and tactics, organisation and research, as well as the dedication of the involved fire fighters, who work under extreme individual risk on the burning coal fields.

  10. Efficient methods and practical guidelines for simulating isotope effects

    NASA Astrophysics Data System (ADS)

    Ceriotti, Michele; Markland, Thomas E.

    2013-01-01

    The shift in chemical equilibria due to isotope substitution is frequently exploited to obtain insight into a wide variety of chemical and physical processes. It is a purely quantum mechanical effect, which can be computed exactly using simulations based on the path integral formalism. Here we discuss how these techniques can be made dramatically more efficient, and how they ultimately outperform quasi-harmonic approximations to treat quantum liquids not only in terms of accuracy, but also in terms of computational cost. To achieve this goal we introduce path integral quantum mechanics estimators based on free energy perturbation, which enable the evaluation of isotope effects using only a single path integral molecular dynamics trajectory of the naturally abundant isotope. We use as an example the calculation of the free energy change associated with H/D and 16O/18O substitutions in liquid water, and of the fractionation of those isotopes between the liquid and the vapor phase. In doing so, we demonstrate and discuss quantitatively the relative benefits of each approach, thereby providing a set of guidelines that should facilitate the choice of the most appropriate method in different, commonly encountered scenarios. The efficiency of the estimators we introduce and the analysis that we perform should in particular facilitate accurate ab initio calculation of isotope effects in condensed phase systems.
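The free energy perturbation idea underlying these estimators can be illustrated with the generic Zwanzig formula, ΔF = -kT ln⟨exp(-ΔU/kT)⟩₀, evaluated from samples of the energy difference. This is a schematic sketch only; the paper's mass-substitution estimators within path integral molecular dynamics are more specialized, and the Gaussian test data here are invented:

```python
import numpy as np

def fep_delta_F(delta_U, kT):
    """Zwanzig free energy perturbation: dF = -kT ln <exp(-dU/kT)>_0."""
    x = -delta_U / kT
    # log-sum-exp trick for numerical stability of the exponential average
    m = x.max()
    return -kT * (m + np.log(np.mean(np.exp(x - m))))

# sanity check: for Gaussian dU with mean mu and variance s^2,
# the exact result is dF = mu - s^2 / (2 kT)
rng = np.random.default_rng(1)
kT, mu, s = 1.0, 2.0, 0.5
dU = rng.normal(mu, s, size=200_000)
print(fep_delta_F(dU, kT))   # ~ mu - s**2/2 = 1.875
```

The paper's key efficiency gain is that such averages can be accumulated from a single trajectory of the naturally abundant isotope, rather than separate simulations for each isotopologue.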

  11. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2012-05-29

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  12. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  13. MODELS AND METHODS IN PRACTICAL BIOLOGY FOR SECONDARY SCHOOLS.

    ERIC Educational Resources Information Center

    BELFIELD, W.

    THIS BOOK WAS WRITTEN TO FUNCTION AS A STUDENT LABORATORY MANUAL OR AS TEACHER RESOURCE MATERIAL FOR DEVELOPING A LABORATORY-CENTERED COURSE IN PRACTICAL BIOLOGY FOR STUDENTS IN THE 13-16 AGE GROUP. IT WAS DESIGNED TO SUPPLY A COMPREHENSIVE SET OF EXPERIMENTS WHICH, WHEN CARRIED OUT IN CONJUNCTION WITH NORMAL THEORETICAL AND ANATOMICAL STUDIES,…

  14. Vibration analysis methods for piping

    NASA Astrophysics Data System (ADS)

    Gibert, R. J.

    1981-09-01

    Attention is given to flow vibrations in pipe flow induced by singularity points in the piping system. The types of pressure fluctuations induced by flow singularities are examined, including the intense wideband fluctuations immediately downstream of the singularity and the acoustic fluctuations encountered in the remainder of the circuit, and a theory of noise generation by unsteady flow in internal acoustics is developed. The response of the piping systems to the pressure fluctuations thus generated is considered, and the calculation of the modal characteristics of piping containing a dense fluid in order to obtain the system transfer function is discussed. The TEDEL program, which calculates the vibratory response of a structure composed of straight and curved pipes with variable mechanical characteristics forming a three-dimensional network by a finite element method, is then presented, and calculations of fluid-structural coupling in tubular networks are illustrated.

  15. Governance of professional nursing practice in a hospital setting: a mixed methods study

    PubMed Central

    dos Santos, José Luís Guedes; Erdmann, Alacoque Lorenzini

    2015-01-01

    Objective: to elaborate an interpretative model for the governance of professional nursing practice in a hospital setting. Method: a mixed methods study with concurrent triangulation strategy, using data from a cross-sectional study with 106 nurses and a Grounded Theory study with 63 participants. The quantitative data were collected through the Brazilian Nursing Work Index - Revised and underwent descriptive statistical analysis. Qualitative data were obtained from interviews and analyzed through initial, selective and focused coding. Results: based on the results obtained with the Brazilian Nursing Work Index - Revised, it is possible to state that nurses perceived that they had autonomy, control over the environment, good relationships with physicians and organizational support for nursing governance. The governance of the professional nursing practice is based on the management of nursing care and services carried out by the nurses. To perform these tasks, nurses aim to get around the constraints of the organizational support and develop management knowledge and skills. Conclusion: it is important to reorganize the structures and processes of nursing governance, especially the support provided by the organization for the management practices of nurses. PMID:26625992

  16. Text analysis methods, text analysis apparatuses, and articles of manufacture

    DOEpatents

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  17. Component outage data analysis methods. Volume 2: Basic statistical methods

    NASA Astrophysics Data System (ADS)

    Marshall, J. A.; Mazumdar, M.; McCutchan, D. A.

    1981-08-01

    Statistical methods for analyzing outage data on major power system components such as generating units, transmission lines, and transformers are identified. The analysis methods produce outage statistics from component failure and repair data that help in understanding the failure causes and failure modes of various types of components. Methods for forecasting outage statistics for those components used in the evaluation of system reliability are emphasized.
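The basic outage statistics such methods start from (mean time to failure, mean time to repair, failure rate, availability) can be sketched as follows; the duration data are hypothetical:

```python
def outage_statistics(up_times, down_times):
    """Basic component outage statistics from failure/repair duration data."""
    mttf = sum(up_times) / len(up_times)      # mean time to failure
    mttr = sum(down_times) / len(down_times)  # mean time to repair
    failure_rate = 1.0 / mttf                 # failures per unit time
    availability = mttf / (mttf + mttr)       # long-run fraction of time in service
    return mttf, mttr, failure_rate, availability

up = [900.0, 1100.0, 1000.0]     # hours in service between failures (hypothetical)
down = [10.0, 30.0, 20.0]        # hours out of service for repair (hypothetical)
print(outage_statistics(up, down))  # (1000.0, 20.0, 0.001, ~0.980)
```

The report goes well beyond these point estimates, covering failure-cause and failure-mode analysis and the forecasting of outage statistics for reliability evaluation.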

  18. Analysis of swimming performance: perceptions and practices of US-based swimming coaches.

    PubMed

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid

    2016-06-01

    In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using these methods at least monthly, with analyses being mainly qualitative in nature rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors such as sources of information on swimming performance and analysis and control over service provision are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies. PMID:26359951

  19. Practical design methods for barrier pillars. Information circular/1995

    SciTech Connect

    Koehler, J.R.; Tadolini, S.C.

    1995-11-01

    Effective barrier pillar design is essential for safe and productive underground coal mining. This U.S. Bureau of Mines report presents an overview of available barrier pillar design methodologies that incorporate sound engineering principles while remaining practical for everyday usage. Nomographs and examples are presented to assist in the determination of proper barrier pillar sizing. Additionally, performance evaluation techniques and criteria are included to assist in determining the effectiveness of selected barrier pillar configurations.

  20. Honesty in critically reflective essays: an analysis of student practice.

    PubMed

    Maloney, Stephen; Tai, Joanna Hong-Meng; Lo, Kristin; Molloy, Elizabeth; Ilic, Dragan

    2013-10-01

In health professional education, reflective practice is seen as a potential means for self-improvement from everyday clinical encounters. This study aims to examine the level of student honesty in critical reflection, and the barriers and facilitators for students engaging in honest reflection. Third-year physiotherapy students, completing summative reflective essays on clinical encounters using the modified Gibbs cycle, were invited to participate in an anonymous online survey. Student knowledge and beliefs about reflective practice, and disclosure of the truthfulness of their reflections, were assessed using a mixed methods approach. A total of 34 students, from a maximum possible of 48 (71%), participated in the study activities. A total of 68% stated that they were at least 80% truthful about their experiences. There was general student consensus that reflective practice was important for their growth as clinicians. Students questioned the belief that the reflection needed to be based on a factual experience. Reflective practice can be a valuable addition to the clinical education of health care professionals, although this value can be diminished through dishonest reflections if it is not carefully implemented. Influences on honest student reflection include: (1) the design of any assessment criteria, and (2) student knowledge and competency in applying critical reflection. PMID:22926807

  1. Practice patterns in FNA technique: A survey analysis

    PubMed Central

    DiMaio, Christopher J; Buscaglia, Jonathan M; Gross, Seth A; Aslanian, Harry R; Goodman, Adam J; Ho, Sammy; Kim, Michelle K; Pais, Shireen; Schnoll-Sussman, Felice; Sethi, Amrita; Siddiqui, Uzma D; Robbins, David H; Adler, Douglas G; Nagula, Satish

    2014-01-01

AIM: To ascertain fine needle aspiration (FNA) techniques by endosonographers with varying levels of experience and environments. METHODS: A survey study was performed on United States based endosonographers. The subjects completed an anonymous online electronic survey. The main outcome measurements were differences in needle choice, FNA technique, and clinical decision making among endosonographers and how this relates to years in practice, volume of EUS-FNA procedures, and practice environment. RESULTS: A total of 210 (30.8%) endosonographers completed the survey. Just over half (51.4%) identified themselves as academic/university-based practitioners. The vast majority of respondents (77.1%) identified themselves as high-volume endoscopic ultrasound (EUS) (> 150 EUS/year) and high-volume FNA (> 75 FNA/year) performers (73.3%). If final cytology is non-diagnostic, high-volume EUS physicians were more likely than low volume physicians to repeat FNA with a core needle (60.5% vs 31.2%; P = 0.0004), and low volume physicians were more likely to refer patients for either surgical or percutaneous biopsy, (33.4% vs 4.9%, P < 0.0001). Academic physicians were more likely to repeat FNA with a core needle (66.7%) compared to community physicians (40.2%, P < 0.001). CONCLUSION: There is significant variation in EUS-FNA practices among United States endosonographers. Differences appear to be related to EUS volume and practice environment. PMID:25324922

  2. An analysis of revenues and expenses in a hospital-based ambulatory pediatric practice.

    PubMed

    Berkelhamer, J E; Rojek, K J

    1988-05-01

    We developed a method of analyzing revenues and expenses in a hospital-based ambulatory pediatric practice. Results of an analysis of the Children's Medical Group (CMG) at the University of Chicago Medical Center demonstrate how changes in collection rates, practice expenses, and hospital underwriting contribute to the financial outcome of the practice. In this analysis, certain programmatic goals of the CMG are achieved at a level of just under 12,000 patient visits per year. At this activity level, pediatric residency program needs are met and income to the CMG physicians is maximized. An ethical problem from the physician's perspective is created by seeking profit maximization. To accomplish this end, the CMG physicians would have to restrict their personal services to only the better-paying patients. This study serves to underscore the importance of hospital-based physicians and hospital administrators structuring fiscal incentives for physicians that mutually meet the institutional goals for the hospital and its physicians. PMID:3358399

  3. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures.

    PubMed

    Rock, Adam J; Coventry, William L; Morgan, Methuen I; Loi, Natasha M

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology. PMID:27014147

  4. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures

    PubMed Central

    Rock, Adam J.; Coventry, William L.; Morgan, Methuen I.; Loi, Natasha M.

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology. PMID:27014147

  5. A Practical Guide to Interpretation of Large Collections of Incident Narratives Using the QUORUM Method

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W.

    1997-01-01

    Analysis of incident reports plays an important role in aviation safety. Typically, a narrative description, written by a participant, is a central part of an incident report. Because there are so many reports, and the narratives contain so much detail, it can be difficult to efficiently and effectively recognize patterns among them. Recognizing and addressing recurring problems, however, is vital to continuing safety in commercial aviation operations. A practical way to interpret large collections of incident narratives is to apply the QUORUM method of text analysis, modeling, and relevance ranking. In this paper, QUORUM text analysis and modeling are surveyed, and QUORUM relevance ranking is described in detail with many examples. The examples are based on several large collections of reports from the Aviation Safety Reporting System (ASRS) database, and a collection of news stories describing the disaster of TWA Flight 800, the Boeing 747 which exploded in mid-air and crashed near Long Island, New York, on July 17, 1996. Reader familiarity with this disaster should make the relevance-ranking examples more understandable. The ASRS examples illustrate the practical application of QUORUM relevance ranking.
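    QUORUM's own modeling is only surveyed in the paper and is not reproduced here; as a generic illustration of relevance-ranking narratives against a query, the following bag-of-words cosine sketch (hypothetical narratives, not ASRS data) shows the basic shape of such a ranker:

    ```python
    import math
    from collections import Counter

    def cosine_relevance(query_terms, narrative):
        # Illustrative only: QUORUM models contextual word relationships,
        # whereas this sketch uses raw term-frequency vectors.
        q = Counter(query_terms)
        d = Counter(narrative.lower().split())
        dot = sum(q[t] * d[t] for t in q)
        norm = (math.sqrt(sum(v * v for v in q.values()))
                * math.sqrt(sum(v * v for v in d.values())))
        return dot / norm if norm else 0.0

    # Hypothetical incident narratives
    reports = [
        "runway incursion during taxi at night",
        "altitude deviation after autopilot disconnect",
    ]
    query = ["runway", "incursion"]
    ranked = sorted(reports, key=lambda r: cosine_relevance(query, r), reverse=True)
    # the runway-incursion narrative ranks first
    ```

    A keyterm-based ranker like QUORUM scores relationships between words in context rather than raw counts, but the score-then-sort loop is the same.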

  6. An evaluation of fracture analysis methods

    NASA Technical Reports Server (NTRS)

    Newman, J. C., Jr.

    1985-01-01

    The results of an experimental and predictive round robin on the application of fracture analysis methods are presented. The objective of the round robin was to verify whether fracture analysis methods currently in use can or cannot predict failure loads on complex structural components containing cracks. Fracture results from tests on a number of compact specimens were used to make the predictions. The accuracy of the prediction methods was evaluated in terms of the variation in the ratio of predicted to experimental failure loads, and the prediction methods are ranked in order of minimum standard error. The range of applicability of the different methods was also considered in assessing their usefulness. For 7075-T651 aluminum alloy, the best methods were: the effective K sub R curve; the critical crack-tip opening displacement (CTOD) criterion using a finite-element analysis; and the K sub R curve with the Dugdale model. For 2024-T351 aluminum alloy, the best methods included: the two-parameter fracture criterion (TPFC); the CTOD parameter using finite-element analysis; the K-curve with the Dugdale model; the deformation plasticity failure assessment diagram (DPFAD); and the effective K sub R curve with a limit load condition. For 304 stainless steel, the best methods were the limit load analysis; the CTOD criterion using finite-element analysis; TPFC; and DPFAD. Some sample experimental results are given in an appendix.
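    The ranking criterion described, variation in the ratio of predicted to experimental failure loads summarized by a standard error, reduces to a short computation; a sketch with hypothetical loads (not the round-robin data):

    ```python
    import statistics

    # Hypothetical failure loads (kN) for one prediction method on five specimens.
    predicted    = [102.0, 96.5, 110.2, 88.0, 99.3]
    experimental = [100.0, 95.0, 112.0, 90.0, 98.0]

    ratios = [p / e for p, e in zip(predicted, experimental)]
    mean_ratio = statistics.mean(ratios)   # near 1.0 means unbiased predictions
    spread = statistics.stdev(ratios)      # smaller spread ranks the method higher
    ```

    Methods would then be sorted by `spread`, the smallest standard error ranking best.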

  7. Overhead analysis in a surgical practice: a brief communication.

    PubMed

    Frezza, Eldo E

    2006-08-01

    Evaluating overhead is an essential part of any business, including that of the surgeon. By examining each component of overhead, the surgeon will have a better grasp of the profitability of his or her practice. The overhead discussed in this article includes health insurance, overtime, supply costs, rent, advertising and marketing, telephone costs, and malpractice insurance. While the importance of evaluating and controlling overhead in a business is well understood, few know that overhead increases do not always imply increased expenses. National standards have been provided by the Medical Group Management Association. One method of evaluating overhead is to calculate the amount spent in terms of percent of net revenue. Net revenue includes income from patients, from interest, and from insurers less refunds. Another way for surgeons to evaluate their practice is to calculate income and expenses for two years, then calculate the variance between the two years and the percentage of variance to see where they stand. PMID:16968190
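    Both evaluation approaches described here are simple arithmetic; a minimal sketch (all figures hypothetical, not from the article):

    ```python
    def overhead_ratio(overhead_expenses, net_revenue):
        """Overhead expressed as a percentage of net revenue."""
        return 100.0 * sum(overhead_expenses.values()) / net_revenue

    def year_over_year_variance(prior, current):
        """Dollar variance and percent variance between two years of expenses."""
        variance = current - prior
        return variance, 100.0 * variance / prior

    # Hypothetical practice figures
    expenses = {"rent": 48_000, "supplies": 31_000, "malpractice": 22_000}
    pct_of_revenue = overhead_ratio(expenses, net_revenue=250_000)  # 40.4 %
    dollar_var, pct_var = year_over_year_variance(prior=95_000, current=101_000)
    ```

    The percentage of net revenue can then be compared against the national benchmarks the article cites.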

  8. Schwarz Preconditioners for Krylov Methods: Theory and Practice

    SciTech Connect

    Szyld, Daniel B.

    2013-05-10

    Several numerical methods were produced and analyzed. The main thrust of the work relates to inexact Krylov subspace methods for the solution of linear systems of equations arising from the discretization of partial differential equations. These are iterative methods, i.e., an approximation is obtained and refined at each step. Usually, a matrix-vector product is needed at each iteration. In the inexact methods, this product (or the application of a preconditioner) can be done inexactly. Schwarz methods, based on domain decompositions, are excellent preconditioners for these systems. We contributed towards their understanding from an algebraic point of view, developed new ones, and studied their performance in the inexact setting. We also worked on combinatorial problems to help define the algebraic partition of the domains, with the needed overlap, as well as on PDE-constrained optimization using the above-mentioned inexact Krylov subspace methods.

  9. Analysis of flight equipment purchasing practices of representative air carriers

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The process through which representative air carriers decide whether or not to purchase flight equipment was investigated, as well as their practices and policies for retiring surplus aircraft. An analysis of the flight equipment investment decision process in ten airlines shows that, for the airline industry as a whole, the flight equipment investment decision is in a state of transition, from a wholly informal process in its earliest years to a much more organized and structured process in the future. Individual air carriers are in different stages with respect to the formality and sophistication associated with the flight equipment investment decision.

  10. Benefit of formal analysis in the sale of a dental practice.

    PubMed

    Fos, P J

    1990-01-01

    Although many practice administration decisions involve everyday operations, some, such as the sale of a dental practice, are unique. This paper presents discussion of the complexities inherent in this problem and a method with which to approach it. Concepts of decision analysis are employed to aid in the selection of a selling price, as well as accounting for the uncertainty involved in evaluating prospective buyers. A decision-tree diagram is used to fully represent the decision problem and to illustrate possible decision options. PMID:2084219

  11. Finite-key analysis of a practical decoy-state high-dimensional quantum key distribution

    NASA Astrophysics Data System (ADS)

    Bao, Haize; Bao, Wansu; Wang, Yang; Zhou, Chun; Chen, Ruike

    2016-05-01

    Compared with two-level quantum key distribution (QKD), high-dimensional QKD enables two distant parties to share a secret key at a higher rate. We provide a finite-key security analysis for the recently proposed practical high-dimensional decoy-state QKD protocol based on time-energy entanglement. We employ two methods to estimate the statistical fluctuation of the postselection probability and give a tighter bound on the secure-key capacity. By numerical evaluation, we show the finite-key effect on the secure-key capacity in different conditions. Moreover, our approach could be used to optimize parameters in practical implementations of high-dimensional QKD.

  12. Investigating the Efficacy of Practical Skill Teaching: A Pilot-Study Comparing Three Educational Methods

    ERIC Educational Resources Information Center

    Maloney, Stephen; Storr, Michael; Paynter, Sophie; Morgan, Prue; Ilic, Dragan

    2013-01-01

    Effective education of practical skills can alter clinician behaviour, positively influence patient outcomes, and reduce the risk of patient harm. This study compares the efficacy of two innovative practical skill teaching methods, against a traditional teaching method. Year three pre-clinical physiotherapy students consented to participate in a…

  13. Assessing Student Perception of Practice Evaluation Knowledge in Introductory Research Methods

    ERIC Educational Resources Information Center

    Baker, Lisa R.; Pollio, David E.; Hudson, Ashley

    2011-01-01

    The authors explored the use of the Practice Evaluation Knowledge Scale (PEKS) to assess student perception of acquisition and retention of practice evaluation knowledge from an undergraduate research methods class. The authors sampled 2 semesters of undergraduate social work students enrolled in an introductory research methods course.

  15. Learning Practice-Based Research Methods: Capturing the Experiences of MSW Students

    ERIC Educational Resources Information Center

    Natland, Sidsel; Weissinger, Erika; Graaf, Genevieve; Carnochan, Sarah

    2016-01-01

    The literature on teaching research methods to social work students identifies many challenges, such as dealing with the tensions related to producing research relevant to practice, access to data to teach practice-based research, and limited student interest in learning research methods. This is an exploratory study of the learning experiences of…

  16. Strength-based Supervision: Frameworks, Current Practice, and Future Directions A Wu-wei Method.

    ERIC Educational Resources Information Center

    Edwards, Jeffrey K.; Chen, Mei-Whei

    1999-01-01

    Discusses a method of counseling supervision similar to the wu-wei practice in Zen and Taoism. Suggests that this strength-based method and an understanding of isomorphy in supervisory relationships are the preferred practice for the supervision of family counselors. States that this model of supervision potentiates the person-of-the-counselor.…

  17. Interprofessional dietary assessment practices in primary care: A mixed-methods study.

    PubMed

    Bonilla, Carolina; Brauer, Paula; Royall, Dawna; Keller, Heather; Hanning, Rhona M; DiCenso, Alba

    2016-01-01

    Patients in primary care (PC) are often counselled on diet, and assessment of current food intake is a necessary prerequisite for individualized nutrition care. This sequential mixed-methods study explored current diet assessment (DA) practices in team-based PC in Ontario, Canada, with interdisciplinary focus groups (FGs) followed by a web-based survey. Eleven FGs (n = 50) discussed key patient groups and health conditions requiring DA, as well as facilitators and barriers to accurate DA. Interpretative analysis revealed three themes: DA as a common activity that differed by health profession, communication of DA results within the team, and nutrition care as a collaborative team activity. A total of 191 providers from 73 Family Health Teams completed the web-based survey, and confirmed that many providers are frequently doing DA and that methods vary by discipline. Most providers conducted DAs every day or almost every day. As expected, dietitians used more formal and detailed methods to assess diet than other disciplines, who were more likely to ask a few pointed questions. These baseline data provide information on the range of current DA practices in team-based PC that can inform development of new, more accurate approaches that may improve counselling effectiveness. PMID:26789793

  18. Grounded Theory in Practice: Is It Inherently a Mixed Method?

    ERIC Educational Resources Information Center

    Johnson, R. B.; McGowan, M. W.; Turner, L. A.

    2010-01-01

    We address 2 key points of contention in this article. First, we engage the debate concerning whether particular methods are necessarily linked to particular research paradigms. Second, we briefly describe a mixed methods version of grounded theory (MM-GT). Grounded theory can be tailored to work well in any of the 3 major forms of mixed methods…

  19. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    DOEpatents

    Sanfilippo, Antonio P.; Cowell, Andrew J.; Gregory, Michelle L.; Baddeley, Robert L.; Paulson, Patrick R.; Tratz, Stephen C.; Hohimer, Ryan E.

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
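    The patented steps, associating weighted supporting or refuting evidence with a hypothesis and using the weights to report on its accuracy, might be sketched generically as follows (the scoring rule, weights, and names are illustrative assumptions, not the patent's):

    ```python
    def hypothesis_score(evidence):
        # Each item is (weight, direction): direction is +1 if the indicator
        # supports the hypothesis, -1 if it refutes it. Returns a score in [-1, 1].
        total_weight = sum(w for w, _ in evidence)
        return sum(w * d for w, d in evidence) / total_weight if total_weight else 0.0

    # Hypothetical evidence: two supporting associations, one refuting.
    evidence = [(0.9, +1), (0.6, +1), (0.3, -1)]
    score = hypothesis_score(evidence)  # (0.9 + 0.6 - 0.3) / 1.8, about 0.67
    ```

    A score near +1 would indicate strongly supported, near -1 strongly refuted, with the weights expressing confidence in each evidence association.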

  20. An Empirical Comparison of Variable Standardization Methods in Cluster Analysis.

    PubMed

    Schaffer, C M; Green, P E

    1996-04-01

    It is common practice in marketing research to standardize the columns (to mean zero and unit standard deviation) of a persons by variables data matrix, prior to clustering the entities corresponding to the rows of that matrix. This practice is often followed even when the columns are all expressed in similar units, such as ratings on a 7-point, equal interval scale. This study examines six different ways of standardizing matrix columns and compares them with the null case of no column standardization. The analysis is replicated for ten large-scale data sets, comprising derived importances of conjoint-based attributes. Our findings indicate that the prevailing column standardization practice may be problematic for some kinds of data that marketing researchers use for segmentation. However, we also find that in the background data profiling step, results are reasonably robust to column standardization method. PMID:26801454
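    The prevailing practice the study examines, scaling each column of the persons-by-variables matrix to mean zero and unit standard deviation before clustering, is the familiar z-score transform; a minimal sketch:

    ```python
    import statistics

    def standardize_columns(matrix):
        """Scale each column to mean 0 and unit (sample) standard deviation."""
        columns = list(zip(*matrix))
        scaled = []
        for col in columns:
            mu, sd = statistics.mean(col), statistics.stdev(col)
            scaled.append([(x - mu) / sd for x in col])
        # Transpose back to rows (persons).
        return [list(row) for row in zip(*scaled)]

    # Tiny persons-by-variables example: two rating columns on different scales.
    data = [[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]]
    z = standardize_columns(data)  # each column now has mean 0, stdev 1
    ```

    The study's other five standardization variants (e.g., range or maximum scaling) replace the `(x - mu) / sd` line with a different divisor.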

  1. A report on the CCNA 2007 professional practice analysis.

    PubMed

    Muckle, Timothy J; Apatov, Nathaniel M; Plaus, Karen

    2009-06-01

    The purpose of this column is to present the results of the 2007 Professional Practice Analysis (PPA) of the field of nurse anesthesia, conducted by the Council on Certification of Nurse Anesthetists. The PPA used survey and rating scale methodologies to collect data regarding the relative emphasis of various aspects of the nurse anesthesia knowledge domain and competencies. A total of 3,805 survey responses were analyzed using the Rasch rating scale model, which aggregates and transforms ordinal (rating scale) responses into linear measures of relative importance and frequency. Summaries of respondent demographics and educational and professional background are provided, as well as descriptions of how the survey results are used to develop test specifications. The results of this analysis provide evidence for the content outline and test specifications (content percentages) and thus serve as a basis of content validation for the National Certification Examination. PMID:19645167

  2. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  3. Aural Image in Practice: A Multicase Analysis of Instrumental Practice in Middle School Learners

    ERIC Educational Resources Information Center

    Oare, Steve

    2016-01-01

    This multiple case study examined six adolescent band students engaged in self-directed practice. The students' practice sessions were videotaped. Students provided verbal reports during their practice and again retrospectively while reviewing their video immediately after practice. Students were asked to discuss their choice of practice…

  4. Testing the quasi-absolute method in photon activation analysis

    NASA Astrophysics Data System (ADS)

    Sun, Z. J.; Wells, D.; Starovoitova, V.; Segebade, C.

    2013-04-01

    In photon activation analysis (PAA), relative methods are widely used because of their accuracy and precision. Absolute methods, which are conducted without any assistance from calibration materials, are seldom applied because of the difficulty of obtaining the photon flux in measurements. This research attempts a new absolute approach in PAA - the quasi-absolute method - by retrieving the photon flux in the sample through Monte Carlo simulation. With the simulated photon flux and a database of experimental cross sections, it is possible to calculate the concentration of target elements in the sample directly. The QA/QC procedures that solidify the research are discussed in detail. Our results show that the accuracy of the method for certain elements is close to a useful level in practice. Furthermore, future results from the quasi-absolute method can also serve as a validation technique for experimental cross-section data. The quasi-absolute method looks promising.
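    The final step described, computing elemental concentration directly from the simulated flux and tabulated cross sections, follows the standard activation equation A = N·σ·φ·(1 − e^(−λ·t_irr)); a simplified sketch that ignores the bremsstrahlung energy spectrum (all symbols and values illustrative, not from the paper):

    ```python
    import math

    def element_concentration(activity, flux, cross_section, decay_const, t_irr, atoms_total):
        # Solve the activation equation for N (number of target atoms), then
        # express it as an atom fraction of the whole sample. A real PAA
        # calculation integrates cross section over the photon energy spectrum.
        saturation = 1.0 - math.exp(-decay_const * t_irr)
        n_target = activity / (cross_section * flux * saturation)
        return n_target / atoms_total

    # Hypothetical irradiation: A = 1e3 Bq, phi = 1e12 /cm^2/s, sigma = 1e-27 cm^2,
    # lambda = 1e-4 /s, t_irr = 1e4 s, sample of 1e22 atoms.
    frac = element_concentration(1.0e3, 1.0e12, 1.0e-27, 1.0e-4, 1.0e4, 1.0e22)
    ```

    With a simulated flux in place of a measured one, no calibration material is needed, which is the essence of the quasi-absolute approach.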

  5. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    PubMed Central

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or “one-shot,” in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

  7. Two MIS Analysis Methods: An Experimental Comparison.

    ERIC Educational Resources Information Center

    Wang, Shouhong

    1996-01-01

    In China, 24 undergraduate business students applied data flow diagrams (DFD) to a mini-case, and 20 used object-oriented analysis (OOA). DFD seemed easier to learn, but after training, those using the OOA method for systems analysis made fewer errors. (SK)

  8. Theory, Method and Practice of Neuroscientific Findings in Science Education

    ERIC Educational Resources Information Center

    Liu, Chia-Ju; Chiang, Wen-Wei

    2014-01-01

    This report provides an overview of neuroscience research that is applicable for science educators. It first offers a brief analysis of empirical studies in educational neuroscience literature, followed by six science concept learning constructs based on the whole brain theory: gaining an understanding of brain function; pattern recognition and…

  10. Articulating current service development practices: a qualitative analysis of eleven mental health projects

    PubMed Central

    2014-01-01

    Background: The utilisation of good design practices in the development of complex health services is essential to improving quality. Healthcare organisations, however, are often seriously out of step with modern design thinking and practice. As a starting point to encourage the uptake of good design practices, it is important to understand the context of their intended use. This study aims to do that by articulating current health service development practices. Methods: Eleven service development projects carried out in a large mental health service were investigated through in-depth interviews with six operation managers. The critical decision method, in conjunction with diagrammatic elicitation, was used to capture descriptions of these projects. Stage-gate design models were then formed to visually articulate, classify and characterise different service development practices. Results: Projects were grouped into three categories according to design process patterns: new service introduction and service integration; service improvement; service closure. Three common design stages (problem exploration, idea generation and solution evaluation) were then compared across the design process patterns. Consistent across projects were a top-down, policy-driven approach to exploration, underexploited idea generation and implementation-based evaluation. Conclusions: This study provides insight into where and how good design practices can contribute to the improvement of current service development practices. Specifically, the following suggestions for future service development practices are made: genuine user needs analysis for exploration; divergent thinking and innovative culture for idea generation; and fail-safe evaluation prior to implementation. Better training for managers through partnership working with design experts and researchers could be beneficial. PMID:24438471

  11. Evaluating participatory decision processes: which methods inform reflective practice?

    PubMed

    Kaufman, Sanda; Ozawa, Connie P; Shmueli, Deborah F

    2014-02-01

    Evaluating participatory decision processes serves two key purposes: validating the usefulness of specific interventions for stakeholders, interveners and funders of conflict management processes, and improving practice. However, evaluation design remains challenging, partly because when attempting to serve both purposes we may end up serving neither well. In fact, the better we respond to one, the less we may satisfy the other. Evaluations tend to focus on endogenous factors (e.g., stakeholder selection, BATNAs, mutually beneficial tradeoffs, quality of the intervention, etc.), because we believe that the success of participatory decision processes hinges on them, and they also seem to lend themselves to caeteris paribus statistical comparisons across cases. We argue that context matters too and that contextual differences among specific cases are meaningful enough to undermine conclusions derived solely from comparisons of process-endogenous factors implicitly rooted in the caeteris paribus assumption. We illustrate this argument with an environmental mediation case. We compare data collected about it through surveys geared toward comparability across cases to information elicited through in-depth interviews geared toward case specifics. The surveys, designed by the U.S. Institute of Environmental Conflict Resolution, feed a database of environmental conflicts that can help make the (statistical) case for intervention in environmental conflict management. Our interviews elicit case details - including context - that enable interveners to link context specifics and intervention actions to outcomes. We argue that neither approach can "serve both masters." PMID:24121657

  12. Triangle area water supply monitoring project, October 1988 through September 2001, North Carolina -- description of the water-quality network, sampling and analysis methods, and quality-assurance practices

    USGS Publications Warehouse

    Oblinger, Carolyn J.

    2004-01-01

    The Triangle Area Water Supply Monitoring Project was initiated in October 1988 to provide long-term water-quality data for six area water-supply reservoirs and their tributaries. In addition, the project provides data that can be used to determine the effectiveness of large-scale changes in water-resource management practices, document differences in water quality among water-supply types (large multiuse reservoir, small reservoir, run-of-river), and tributary-loading and in-lake data for water-quality modeling of Falls and Jordan Lakes. By September 2001, the project had progressed in four phases and included as many as 34 sites (in 1991). Most sites were sampled and analyzed by the U.S. Geological Survey. Some sites were already a part of the North Carolina Division of Water Quality statewide ambient water-quality monitoring network and were sampled by the Division of Water Quality. The network has provided data on streamflow, physical properties, and concentrations of nutrients, major ions, metals, trace elements, chlorophyll, total organic carbon, suspended sediment, and selected synthetic organic compounds. Project quality-assurance activities include written procedures for sample collection, record management and archive, collection of field quality-control samples (blank samples and replicate samples), and monitoring the quality of field supplies. In addition to project quality-assurance activities, the quality of laboratory analyses was assessed through laboratory quality-assurance practices and an independent laboratory quality-control assessment provided by the U.S. Geological Survey Branch of Quality Systems through the Blind Inorganic Sample Project and the Organic Blind Sample Project.

  13. An Effective and Practical Method for Solving Hydro-Thermal Unit Commitment Problems Based on Lagrangian Relaxation Method

    NASA Astrophysics Data System (ADS)

    Sakurai, Takayoshi; Kusano, Takashi; Saito, Yutaka; Hirato, Kota; Kato, Masakazu; Murai, Masahiko; Nagata, Junichi

    This paper presents an effective and practical method, based on the Lagrangian relaxation method (LRM), for solving the hydro-thermal unit commitment problem in which operational constraints include spinning reserve requirements for thermal units and the prohibition of simultaneous unit start-up/shut-down at the same plant. These constraints are processed in each iteration step of the LRM, which enables a direct solution. To improve convergence, the method applies an augmented Lagrangian relaxation method. Its effectiveness is demonstrated for a real power system.
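    The Lagrangian relaxation core can be illustrated on a toy dispatch problem: relax the demand-coupling constraint, let each unit optimize independently against the multiplier, and update the multiplier by a subgradient step. This generic sketch (not the authors' formulation; all numbers hypothetical) also shows the oscillation that motivates augmented Lagrangian variants:

    ```python
    def solve_unit(cost, cap, lam):
        # Given multiplier lam (a price), each unit independently minimizes
        # (cost - lam) * p over p in {0, cap}: run at capacity only if profitable.
        return cap if cost - lam < 0 else 0.0

    def lagrangian_relaxation(costs, caps, demand, steps=200, alpha=0.05):
        lam = 0.0
        for _ in range(steps):
            output = sum(solve_unit(c, k, lam) for c, k in zip(costs, caps))
            # Subgradient update on the relaxed demand constraint.
            lam = max(0.0, lam + alpha * (demand - output))
        return lam

    # Two units: the multiplier settles near the marginal unit's cost (20 here)
    # but keeps oscillating, which is why convergence aids such as augmented
    # Lagrangians are applied in practice.
    lam = lagrangian_relaxation(costs=[10.0, 20.0], caps=[60.0, 60.0], demand=80.0)
    ```

    A real hydro-thermal formulation adds time-coupled constraints (reserves, start-up rules) handled inside each unit subproblem, but the decompose-and-update loop is the same.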

  14. The AB Initio Mia Method: Theoretical Development and Practical Applications

    NASA Astrophysics Data System (ADS)

    Peeters, Anik

    The bottleneck in conventional ab initio Hartree-Fock calculations is the storage of the electron repulsion integrals, because their number increases with the fourth power of the number of basis functions. This problem can be solved by a combination of the multiplicative integral approximation (MIA) and the direct SCF method. The MIA approach was successfully applied in the geometry optimisation of some biologically interesting compounds, such as the neuroleptic Haloperidol and two TIBO derivatives, inactivators of HIV-1. In this thesis, the power of the MIA method is demonstrated by applying it to the calculation of the forces on the nuclei. In addition, the MIA method enabled the development of a new model for performing crystal-field studies: the supermolecule model. The results for this model are in better agreement with experimental data than those for the point-charge model. This is illustrated by the study of some small molecules in the solid state: 2,3-diketopiperazine, formamide oxime, and two polymorphic forms of glycine, alpha-glycine and beta-glycine.

  15. Methods in Educational Research: From Theory to Practice

    ERIC Educational Resources Information Center

    Lodico, Marguerite G.; Spaulding Dean T.; Voegtle, Katherine H.

    2006-01-01

    Written for students, educators, and researchers, "Methods in Educational Research" offers a refreshing introduction to the principles of educational research. Designed for the real world of educational research, the book's approach focuses on the types of problems likely to be encountered in professional experiences. Reflecting the importance of…

  16. Preparing Special Education Teacher Candidates: Extending Case Method to Practice

    ERIC Educational Resources Information Center

    Lengyel, Linda; Vernon-Dotson, Lisa

    2010-01-01

    Case methodology is receiving more recognition in the field of education as a viable pedagogy for use in the preparation of future educators. In this article, the coauthors explore two examples of case method instruction that extend beyond university classrooms to field sites: case report and case study. Both examples were used in special…

  17. Practical method of diffusion-welding steel plate in air

    NASA Technical Reports Server (NTRS)

    Holko, K. H.; Moore, T. J.

    1971-01-01

    Method is ideal for critical service requirements where parent metal properties are equaled in notch toughness, stress rupture and other characteristics. Welding technique variations may be used on a variety of materials, such as carbon steels, alloy steels, stainless steels, ceramics, and reactive and refractory materials.

  18. Practice and Progression in Second Language Research Methods

    ERIC Educational Resources Information Center

    Mackey, Alison

    2014-01-01

    Since its inception, the field of second language research has utilized methods from a number of areas, including general linguistics, psychology, education, sociology, anthropology and, recently, neuroscience and corpus linguistics. As the questions and objectives expand, researchers are increasingly pushing methodological boundaries to gain a…

  20. Imaging Laser Analysis of Building MATERIALS—PRACTICAL Examples

    NASA Astrophysics Data System (ADS)

    Wilsch, G.; Schaurich, D.; Wiggenhauser, H.

    2011-06-01

    Laser-induced breakdown spectroscopy (LIBS) is a supplement to and extension of standard chemical methods and SEM or micro-RFA applications for the evaluation of building materials. As a laboratory method, LIBS is used to produce color-coded images representing composition, the distribution of characteristic ions, and/or the ingress of damaging substances. To create a depth profile of element concentration, a core has to be taken and split along the core axis. LIBS has been proven able to detect all elements important in concrete, e.g. chlorine, sodium, or sulfur, which are responsible for certain degradation mechanisms, as well as light elements such as lithium or hydrogen. Practical examples are given and a mobile system for on-site measurements is presented.

  2. Methods for Analysis of Outdoor Performance Data (Presentation)

    SciTech Connect

    Jordan, D.

    2011-02-01

    The ability to accurately predict power delivery over time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted into power and how this conversion efficiency evolves over time. Accurate knowledge of power decline over time, also known as the degradation rate, is essential and important to all stakeholders: utility companies, integrators, investors, and scientists alike. Different methods to determine degradation rates, and discrete versus continuous data, are presented, and some general best-practice methods are outlined. In addition, historical degradation rates and some preliminary analysis with respect to climate are given.
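
One common approach is a simple linear trend fit of normalized performance against time, whose slope gives the degradation rate in percent per year (the presentation compares several methods; this sketch shows only the linear one, on synthetic data):

```python
import numpy as np

# Synthetic monthly performance ratios over 5 years with a built-in
# degradation of -0.8 %/year (illustrative data, not from the presentation).
years = np.arange(60) / 12.0
rng = np.random.default_rng(0)
perf = 1.0 - 0.008 * years + rng.normal(0.0, 0.002, size=60)

# Ordinary least-squares trend; the slope normalized by the intercept
# is the degradation rate in percent per year.
slope, intercept = np.polyfit(years, perf, 1)
rate_pct_per_year = 100.0 * slope / intercept
print(f"estimated degradation rate: {rate_pct_per_year:.2f} %/year")
```

With five years of low-noise monthly data the fitted rate lands close to the built-in -0.8 %/year; shorter records or seasonal noise widen the uncertainty considerably.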

  3. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2002-01-01

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
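
The augmentation idea the patent describes can be sketched in a few lines of linear algebra. This is a minimal illustration with synthetic Gaussian spectra and plain least squares, not the patented algorithm itself:

```python
import numpy as np

wav = np.linspace(0.0, 1.0, 200)

def gauss(center, width):
    # Synthetic spectral shape (Gaussian band) on the wavelength grid.
    return np.exp(-((wav - center) / width) ** 2)

K = np.vstack([gauss(0.3, 0.05), gauss(0.6, 0.05)])  # calibrated pure-component spectra
drift = gauss(0.8, 0.15)                             # shape absent from calibration

true_conc = np.array([0.7, 0.2])
sample = true_conc @ K + 0.5 * drift                 # measured mixture spectrum

# Classical least squares with only the calibrated shapes: biased estimates,
# because the un-modeled drift overlaps the second component.
biased, *_ = np.linalg.lstsq(K.T, sample, rcond=None)

# Hybrid step: append the extra spectral shape, then re-fit.
K_aug = np.vstack([K, drift])
est, *_ = np.linalg.lstsq(K_aug.T, sample, rcond=None)

print("biased:", np.round(biased, 3))
print("hybrid:", np.round(est, 3))
```

With the drift shape appended, the fit recovers the original concentrations (and the drift amplitude) essentially exactly, since the synthetic sample lies in the span of the augmented shapes.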

  4. Engaging Direct Care Providers in Improving Infection Prevention and Control Practices Using Participatory Visual Methods.

    PubMed

    Backman, Chantal; Bruce, Natalie; Marck, Patricia; Vanderloo, Saskia

    2016-01-01

    The purpose of this quality improvement project was to determine the feasibility of using provider-led participatory visual methods to scrutinize 4 hospital units' infection prevention and control practices. Methods included provider-led photo walkabouts, photo elicitation sessions, and postimprovement photo walkabouts. Nurses readily engaged in using the methods to examine and improve their units' practices and reorganize their work environment. PMID:26681499

  5. Practical flight test method for determining reciprocating engine cooling requirements

    NASA Technical Reports Server (NTRS)

    Ward, D. T.; Miley, S. J.

    1984-01-01

    It is pointed out that efficient and effective cooling of air-cooled reciprocating aircraft engines is a continuing problem for the general aviation industry. Miley et al. (1981) have reported results of a study regarding the controlling variables for cooling and installation aerodynamics. The present investigation is concerned with experimental methods which were developed to determine cooling requirements of an instrumented prototype or production aircraft, taking into account a flight test procedure which has been refined and further verified with additional testing. It is shown that this test procedure represents a straightforward means of determining cooling requirements with minimal instrumentation. Attention is given to some background information, the development history of the NACA cooling correlation method, and the proposed modification of the NACA cooling correlation.

  6. Theories, methods, and practice on the National Atlases of China

    NASA Astrophysics Data System (ADS)

    Qi, Qingwen

    2007-06-01

    The history of national atlas editing worldwide is summarized first, followed by China's achievements in editing the 1st and 2nd editions of The National Atlases of China (NAC), which reflected, at multiple levels, China's development in science and technology, society and economy, resources and environment, etc. from the 1950s to the 1980s. From the previous editions of the NAC, systematic theories and methods are summarized and concluded, including comprehensive and statistical mapping theory, design principles for electronic atlases, and the new methods and technologies involved in the NAC. The New Century Edition of the NAC is then designed, including its orientation, technological system, volume arrangement, and the key scientific and technological problems to be resolved.

  7. Professional Suitability for Social Work Practice: A Factor Analysis

    ERIC Educational Resources Information Center

    Tam, Dora M. Y.; Coleman, Heather; Boey, Kam-Wing

    2012-01-01

    Objective: The purpose of this study was to identify the underlying dimensions of professional suitability. Method: Data were collected from a province-wide mail-out questionnaire surveying 341 participants from a random sample of registered social workers. Results: The use of an exploratory factor analysis identified a 5-factor solution on…

  9. Dynamic mechanical analysis: A practical introduction to techniques and applications

    SciTech Connect

    Menard, K.

    1999-01-01

    This book introduces DMA, its history, and its current position as part of thermal analysis on polymers. It discusses major types of instrumentation, including oscillatory rotational, oscillatory axial, and torsional pendulum. It also describes analytical techniques in terms of utility, quality of data, methods of calibration, and suitability for different types of materials and assesses applications for thermoplastics, thermosetting systems, and thermosets.

  10. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and 4. providing one computer tool and/or one set of solutions for all users, for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty.
    Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0.
    The verification, validation, and round-robin processes required of a computer tool closely parallel the methods used to ensure the validity of the equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  11. Method of analysis and quality-assurance practices by the U.S. Geological Survey Organic Geochemistry Research Group; determination of geosmin and methylisoborneol in water using solid-phase microextraction and gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Zimmerman, L.R.; Ziegler, A.C.; Thurman, E.M.

    2002-01-01

    A method for the determination of two common odor-causing compounds in water, geosmin and 2-methylisoborneol, was modified and verified by the U.S. Geological Survey's Organic Geochemistry Research Group in Lawrence, Kansas. The optimized method involves the extraction of odor-causing compounds from filtered water samples using a divinylbenzene-carboxen-polydimethylsiloxane cross-link coated solid-phase microextraction (SPME) fiber. Detection of the compounds is accomplished using capillary-column gas chromatography/mass spectrometry (GC/MS). Precision and accuracy were demonstrated using reagent-water, surface-water, and ground-water samples. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 35 nanograms per liter ranged from 60 to 123 percent for geosmin and from 90 to 96 percent for 2-methylisoborneol. Method detection limits were 1.9 nanograms per liter for geosmin and 2.0 nanograms per liter for 2-methylisoborneol in 45-milliliter samples. Typically, concentrations of 30 and 10 nanograms per liter of geosmin and 2-methylisoborneol, respectively, can be detected by the general public. The calibration range for the method is equivalent to concentrations from 5 to 100 nanograms per liter without dilution. The method is valuable for acquiring information about the production and fate of these odor-causing compounds in water.
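
The reported accuracies can be reproduced in form with a one-line recovery calculation; the replicate values below are hypothetical, not the study's data:

```python
def mean_recovery_pct(measured, spiked_at):
    """Mean measured concentration as a percentage of the spiked (true) value."""
    return 100.0 * sum(measured) / (len(measured) * spiked_at)

# Hypothetical replicate results (ng/L) for a 10 ng/L geosmin spike:
replicates = [8.9, 9.4, 10.2, 9.1]
print(round(mean_recovery_pct(replicates, 10.0), 1))  # 94.0
```

A mean recovery of 94% for a 10 ng/L spike would fall inside the 60-123% range the study reports for geosmin.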

  12. Updating a Meta-Analysis of Intervention Research with Challenging Behaviour: Treatment Validity and Standards of Practice

    ERIC Educational Resources Information Center

    Harvey, Shane T.; Boer, Diana; Meyer, Luanna H.; Evans, Ian M.

    2009-01-01

    Background: This meta-analysis of interventions with challenging behaviour in children with disabilities updates a comprehensive meta-analysis that previously addressed reported standards of practice and effectiveness of different strategies. Method: Four effect-size algorithms were calculated for published intervention cases, and results analysed…

  13. Bioanalytical methods for food contaminant analysis.

    PubMed

    Van Emon, Jeanette M

    2010-01-01

    Foods are complex mixtures of lipids, carbohydrates, proteins, vitamins, organic compounds, and other naturally occurring substances. Sometimes added to this mixture are residues of pesticides, veterinary and human drugs, microbial toxins, preservatives, contaminants from food processing and packaging, and other residues. This milieu of compounds can pose difficulties in the analysis of food contaminants. There is an expanding need for rapid and cost-effective residue methods for difficult food matrixes to safeguard our food supply. Bioanalytical methods are established for many food contaminants such as mycotoxins and are the method of choice for many food allergens. Bioanalytical methods are often more cost-effective and sensitive than instrumental procedures. Recent developments in bioanalytical methods may provide more applications for their use in food analysis. PMID:21313795

  14. Method and apparatus for ceramic analysis

    DOEpatents

    Jankowiak, Ryszard J.; Schilling, Chris; Small, Gerald J.; Tomasik, Piotr

    2003-04-01

    The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method provides analyzing density of a ceramic comprising exciting a component on a surface/subsurface of the ceramic by exposing the material to excitation energy. The method may further include the step of obtaining a measurement of an emitted energy from the component. The method may additionally include comparing the measurement of the emitted energy from the component with a predetermined reference measurement so as to obtain a density for said ceramic.

  15. Practical method for diffusion welding of steel plate in air.

    NASA Technical Reports Server (NTRS)

    Moore, T. J.; Holko, K. H.

    1972-01-01

    Description of a simple and easily applied method of diffusion welding steel plate in air which does not require a vacuum furnace or hot press. The novel feature of the proposed welding method is that diffusion welds are made in air with deadweight loading. In addition, the use of an autogenous (self-generated) surface-cleaning principle (termed 'auto-vac cleaning') to reduce the effects of surface oxides that normally hinder diffusion welding is examined. A series of nine butt joints were diffusion welded in thick sections of AISI 1020 steel plate. Diffusion welds were attempted at three welding temperatures (1200, 1090, and 980 C) using a deadweight pressure of 34,500 N/sq m (5 psi) and a two-hour hold time at temperature. Auto-vac cleaning operations prior to welding were also studied for the same three temperatures. Results indicate that sound welds were produced at the two higher temperatures when the joints were previously fusion seal welded completely around the periphery. Also, auto-vac cleaning at 1200 C for 2-1/2 hours prior to diffusion welding was highly beneficial, particularly when subsequent welding was accomplished at 1090 C.

  16. Practical methods for competing risks data: a review.

    PubMed

    Bakoyannis, Giorgos; Touloumi, Giota

    2012-06-01

    Competing risks data arise naturally in medical research, when subjects under study are at risk of more than one mutually exclusive event such as death from different causes. The competing risks framework also includes settings where different possible events are not mutually exclusive but the interest lies on the first occurring event. For example, in HIV studies where seropositive subjects are receiving highly active antiretroviral therapy (HAART), treatment interruption and switching to a new HAART regimen act as competing risks for the first major change in HAART. This article introduces competing risks data and critically reviews the widely used statistical methods for estimation and modelling of the basic (estimable) quantities of interest. We discuss the increasingly popular Fine and Gray model for subdistribution hazard of interest, which can be readily fitted using standard software under the assumption of administrative censoring. We present a simulation study, which explores the robustness of inference for the subdistribution hazard to the assumption of administrative censoring. This shows a range of scenarios within which the strictly incorrect assumption of administrative censoring has a relatively small effect on parameter estimates and confidence interval coverage. The methods are illustrated using data from HIV-1 seropositive patients from the collaborative multicentre study CASCADE (Concerted Action on SeroConversion to AIDS and Death in Europe). PMID:21216803
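
The basic estimable quantity the review discusses, the cumulative incidence function, can be computed nonparametrically in a few lines (Aalen-Johansen form). The data are hypothetical, and this sketch is not the Fine and Gray subdistribution-hazard model:

```python
import numpy as np

# Hypothetical competing-risks data: event 0 = censored, 1 and 2 = two
# mutually exclusive event types.
time = np.array([2.0, 3.0, 3.0, 5.0, 7.0, 8.0, 9.0, 11.0])
event = np.array([1, 2, 0, 1, 2, 1, 0, 2])

def cuminc(times, events, cause, at):
    """Nonparametric cumulative incidence of `cause` at time `at`.

    Aalen-Johansen form: the CIF accumulates S(t-) * d/n at each event
    time, where S(t-) is the overall Kaplan-Meier survival just before t.
    Ties are processed in input order (here the event at the tied time
    precedes the censoring, the usual convention).
    """
    order = np.argsort(times, kind="stable")
    t, e = times[order], events[order]
    n = len(t)
    surv, cif = 1.0, 0.0
    for i in range(n):
        if t[i] > at:
            break
        at_risk = n - i
        if e[i] == cause:
            cif += surv / at_risk
        if e[i] != 0:                 # any event type reduces overall survival
            surv *= 1.0 - 1.0 / at_risk
    return cif

print(cuminc(time, event, cause=1, at=12.0))  # cause-1 incidence
print(cuminc(time, event, cause=2, at=12.0))  # cause-2 incidence
```

On this toy data the two incidences sum to 1 at the end of follow-up because the last observation is an event; in general they sum to 1 minus the overall survival, which is why one minus the cause-specific Kaplan-Meier estimate overstates each incidence.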

  17. Skill analysis part 3: improving a practice skill.

    PubMed

    Price, Bob

    In this, the third and final article in a series on practice skill analysis, attention is given to imaginative ways of improving a practice skill. Having analysed and evaluated a chosen skill in the previous two articles, it is time to look at new ways to proceed. Creative people are able to be analytical and imaginative. The process of careful reasoning involved in analysing and evaluating a skill will not necessarily be used to improve it. To advance a skill, there is a need to engage in more imaginative, free-thinking processes that allow the nurse to think afresh about his or her chosen skill. Suggestions shared in this article are not exhaustive, but the material presented does illustrate measures that in the author's experience seem to have potential. Consideration is given to how the improved skill might be envisaged (an ideal skill in use). The article is illustrated using the case study of empathetic listening, which has been used throughout this series. PMID:22356066

  18. Documenting Elementary Teachers' Sustainability of Instructional Practices: A Mixed Method Case Study

    NASA Astrophysics Data System (ADS)

    Cotner, Bridget A.

    School reform programs focus on making educational changes; however, research on interventions past the funded implementation phase, to determine what was sustained, is rarely done (Beery, Senter, Cheadle, Greenwald, Pearson, et al., 2005). This study adds to the research on sustainability by determining which instructional practices, if any, of the Teaching SMART® professional development program, implemented from 2005-2008 in elementary schools with teachers in grades three through eight, were continued, discontinued, or adapted five years post-implementation (in 2013). Specifically, this study sought to answer the following questions: What do teachers who participated in Teaching SMART® and district administrators share about the sustainability of Teaching SMART® practices in 2013? What teaching strategies do teachers who participated in the program (2005-2008) use in their science classrooms five years post-implementation (2013)? What perceptions about the roles of females in science, technology, engineering, and mathematics (STEM) do teachers who participated in the program (2005-2008) have five years later (2013)? And what classroom management techniques do the teachers who participated in the program (2005-2008) use five years post-implementation (2013)? A mixed method approach was used to answer these questions. Quantitative teacher survey data from 23 teachers who participated in 2008 and 2013 were analyzed in SAS v. 9.3. Descriptive statistics were reported, and paired t-tests were conducted to determine mean differences by survey factors identified from an exploratory factor analysis (principal axis factoring, with parallel analysis) conducted with teacher survey baseline data (2005). Individual teacher change scores (2008 and 2013) for the identified factors were computed using the Reliable Change Index statistic.
Qualitative data consisted of interviews with two district administrators and three teachers who responded to the survey in both years (2008 and 2013). Additionally, a classroom observation was conducted with one of the interviewed teachers in 2013. Qualitative analyses followed the constant comparative method and were facilitated by ATLAS.ti v. 6.2, a qualitative analysis software program. Qualitative findings identified themes at the district level that influenced teachers' use of Teaching SMART® strategies. All of the themes were classified as obstacles to sustainability: the economic downturn; teacher turnover and a lack of hiring; new reform policies, such as Race to the Top, the Student Success Act, and the Common Core State Standards; and mandated blocks of time for specific content. Results from the survey data showed no statistically significant difference over time in perceived instructional practices, except for a perceived decrease in the use of hands-on instructional activities from 2008 to 2013. Analyses conducted at the individual teacher level found change scores were statistically significant for a few teachers, but overall, teachers responded similarly on the survey at both time points. This sustainability study revealed a lack of facilitating factors to support the continuation of reform practices; however, teachers identified strategies to continue implementing some of the reform practices over time in spite of a number of system-wide obstacles. The study adds to the literature by documenting obstacles to sustainability in this specific context, which overlap with what is known in the literature. Additionally, the strategies teachers identified to overcome some of the obstacles and the recommendations made by district-level administrators add to the literature on how stakeholders may support the sustainability of reform over time.
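
The Reliable Change Index mentioned above is conventionally computed in the Jacobson-Truax form; here is a minimal sketch assuming that form, with hypothetical scores rather than the study's data:

```python
import math

def reliable_change_index(x_pre, x_post, sd_pre, reliability):
    """Jacobson-Truax Reliable Change Index.

    RCI = (post - pre) / SE_diff, with SE_diff = sqrt(2) * SEM and
    SEM = sd_pre * sqrt(1 - reliability). |RCI| > 1.96 flags a
    statistically reliable individual change at the 5% level.
    """
    sem = sd_pre * math.sqrt(1.0 - reliability)
    se_diff = math.sqrt(2.0) * sem
    return (x_post - x_pre) / se_diff

# Hypothetical survey-factor scores (illustrative only):
rci = reliable_change_index(x_pre=3.2, x_post=4.1, sd_pre=0.8, reliability=0.85)
print(round(rci, 2), abs(rci) > 1.96)
```

The index turns an individual pre/post difference into a z-like statistic, which is how a study can report change scores as significant for some teachers but not others.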

  19. FACTERA: a practical method for the discovery of genomic rearrangements at breakpoint resolution

    PubMed Central

    Newman, Aaron M.; Bratman, Scott V.; Stehr, Henning; Lee, Luke J.; Liu, Chih Long; Diehn, Maximilian; Alizadeh, Ash A.

    2014-01-01

    Summary: For practical and robust de novo identification of genomic fusions and breakpoints from targeted paired-end DNA sequencing data, we developed the Fusion And Chromosomal Translocation Enumeration and Recovery Algorithm (FACTERA). Our method has minimal external dependencies, works directly on a preexisting Binary Alignment/Map file and produces easily interpretable output. We demonstrate FACTERA's ability to rapidly identify breakpoint-resolution fusion events with high sensitivity and specificity in patients with non-small cell lung cancer, including novel rearrangements. We anticipate that FACTERA will be broadly applicable to the discovery and analysis of clinically relevant fusions from both targeted and genome-wide sequencing datasets. Availability and implementation: http://factera.stanford.edu. Contact: arasha@stanford.edu or diehn@stanford.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25143292

  20. Practical Methods for Locating Abandoned Wells in Populated Areas

    SciTech Connect

    Veloski, G.A.; Hammack, R.W.; Lynn, R.J.

    2007-09-01

    An estimated 12 million wells have been drilled during the 150 years of oil and gas production in the United States. Many old oil and gas fields are now populated areas where the presence of improperly plugged wells may constitute a hazard to residents. Natural gas emissions from wells have forced people from their houses and businesses and have caused explosions that injured or killed people and destroyed property. To mitigate this hazard, wells must be located and properly plugged, a task made more difficult by the presence of houses, businesses, and associated utilities. This paper describes well finding methods conducted by the National Energy Technology Laboratory (NETL) that were effective at two small towns in Wyoming and in a suburb of Pittsburgh, Pennsylvania.

  1. Perceived Barriers and Facilitators to School Social Work Practice: A Mixed-Methods Study

    ERIC Educational Resources Information Center

    Teasley, Martell; Canifield, James P.; Archuleta, Adrian J.; Crutchfield, Jandel; Chavis, Annie McCullough

    2012-01-01

    Understanding barriers to practice is a growing area within school social work research. Using a convenience sample of 284 school social workers, this study replicates the efforts of a mixed-method investigation designed to identify barriers and facilitators to school social work practice within different geographic locations. Time constraints and…

  2. Autoethnography as a Method for Reflexive Research and Practice in Vocational Psychology

    ERIC Educational Resources Information Center

    McIlveen, Peter

    2008-01-01

    This paper overviews the qualitative research method of autoethnography and its relevance to research in vocational psychology and practice in career development. Autoethnography is a reflexive means by which the researcher-practitioner consciously embeds himself or herself in theory and practice, and by way of intimate autobiographic account,…

  3. Cross-Continental Reflections on Evaluation Practice: Methods, Use, and Valuing

    ERIC Educational Resources Information Center

    Kallemeyn, Leanne M.; Hall, Jori; Friche, Nanna; McReynolds, Clifton

    2015-01-01

    The evaluation theory tree typology reflects the following three components of evaluation practice: (a) methods, (b) use, and (c) valuing. The purpose of this study was to explore how evaluation practice is conceived as reflected in articles published in the "American Journal of Evaluation" ("AJE") and "Evaluation," a…

  4. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2004-03-23

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.

  5. Recruitment ad analysis offers new opportunities to attract GPs to short-staffed practices.

    PubMed

    Hemphill, Elizabeth; Kulik, Carol T

    2013-01-01

    As baby-boomer practitioners exit the workforce, physician shortages present new recruitment challenges for practices seeking GPs. This article reports findings from two studies examining GP recruitment practice. GP recruitment ad content analysis (Study 1) demonstrated that both Internet and print ads emphasize job attributes but rarely present family or practice attributes. Contacts at these medical practices reported that their practices offer distinctive family and practice attributes that could be exploited in recruitment advertising (Study 2). Understaffed medical practices seeking to attract GPs may differentiate their job offerings in a crowded market by incorporating family and/or practice attributes into their ads. PMID:23697854

  6. A mixed methods exploration of the team and organizational factors that may predict new graduate nurse engagement in collaborative practice.

    PubMed

    Pfaff, Kathryn A; Baxter, Pamela E; Ploeg, Jenny; Jack, Susan M

    2014-03-01

    Although engagement in collaborative practice is reported to support the role transition and retention of new graduate (NG) nurses, it is not known how to promote collaborative practice among these nurses. This mixed methods study explored the team and organizational factors that may predict NG nurse engagement in collaborative practice. A total of 514 NG nurses from Ontario, Canada completed the Collaborative Practice Assessment Tool. Sixteen NG nurses participated in follow-up interviews. The team and organizational predictors of NG engagement in collaborative practice were as follows: satisfaction with the team (β = 0.278; p = 0.000), number of team strategies (β = 0.338; p = 0.000), participation in a mentorship or preceptorship experience (β = 0.137; p = 0.000), accessibility of manager (β = 0.123; p = 0.001), and accessibility and proximity of educator or professional practice leader (β = 0.126; p = 0.001 and β = 0.121; p = 0.002, respectively). Qualitative analysis revealed the team facilitators to be respect, team support and face-to-face interprofessional interactions. Organizational facilitators included supportive leadership, participation in a preceptorship or mentorship experience and time. Interventions designed to facilitate NG engagement in collaborative practice should consider these factors. PMID:24195680

  7. Advanced analysis methods in particle physics

    SciTech Connect

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider, have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  8. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2009-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  9. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2010-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  10. Validation of analytical methods in compliance with good manufacturing practice: a practical approach

    PubMed Central

    2013-01-01

    Background The quality and safety of cell therapy products must be maintained throughout their production and quality control cycle, ensuring their final use in the patient. We validated the Limulus Amebocyte Lysate (LAL) test and the immunophenotype assay according to International Conference on Harmonization Q2 Guidelines and the EU Pharmacopoeia, considering accuracy, precision, repeatability, linearity and range. Methods For the endotoxin test we used a kinetic chromogenic LAL test. As this is a limit test for the control of impurities, in compliance with International Conference on Harmonization Q2 Guidelines and the EU Pharmacopoeia, we evaluated the specificity and detection limit. For the immunophenotype test, an identity test, we evaluated specificity through the Fluorescence Minus One method and repeated all experiments three times to verify precision. The immunophenotype validation required a performance qualification of the flow cytometer using two types of standard beads, which have to be used daily to check that the cytometer set-up is reproducible. The results were then compared. Collected data were statistically analyzed by calculating the mean, standard deviation and coefficient of variation percentage (CV%). Results The LAL test is repeatable and specific. The spike recovery value of each sample was between 0.25 EU/ml and 1 EU/ml with a CV% < 10%. The correlation coefficient (≥ 0.980) and CV% (< 10%) of the standard curve tested in duplicate showed the test's linearity and a minimum detectable concentration of 0.005 EU/ml. The immunophenotype method, performed three times on our cell therapy products, is specific and repeatable, as shown by an inter-experiment CV% < 10%. Conclusions Our data demonstrate that validated analytical procedures are suitable as quality controls for the batch release of cell therapy products. Our paper could offer an important contribution for the scientific community in the field of CTPs, above all to small Cell Factories such as ours, where it is not always possible to have CFR21-compliant software. PMID:23981284
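The acceptance statistics this validation relies on (mean, standard deviation, CV%) reduce to a few lines of code. A minimal sketch follows; the replicate readings are invented for illustration, not the study's data.

```python
import statistics

def cv_percent(values):
    """Coefficient of variation in percent: 100 * SD / mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Invented triplicate spike-recovery readings (EU/ml)
replicates = [0.48, 0.52, 0.50]
print(f"CV% = {cv_percent(replicates):.1f}")  # 4.0; acceptance criterion: CV% < 10
```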

  11. Low hardness organisms: Culture methods, sensitivities, and practical applications

    SciTech Connect

    DaCruz, A.; DaCruz, N.; Bird, M.

    1995-12-31

    EPA regulations require biomonitoring of permitted effluent and stormwater runoff. Several permit locations were studied in Virginia that have supply water and/or stormwater runoff ranging in hardness from 5--30 mg/L. Ceriodaphnia dubia (dubia) and Pimephales promelas (fathead minnow) were tested in reconstituted water with hardnesses from 5--30 mg/L. Results indicated osmotic stresses in the acute tests with the fathead minnow as well as in the chronic tests for the dubia and the fathead minnow. Culture methods were developed for both organism types in soft (30 mg/L) reconstituted freshwater. Reproduction and development for each organism type meet or exceed EPA testing requirements for moderately hard organisms. Sensitivities were measured over an 18-month interval using cadmium chloride as a reference toxicant. Additionally, sensitivities were charted in contrast with those of organisms cultured in moderately hard water. The comparison showed that the sensitivities of both the dubia and the fathead minnow cultured in 30 mg/L water increased, but were within two standard deviations of the sensitivities of organisms cultured in moderately hard water. Latitude for use of organisms cultured in 30 mg/L water was documented for waters ranging in hardness from 10--100 mg/L with no acclimation period required. The stability of the organism sensitivity was also validated. The application was most helpful in stormwater runoff and in effluents where the hardness was 30 mg/L or less.

  12. [Analysis of an intercultural clinical practice in a judicial setting].

    PubMed

    Govindama, Yolande

    2007-01-01

    This article analyses an intercultural clinical practice in a judicial setting from an anthropological and psychoanalytical perspective, demonstrating the reorganizations that the framework necessarily entails. Since the culture of the new country and its founding myth are implicit in the judicial framework, the practitioner introduces psychoanalytical references, particularly totemic principles and the symbolic father, by treating genealogy, a universal object of transmission, as the guarantee of the fundamental taboos of humanity. The metacultural perspective of this approach integrates the ethnopsychoanalytical principles put forth by Devereux, as well as his method, although the latter has been adapted to the framework. This approach makes it possible to re-examine Devereux's ethnopsychoanalytical principles by opening the debate to both psychoanalytical and psychiatric perspectives. PMID:18253668

  13. Measuring solar reflectance Part II: Review of practical methods

    SciTech Connect

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul

    2010-05-14

    A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R_g,0 can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R_g,0 to within 0.01 for surface slopes up to 5:12 [23°], and to within 0.02 for surface slopes up to 12:12 [45°]. An alternative pyranometer method minimizes shadow errors and can be used to measure R_g,0 of a surface as small as 1 m in diameter. The accuracy with which it can measure R_g,0 is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R*_g,0, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R*_g,0 matches R_g,0 to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R*_g,0 by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R*_g,0 to within about 0.01.
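A solar reflectance such as R*_g,0 is, as the abstract states, an irradiance-weighted average of spectral reflectance. The sketch below shows the computation on stand-in spectra; the irradiance curve and reflectance profile are invented placeholders, not the AM1GH reference data.

```python
import numpy as np

# Stand-in spectra on a uniform wavelength grid (nm); in practice the
# spectral reflectance comes from a spectrophotometer and the weighting
# from a reference solar spectral irradiance.
wavelength = np.linspace(300.0, 2500.0, 221)
irradiance = np.exp(-((wavelength - 800.0) / 600.0) ** 2)   # invented spectrum
reflectance = np.where(wavelength < 700.0, 0.1, 0.8)        # a "cool" surface

# Solar reflectance = irradiance-weighted mean of spectral reflectance
# (uniform grid, so a weighted average stands in for the integral ratio).
r_solar = float(np.sum(reflectance * irradiance) / np.sum(irradiance))
print(f"solar reflectance ≈ {r_solar:.3f}")
```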

  14. Practical applications of activation analysis and other nuclear techniques

    SciTech Connect

    Lyon, W.S.

    1982-01-01

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor, removed, and, for the nondestructive technique, the induced radioactivity is measured. This measurement of gamma rays emitted from specific radionuclides makes possible the quantitative determination of the elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle-induced x-ray emission and synchrotron-produced x-ray fluorescence, are also briefly discussed.

  15. Short communication: Practical issues in implementing volatile metabolite analysis for identifying mastitis pathogens.

    PubMed

    Hettinga, Kasper A; de Bok, Frank A M; Lam, Theo J G M

    2015-11-01

    Several parameters for improving headspace gas chromatography-mass spectrometry (GC-MS) analysis of volatile metabolites were evaluated in the framework of identifying mastitis-causing pathogens. Previous research showed that the results of such volatile metabolite analysis were comparable with those based on bacteriological culturing. The aim of this study was to evaluate the effect of several method changes on the applicability and potential implementation of this method in practice. The use of a relatively polar column is advantageous, resulting in a faster and less complex chromatographic setup with a higher resolving power yielding higher-quality data. Before volatile metabolite analysis is applied, a minimum incubation of 8 h is advised, as reducing incubation time leads to less reliable pathogen identification. Application of GC-MS remained favorable compared with regular gas chromatography. The complexity and cost of a GC-MS system are such that they limit the application of the method in practice for identification of mastitis-causing pathogens. PMID:26342985

  16. A practical method of estimating standard error of age in the fission track dating method

    USGS Publications Warehouse

    Johnson, N.M.; McGee, V.E.; Naeser, C.W.

    1979-01-01

    A first-order approximation formula for the propagation of error in the fission track age equation is given by P_A = C[P_s^2 + P_i^2 + P_φ^2 - 2rP_sP_i]^(1/2), where P_A, P_s, P_i and P_φ are the percentage errors of age, of spontaneous track density, of induced track density, and of neutron dose, respectively, and C is a constant. The correlation, r, between spontaneous and induced track densities is a crucial element in the error analysis, acting generally to improve the standard error of age. In addition, the correlation parameter r is instrumental in specifying the level of neutron dose, a controlled variable, which will minimize the standard error of age. The results from the approximation equation agree closely with the results from an independent statistical model for the propagation of errors in the fission-track dating method.
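The first-order propagation formula can be evaluated directly. The sketch below uses invented percentage errors, not the paper's data, to show how a positive correlation between the track densities reduces the propagated age error.

```python
import math

def age_percentage_error(P_s, P_i, P_phi, r, C=1.0):
    """First-order propagated percentage error of a fission-track age.

    P_s, P_i, P_phi: percentage errors of the spontaneous track density,
    induced track density, and neutron dose; r is the correlation between
    spontaneous and induced track densities; C is a constant.
    """
    return C * math.sqrt(P_s**2 + P_i**2 + P_phi**2 - 2 * r * P_s * P_i)

# A positive correlation between the track densities improves the age error:
print(age_percentage_error(5.0, 5.0, 2.0, r=0.0))  # ≈ 7.35
print(age_percentage_error(5.0, 5.0, 2.0, r=0.8))  # ≈ 3.74
```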

  17. [Methods for grain size analysis of nanomedicines].

    PubMed

    Geng, Zhi-Wang; He, Lan; Zhang, Qi-Ming; Yang, Yong-Jian

    2012-07-01

    As nanomedicines are developing fast in both academic and market areas, building up suitable methods for nanomedicine analysis with proper techniques is an important subject requiring further research. The techniques that could be employed for grain size analysis of nanomedicines were reviewed. Several key techniques were discussed with their principles, scope of applications, advantages and defects. Their applications to nanomedicine analysis were discussed according to the properties of different nanomedicines, with the purpose of providing some suggestions for the control and administration of nanomedicines. PMID:22993848

  18. Design analysis, robust methods, and stress classification

    SciTech Connect

    Bees, W.J.

    1993-01-01

    This special edition publication volume is comprised of papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25--29, 1993 in Denver, Colorado. The papers were prepared for presentations in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

  19. A practical guide to value of information analysis.

    PubMed

    Wilson, Edward C F

    2015-02-01

    Value of information analysis is a quantitative method to estimate the return on investment in proposed research projects. It can be used in a number of ways. Funders of research may find it useful to rank projects in terms of the expected return on investment from a variety of competing projects. Alternatively, trialists can use the principles to identify the efficient sample size of a proposed study as an alternative to traditional power calculations, and finally, a value of information analysis can be conducted alongside an economic evaluation as a quantitative adjunct to the 'future research' or 'next steps' section of a study write up. The purpose of this paper is to present a brief introduction to the methods, a step-by-step guide to calculation and a discussion of issues that arise in their application to healthcare decision making. Worked examples are provided in the accompanying online appendices as Microsoft Excel spreadsheets. PMID:25336432
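The core calculation behind value of information analysis is the gap between the expected net benefit of deciding with perfect information and of deciding now (the EVPI). A minimal Monte Carlo sketch follows; the net-benefit distributions are invented for illustration, not taken from the paper or its appendices.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Invented net-benefit samples for two options under parameter uncertainty
nb = np.column_stack([
    rng.normal(10_000, 2_000, n),   # standard care
    rng.normal(10_500, 4_000, n),   # new treatment (more uncertain)
])

# Decide now: pick the option with the best expected net benefit
env_current = nb.mean(axis=0).max()
# Decide with perfect information: pick the best option in every scenario
env_perfect = nb.max(axis=1).mean()

evpi = env_perfect - env_current   # expected value of perfect information
print(f"EVPI per patient: {evpi:.0f}")
```

Multiplying the per-patient EVPI by the population expected to benefit from the decision gives an upper bound on the value of further research.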

  20. Four hour ambulation after angioplasty is a safe practice method

    PubMed Central

    Moeini, Mahin; Moradpour, Fatemeh; Babaei, Sima; Rafieian, Mohsen; Khosravi, Alireza

    2010-01-01

    BACKGROUND: During the last 3 decades there has been an increasing tendency toward angioplasty because of its benefits. However, the procedure has acute complications, such as bleeding and hematoma formation at the site where the arterial sheath is removed. Based on researchers' clinical experience, patients are kept on bed rest for 8-12 hours after coronary angioplasty. Identifying the desirable bed-rest time after angioplasty and sheath removal is the focus of related research worldwide. Getting out of bed soon after angioplasty gives patients greater comfort, a shorter hospital stay, fewer side effects of prolonged bed rest and lower hospitalization costs. Given the case for shorter bed rest after angioplasty, the aim of this study was to assess the effect of the time of getting out of bed after angioplasty on complications after sheath removal in coronary angioplasty patients. METHODS: This was an experimental clinical study conducted in one step with two groups. The sample included 124 angioplasty patients (62 in each group) who were chosen randomly from the CCU of Shahid Chamran hospital of the Isfahan University of Medical Sciences in 2007. Data were gathered by observing and evaluating the patients, using a questionnaire and a checklist. After angioplasty, patients in the intervention group were taken out of bed at 4 hours and patients in the control group at 8 hours. After getting out of bed, patients were examined for bleeding and hematoma formation at the arterial sheath removal site. Data were analyzed using descriptive and inferential statistics via SPSS software. RESULTS: Results showed no meaningful difference between the two groups after getting out of bed (p > 0.05) regarding the relative frequency of bleeding (p = 0.50), hematoma formation (p = 0.34) and average hematoma diameter (p = 0.39). 
CONCLUSIONS: The results of this study showed that reducing bed-rest time to 4 hours after removing a size 7 arterial sheath does not increase bleeding or hematoma formation at the removal site. Thus, angioplasty patients who are not in critical clinical condition and whose vital signs are stable can get out of bed 4 hours after sheath removal. PMID:21589772

  1. A mixed methods approach to understand variation in lung cancer practice and the role of guidelines

    PubMed Central

    2014-01-01

    Introduction Practice pattern data demonstrate regional variation and lower than expected rates of adherence to practice guideline (PG) recommendations for the treatment of stage II/IIIA resected and stage IIIA/IIIB unresected non-small cell lung cancer (NSCLC) patients in Ontario, Canada. This study sought to understand how clinical decisions are made for the treatment of these patients and the role of PGs. Methods Surveys and key informant interviews were undertaken with clinicians and administrators. Results Participants reported favorable ratings for PGs and the evidentiary bases underpinning them. The majority of participants agreed more patients should have received treatment and that regional variation is problematic. Participants estimated that up to 30% of patients are not good candidates for treatment and up to 20% of patients refuse treatment. The most common barrier to implementing PGs was the lack of organizational support by clinical administrative leadership. There was concern that the trial results underpinning the PG recommendations were not generalizable to the typical patients seen in clinic. The qualitative analysis yielded five themes related to physicians’ decision making: the unique patient, the unique physician, the family, the clinical team, and the clinical evidence. A dynamic interplay between these factors exists. Conclusion Our study demonstrates the challenges inherent in (i) the complexity of clinical decision making; (ii) how quality of care problems are perceived and operationalized; and (iii) the clinical appropriateness and utility of PG recommendations. We argue that systematic and rigorous methodologies to help decision makers mitigate or negotiate these challenges are warranted. PMID:24655753

  2. Protein-protein interactions: methods for detection and analysis.

    PubMed Central

    Phizicky, E M; Fields, S

    1995-01-01

    The function and activity of a protein are often modulated by other proteins with which it interacts. This review is intended as a practical guide to the analysis of such protein-protein interactions. We discuss biochemical methods such as protein affinity chromatography, affinity blotting, coimmunoprecipitation, and cross-linking; molecular biological methods such as protein probing, the two-hybrid system, and phage display; and genetic methods such as the isolation of extragenic suppressors, synthetic mutants, and unlinked noncomplementing mutants. We next describe how binding affinities can be evaluated by techniques including protein affinity chromatography, sedimentation, gel filtration, fluorescence methods, solid-phase sampling of equilibrium solutions, and surface plasmon resonance. Finally, three examples of well-characterized domains involved in multiple protein-protein interactions are examined. The emphasis of the discussion is on variations in the approaches, concerns in evaluating the results, and advantages and disadvantages of the techniques. PMID:7708014

  3. Maximizing Return From Sound Analysis and Design Practices

    SciTech Connect

    Bramlette, Judith Lynn

    2002-06-01

    With today's tightening budgets, computer applications must provide "true" long-term benefit to the company. Businesses are spending large portions of their budgets "re-engineering" old systems to take advantage of "new" technology. But what they are really getting is simply a new interface implementing the same incomplete or poorly defined requirements as before. "True" benefit can only be gained if sound analysis and design practices are used. WHAT data and processes are required of a system is not the same as HOW the system will be implemented within a company. It is the System Analyst's responsibility to understand the difference between these two concepts. The paper discusses some simple techniques to be used during the Analysis and Design phases of projects, as well as the information gathered and recorded in each phase and how it is transformed between these phases. The paper also covers production applications generated using Oracle Designer. By applying these techniques to "real world" problems, the applications will meet the needs of today's business and adapt easily to ever-changing business environments.

  4. Maximizing Return From Sound Analysis and Design Practices

    SciTech Connect

    Bramlette, J.D.

    2002-04-22

    With today's tightening budgets, computer applications must provide "true" long-term benefit to the company. Businesses are spending large portions of their budgets "re-engineering" old systems to take advantage of "new" technology. But what they are really getting is simply a new interface implementing the same incomplete or poorly defined requirements as before. "True" benefit can only be gained if sound analysis and design practices are used. WHAT data and processes are required of a system is not the same as HOW the system will be implemented within a company. It is the System Analyst's responsibility to understand the difference between these two concepts. The paper discusses some simple techniques to be used during the Analysis and Design phases of projects, as well as the information gathered and recorded in each phase and how it is transformed between these phases. The paper also covers production applications generated using Oracle Designer. By applying these techniques to "real world" problems, the applications will meet the needs of today's business and adapt easily to ever-changing business environments.

  5. A practical guide to environmental association analysis in landscape genomics.

    PubMed

    Rellstab, Christian; Gugerli, Felix; Eckert, Andrew J; Hancock, Angela M; Holderegger, Rolf

    2015-09-01

    Landscape genomics is an emerging research field that aims to identify the environmental factors that shape adaptive genetic variation and the gene variants that drive local adaptation. Its development has been facilitated by next-generation sequencing, which allows for screening thousands to millions of single nucleotide polymorphisms in many individuals and populations at reasonable costs. In parallel, data sets describing environmental factors have greatly improved and increasingly become publicly accessible. Accordingly, numerous analytical methods for environmental association studies have been developed. Environmental association analysis identifies genetic variants associated with particular environmental factors and has the potential to uncover adaptive patterns that are not discovered by traditional tests for the detection of outlier loci based on population genetic differentiation. We review methods for conducting environmental association analysis including categorical tests, logistic regressions, matrix correlations, general linear models and mixed effects models. We discuss the advantages and disadvantages of different approaches, provide a list of dedicated software packages and their specific properties, and stress the importance of incorporating neutral genetic structure in the analysis. We also touch on additional important aspects such as sampling design, environmental data preparation, pooled and reduced-representation sequencing, candidate-gene approaches, linearity of allele-environment associations and the combination of environmental association analyses with traditional outlier detection tests. We conclude by summarizing expected future directions in the field, such as the extension of statistical approaches, environmental association analysis for ecological gene annotation, and the need for replication and post hoc validation studies. PMID:26184487
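The simplest of the association approaches the review lists (before moving to mixed models that correct for neutral structure) is a regression of allele frequency on an environmental variable. The toy sketch below uses invented data for 20 populations; real analyses must account for population structure, as the review stresses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented data: allele frequencies at one locus across 20 populations,
# and a standardized environmental variable (e.g., temperature).
env = rng.normal(size=20)
allele_freq = np.clip(0.5 + 0.15 * env + rng.normal(scale=0.05, size=20), 0.0, 1.0)

# Simplest association test: regress allele frequency on the environment.
# Real analyses must also correct for neutral population structure.
slope, intercept = np.polyfit(env, allele_freq, 1)
r = np.corrcoef(env, allele_freq)[0, 1]
print(f"slope = {slope:.3f}, correlation = {r:.2f}")
```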

  6. Displacement Monitoring and Sensitivity Analysis in the Observational Method

    NASA Astrophysics Data System (ADS)

    Górska, Karolina; Muszyński, Zbigniew; Rybak, Jarosław

    2013-09-01

    This work discusses the fundamentals of designing deep excavation support by means of the observational method. Effective tools for optimal design with the observational method are inclinometric and geodetic monitoring, which provide data for the systematically updated calibration of the numerical computational model. The analysis included methods for selecting data for the design (by choosing the basic random variables), as well as methods for ongoing verification of the results of numerical calculations (e.g., FEM) by measuring structure displacement using geodetic and inclinometric techniques. The presented example shows the sensitivity analysis of the calculation model for a cantilever wall in non-cohesive soil; that analysis makes it possible to select the data to be later subject to calibration. The paper presents the results of measurements of sheet pile wall displacement, carried out by the inclinometric method and, simultaneously, by two geodetic methods, as the excavation was successively deepened. This work also includes critical comments regarding the usefulness of the obtained data, as well as practical aspects of taking measurements under the conditions of ongoing construction works.

  7. Practice.

    PubMed

    Chambers, David W

    2008-01-01

    Practice refers to a characteristic way professionals use common standards to customize solutions to a range of problems. Practice includes (a) standards for outcomes and processes that are shared with one's colleagues, (b) a rich repertoire of skills grounded in diagnostic acumen, (c) an ability to see the actual and the ideal and work back and forth between them, (d) functional artistry, and (e) learning by doing that transcends scientific rationality. Communities of practice, such as dental offices, are small groups that work together in interlocking roles to achieve these ends. PMID:19413050

  8. "Movement Doesn't Lie": Teachers' Practice Choreutical Analysis

    ERIC Educational Resources Information Center

    Pastore, Serafina; Pentassuglia, Monica

    2015-01-01

    Identifying and describing teaching practice is not an easy task. Current educational research aims at explaining teachers' work focusing on the concept of practice. Teachers' practical knowledge is a sensitive and tacit knowledge, produced, and effused by the body. In this perspective, the teachers' work can be considered as an expressive…

  9. Systems and methods for sample analysis

    SciTech Connect

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-10-20

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  10. Analysis methods for tocopherols and tocotrienols

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Tocopherols and tocotrienols, sometimes called tocochromanols or tocols, are also collectively termed Vitamin E. Vitamins A, D, E, and K, are referred to as fat soluble vitamins. Since the discovery of Vitamin E in 1922, many methods have been developed for the analysis of tocopherols and tocotrie...

  11. Methods for Chemical Analysis of Fresh Waters.

    ERIC Educational Resources Information Center

    Golterman, H. L.

    This manual, one of a series prepared for the guidance of research workers conducting studies as part of the International Biological Programme, contains recommended methods for the analysis of fresh water. The techniques are grouped in the following major sections: Sample Taking and Storage; Conductivity, pH, Oxidation-Reduction Potential,…

  12. Systems and methods for sample analysis

    SciTech Connect

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-01-13

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  13. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M.; Ng, Esmond G.

    1998-01-01

    Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process monitor nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.

  14. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

    Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process monitor nonlinear data are disclosed. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.

  15. A Method for Automating Dialect Analysis.

    ERIC Educational Resources Information Center

    Uskup, Frances Land

    This paper proposes a method of handling limited problems in dialect research. In approaching the problem, it was necessary to devise a system for coding phonetic transcription which would take into account the variance in the diacritics of different field workers so that none of the material would be lost while permitting computer analysis. The…

  16. Methods for Chemical Analysis of Fresh Waters.

    ERIC Educational Resources Information Center

    Golterman, H. L.

    This manual, one of a series prepared for the guidance of research workers conducting studies as part of the International Biological Programme, contains recommended methods for the analysis of fresh water. The techniques are grouped in the following major sections: Sample Taking and Storage; Conductivity, pH, Oxidation-Reduction Potential,…

  17. Comparison of three evidence-based practice learning assessment methods in dental curricula.

    PubMed

    Al-Ansari, Asim A; El Tantawi, Maha M A

    2015-02-01

    Incorporating evidence-based practice (EBP) training in dental curricula is now an accreditation requirement for dental schools, but questions remain about the most effective ways to assess learning outcomes. The purpose of this study was to evaluate and compare three assessment methods for EBP training and to assess their relation to students' overall course grades. Participants in the study were dental students from two classes who received training in appraising randomized controlled trials (RCTs) and systematic reviews in 2013 at the University of Dammam, Saudi Arabia. Repeated measures analysis of variance was used to compare students' scores on appraisal assignments, scores on multiple-choice question (MCQ) exams in which EBP concepts were applied to clinical scenarios, and scores for self-reported efficacy in appraisal. Regression analysis was used to assess the relationship among the three assessment methods, gender, program level, and overall grade. The instructors had acceptable reliability in scoring the assignments (overall intraclass correlation coefficient=0.60). The MCQ exams had acceptable discrimination indices, although their reliability was less satisfactory (Cronbach's alpha=0.46). Statistically significant differences were observed among the three methods, with MCQ exams having the lowest overall scores. Variation in the overall course grades was explained by scores on the appraisal assignment and MCQ exams (partial eta-squared=0.52 and 0.24, respectively), whereas score on the self-efficacy questionnaire was not significantly associated with overall grade. The results suggest that self-reported efficacy is not a valid method to assess dental students' RCT appraisal skills; instructor-graded appraisal assignments explained a greater portion of variation in grade and had inherent validity and acceptable consistency, while MCQ exams had good construct validity but low internal consistency. PMID:25640619
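    For readers unfamiliar with the internal-consistency statistic reported above, Cronbach's alpha is computed from the item variances and the variance of the total score. A minimal sketch with hypothetical exam data (not the study's data):

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items score matrix."""
    k = len(scores[0])                      # number of items
    items = list(zip(*scores))              # column-wise item scores
    item_var = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical data: 5 students x 4 dichotomously scored items.
data = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 1, 1],
    [1, 0, 1, 1],
    [0, 0, 0, 0],
]
print(round(cronbach_alpha(data), 3))   # 20/51, i.e. 0.392
```

    Values in the 0.4-0.5 range, as reported for the MCQ exams, indicate low internal consistency.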

  18. Multiple predictor smoothing methods for sensitivity analysis.

    SciTech Connect

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
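    The idea behind smoothing-based sensitivity analysis can be illustrated with a crude local-mean smoother standing in for LOESS: the variance of the smoothed fit of the output against one input, relative to total output variance, acts as a first-order sensitivity index. The test model, sample size, and span below are illustrative assumptions, not from the paper.

```python
import math
import random

def local_mean_smoother(x, y, span=0.2):
    """Crude stand-in for LOESS: for each xi, average y over the nearest
    `span` fraction of the sample."""
    n = len(x)
    k = max(2, int(span * n))
    fitted = []
    for xi in x:
        nearest = sorted(range(n), key=lambda j: abs(x[j] - xi))[:k]
        fitted.append(sum(y[j] for j in nearest) / k)
    return fitted

def sensitivity_index(x, y, span=0.2):
    """Fraction of output variance explained by smoothing y against one input."""
    fitted = local_mean_smoother(x, y, span)
    mean_y = sum(y) / len(y)
    var_y = sum((v - mean_y) ** 2 for v in y)
    var_fit = sum((v - mean_y) ** 2 for v in fitted)
    return var_fit / var_y

random.seed(1)
n = 400
x1 = [random.uniform(-1, 1) for _ in range(n)]
x2 = [random.uniform(-1, 1) for _ in range(n)]
# Nonlinear test model: output depends strongly on x1, weakly on x2.
y = [math.sin(3 * a) + 0.1 * b for a, b in zip(x1, x2)]
s1 = sensitivity_index(x1, y)
s2 = sensitivity_index(x2, y)
print(round(s1, 2), round(s2, 2))
```

    A linear-regression-based index would understate the importance of x1 here, since sin(3*x1) is strongly nonlinear over [-1, 1]; the smoother recovers it.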

  19. Simplified Analysis Methods for Primary Load Designs at Elevated Temperatures

    SciTech Connect

    Carter, Peter; Jetter, Robert I; Sham, Sam

    2011-01-01

    The use of simplified (reference stress) analysis methods is discussed and illustrated for primary load high temperature design. Elastic methods are the basis of the ASME Section III, Subsection NH primary load design procedure. There are practical drawbacks with this approach, particularly for complex geometries and temperature gradients. The paper describes an approach which addresses these difficulties through the use of temperature-dependent elastic-perfectly plastic analysis. Correction factors are defined to address difficulties traditionally associated with discontinuity stresses, inelastic strain concentrations and multiaxiality. A procedure is identified to provide insight into how this approach could be implemented but clearly there is additional work to be done to define and clarify the procedural steps to bring it to the point where it could be adapted into code language.

  20. Power System Transient Stability Analysis through a Homotopy Analysis Method

    SciTech Connect

    Wang, Shaobu; Du, Pengwei; Zhou, Ning

    2014-04-01

    As an important function of energy management systems (EMSs), online contingency analysis plays an important role in providing power system security warnings of instability. At present, N-1 contingency analysis still relies on time-consuming numerical integration. To save computational cost, the paper proposes a quasi-analytical method to evaluate transient stability through the sensitivities of time-domain periodic solutions' frequencies with respect to initial values. First, dynamic systems described in classical models are modified into damping-free systems whose solutions are either periodic or expanded (non-convergent). Second, because the sensitivities experience sharp changes when periodic solutions vanish and turn into expanded solutions, transient stability is assessed using the sensitivities. Third, homotopy analysis is introduced to extract frequency information and evaluate the sensitivities directly from initial values, so that time-consuming numerical integration is avoided. Finally, a simple case is presented to demonstrate application of the proposed method, and simulation results show that the proposed method is promising.
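    The distinction this abstract relies on (periodic versus expanded solutions of the damping-free system) can be illustrated on the classical swing equation for a single machine. Note that this sketch classifies solutions by brute-force numerical integration, exactly the cost the paper's homotopy-analysis approach is designed to avoid; it only demonstrates the stability signal itself.

```python
import math

def is_periodic(delta0, omega0, dt=0.001, t_max=50.0):
    """Integrate the damping-free swing equation d2(delta)/dt2 = -sin(delta)
    with symplectic Euler and report whether the rotor angle stays bounded,
    i.e. whether the solution is periodic rather than expanded."""
    d, w = delta0, omega0
    for _ in range(int(t_max / dt)):
        w -= math.sin(d) * dt
        d += w * dt
        if abs(d) > math.pi:   # passed the unstable equilibrium: expanded
            return False
    return True

# Total energy is w0**2/2 + (1 - cos(delta0)); the separatrix sits at
# energy 2, so from delta0 = 0 periodic solutions vanish at w0 = 2.
for w0 in (1.0, 1.9, 2.1):
    print(w0, is_periodic(0.0, w0))
```

    The abrupt change in behavior at the separatrix is what makes the frequency sensitivities a sharp stability indicator.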

  1. Improving educational environment in medical colleges through transactional analysis practice of teachers

    PubMed Central

    Rajan, Marina

    2012-01-01

    Context: A FAIMER (Foundation for Advancement in International Medical Education and Research) fellow organized a comprehensive faculty development program to improve faculty awareness, resulting in changed teaching practices and better teacher-student relationships, using Transactional Analysis (TA). Practicing TA tools helps develop awareness of intrapersonal and interpersonal processes. Objectives: To improve self-awareness among medical educators; to bring about self-directed change in practices among medical educators; and to assess the usefulness of TA tools for the same. Methods: An experienced trainer conducted a basic course (12 hours) in TA for faculty members. The PAC model of personality structure, the functional fluency model of personal functioning, stroke theory on motivation, and passivity and script theories of adult functional styles were taught experientially with examples from the medical education scenario. Self-reported improvement in awareness and changes in practices were assessed immediately after, at three months, and one year after training. Findings: The mean improvement in self-awareness was 13.3% (95% CI 9.3-17.2) among nineteen participants, and persisted one year after training. Changes in practices within a year included collecting feedback, new teaching styles, and better relationships with students. Discussion and Conclusions: These findings demonstrate sustainable and measurable improvement in self-awareness through practice of TA tools. Improvement in the self-awareness of faculty resulted in self-directed changes in teaching practices. Medical faculty judged the TA tools effective for improving self-awareness leading to self-directed changes. PMID:24358808

  2. Graphical methods for the sensitivity analysis in discriminant analysis

    DOE PAGESBeta

    Kim, Youngil; Anderson-Cook, Christine M.; Dae-Heung, Jang

    2015-09-30

    Similar to regression, many measures to detect influential data points in discriminant analysis have been developed. Many follow similar principles as the diagnostic measures used in linear regression in the context of discriminant analysis. Here we focus on the impact on the predicted classification posterior probability when a data point is omitted. The new method is intuitive and easily interpretative compared to existing methods. We also propose a graphical display to show the individual movement of the posterior probability of other data points when a specific data point is omitted. This enables the summaries to capture the overall pattern of the change.

  3. Graphical methods for the sensitivity analysis in discriminant analysis

    SciTech Connect

    Kim, Youngil; Anderson-Cook, Christine M.; Dae-Heung, Jang

    2015-09-30

    Similar to regression, many measures to detect influential data points in discriminant analysis have been developed. Many follow similar principles as the diagnostic measures used in linear regression in the context of discriminant analysis. Here we focus on the impact on the predicted classification posterior probability when a data point is omitted. The new method is intuitive and easily interpretative compared to existing methods. We also propose a graphical display to show the individual movement of the posterior probability of other data points when a specific data point is omitted. This enables the summaries to capture the overall pattern of the change.
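    The leave-one-out posterior-probability idea described above can be sketched with a toy one-dimensional, two-class Gaussian discriminant. This is a simplification of the paper's setting; the data set and the pooled-spread estimate are illustrative assumptions.

```python
import math
from statistics import mean, pstdev

def posterior(train, point):
    """Two-class posterior P(class 1 | x) from a 1-D Gaussian discriminant
    fitted on `train` = [(x, label), ...], using a crude pooled spread."""
    xs0 = [x for x, c in train if c == 0]
    xs1 = [x for x, c in train if c == 1]
    s = pstdev([x for x, _ in train]) or 1.0
    def lik(x, mu):
        return math.exp(-((x - mu) ** 2) / (2 * s * s))
    p0 = lik(point, mean(xs0)) * len(xs0)   # likelihood x class prior
    p1 = lik(point, mean(xs1)) * len(xs1)
    return p1 / (p0 + p1)

def influence(data):
    """For each omitted point, the largest shift it causes in any other
    point's posterior -- the quantity the graphical display would track."""
    out = []
    for i in range(len(data)):
        reduced = data[:i] + data[i + 1:]
        shift = max(abs(posterior(data, x) - posterior(reduced, x))
                    for x, _ in reduced)
        out.append(shift)
    return out

# Hypothetical data: two tight classes plus one outlier in class 1.
data = [(0.0, 0), (0.2, 0), (0.4, 0), (2.0, 1), (2.2, 1), (9.0, 1)]
shifts = influence(data)
print([round(v, 3) for v in shifts])
```

    Omitting the outlier moves the posteriors of the remaining points far more than omitting any interior point, which is the pattern such a display is meant to expose.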

  4. Mix of methods is needed to identify adverse events in general practice: A prospective observational study

    PubMed Central

    Wetzels, Raymond; Wolters, René; van Weel, Chris; Wensing, Michel

    2008-01-01

    Background The validity and usefulness of incident reporting and other methods for identifying adverse events remains unclear. This study aimed to compare five methods in general practice. Methods In a prospective observational study, with five general practitioners, five methods were applied and compared. The five methods were physician reported adverse events, pharmacist reported adverse events, patients' experiences of adverse events, assessment of a random sample of medical records, and assessment of all deceased patients. Results A total of 68 events were identified using these methods. The patient survey accounted for the highest number of events and the pharmacist reports for the lowest number. No overlap between the methods was detected. Conclusion A mix of methods is needed to identify adverse events in general practice. PMID:18554418

  5. Lost to the NHS: a mixed methods study of why GPs leave practice early in England

    PubMed Central

    Doran, Natasha; Fox, Fiona; Rodham, Karen; Taylor, Gordon; Harris, Michael

    2016-01-01

    Background The loss of GPs in the early stages of their careers is contributing to the GP workforce crisis. Recruitment in the UK remains below the numbers needed to support the demand for GP care. Aim To explore the reasons why GPs leave general practice early. Design and setting A mixed methods study using online survey data triangulated with qualitative interviews. Method Participants were GPs aged <50 years who had left the English Medical Performers List in the last 5 years (2009–2014). A total of 143 early GP leavers participated in an online survey, of which 21 took part in recorded telephone interviews. Survey data were analysed using descriptive statistics, and qualitative data using thematic analysis techniques. Results Reasons for leaving were cumulative and multifactorial. Organisational changes to the NHS have led to an increase in administrative tasks and overall workload that is perceived by GP participants to have fundamentally changed the doctor–patient relationship. Lack of time with patients has compromised the ability to practise more patient-centred care, and, with it, GPs’ sense of professional autonomy and values, resulting in diminished job satisfaction. In this context, the additional pressures of increased patient demand and the negative media portrayal left many feeling unsupported and vulnerable to burnout and ill health, and, ultimately, to the decision to leave general practice. Conclusion To improve retention of young GPs, the pace of administrative change needs to be minimised and the time spent by GPs on work that is not face-to-face patient care reduced. PMID:26740606

  6. A mixed methods study of food safety knowledge, practices and beliefs in Hispanic families with young children.

    PubMed

    Stenger, Kristen M; Ritter-Gooder, Paula K; Perry, Christina; Albrecht, Julie A

    2014-12-01

    Children are at a higher risk for foodborne illness. The objective of this study was to explore food safety knowledge, beliefs and practices among Hispanic families with young children (≤10 years of age) living within a Midwestern state. A convergent mixed methods design collected qualitative and quantitative data in parallel. Food safety knowledge surveys were administered (n = 90) prior to exploration of beliefs and practices among six focus groups (n = 52) conducted by bilingual interpreters in community sites in five cities/towns. Descriptive statistics determined knowledge scores and thematic coding unveiled beliefs and practices. Data sets were merged to assess concordance. Participants were female (96%), 35.7 (±7.6) years of age, from Mexico (69%), with the majority having a low education level. Food safety knowledge was low (56% ± 11). Focus group themes were: Ethnic dishes popular, Relating food to illness, Fresh food in home country, Food safety practices, and Face to face learning. Mixed method analysis revealed high self confidence in preparing food safely with low safe food handling knowledge and the presence of some cultural beliefs. On-site Spanish classes and materials were preferred venues for food safety education. Bilingual food safety messaging targeting common ethnic foods and cultural beliefs and practices is indicated to lower the risk of foodborne illness in Hispanic families with young children. PMID:25178898

  7. Spatial dynamics of farming practices in the Seine basin: methods for agronomic approaches on a regional scale.

    PubMed

    Mignolet, C; Schott, C; Benot, M

    2007-04-01

    A research procedure is proposed which aims to analyse the agricultural spatial dynamics during the last thirty years using two levels of organisation of farming activity: the agricultural production system and the cropping system. Based on methods of statistical mapping and data mining, this procedure involves modelling the diversity of production systems and cropping systems (crop successions and sequences of cultural practices for each crop) in the form of classes independently of their localisation within the basin. It identifies homogeneous regions made up of groups of contiguous agricultural districts which exhibit similar combinations of production systems, crop successions or cultural practices during a given period of time. The results show a major increase in arable farms since 1970 at the expense of dairy farms and mixed cropping/livestock. This trend however appeared to be greatly spatially differentiated according to the agricultural districts, since livestock remained important on the edges of the basin, whereas it practically disappeared in its centre. The crop successions practiced in the basin and the cultural practices used on them also appear to be spatially differentiated, although the link to the production systems is not always clear. Thus it appears pertinent to combine the analysis of the two levels of organisation of the agriculture (methods of land use described by the concept of cropping system, and also the production systems into which the cropping systems fit) in the context of an environmental problem. PMID:17316763

  8. Probabilistic Computational Methods in Structural Failure Analysis

    NASA Astrophysics Data System (ADS)

    Krejsa, Martin; Kralik, Juraj

    2015-12-01

    Probabilistic methods are used in engineering where a computational model contains random variables. Each random variable in the probabilistic calculations contains uncertainties. Typical sources of uncertainties are properties of the material and production and/or assembly inaccuracies in the geometry or the environment where the structure should be located. The paper is focused on methods for the calculation of failure probabilities in structural failure and reliability analysis, with special attention to a newly developed probabilistic method: Direct Optimized Probabilistic Calculation (DOProC), which is highly efficient in terms of calculation time and the accuracy of the solution. The novelty of the proposed method lies in an optimized numerical integration that does not require any simulation technique. The algorithm has been implemented in software applications and has been used several times in probabilistic tasks and probabilistic reliability assessments.
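    The core of a simulation-free approach like the one described (direct numerical integration over discretized distributions, rather than Monte Carlo sampling) can be sketched for the textbook failure probability P(R - S < 0) of a resistance R and a load effect S. The normal distributions and bin counts below are illustrative, not taken from the paper.

```python
import bisect
import math

def discretize_normal(mu, sigma, n_bins=2001, span=6.0):
    """Point-mass histogram (value, probability) of a normal variable."""
    lo = mu - span * sigma
    step = 2 * span * sigma / n_bins
    c = step / (sigma * math.sqrt(2 * math.pi))
    return [(lo + (i + 0.5) * step,
             c * math.exp(-((i + 0.5) * step - span * sigma) ** 2
                          / (2 * sigma ** 2)))
            for i in range(n_bins)]

def failure_probability(resistance, load):
    """P(R - S < 0) by direct summation over the two discretized
    distributions -- pure numerical integration, no random sampling."""
    load_sorted = sorted(load)
    values = [v for v, _ in load_sorted]
    tail = [0.0] * (len(load_sorted) + 1)   # tail[i] = P(S in bins i..end)
    for i in range(len(load_sorted) - 1, -1, -1):
        tail[i] = tail[i + 1] + load_sorted[i][1]
    pf = 0.0
    for r, pr in resistance:
        pf += pr * tail[bisect.bisect_right(values, r)]   # P(S > r)
    return pf

R = discretize_normal(10.0, 1.0)   # resistance
S = discretize_normal(6.0, 1.0)    # load effect
pf = failure_probability(R, S)
# Exact result for independent normals: Phi(-(10-6)/sqrt(1+1))
exact = 0.5 * math.erfc((10.0 - 6.0) / math.sqrt(2.0) / math.sqrt(2.0))
print(pf, exact)
```

    For small failure probabilities, this deterministic summation reaches a given accuracy with far fewer operations than crude Monte Carlo, which is the efficiency argument behind optimized-integration methods.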

  9. Text analysis devices, articles of manufacture, and text analysis methods

    DOEpatents

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

    2013-05-28

    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.

  10. The plant volatilome: methods of analysis.

    PubMed

    Bicchi, Carlo; Maffei, Massimo

    2012-01-01

    Analysis of plant volatile organic compounds (VOCs) and essential oils (EOs, collectively called the plant volatilome) is an invaluable technique in plant biology, as it provides the qualitative and quantitative composition of bioactive compounds. From a physiological standpoint, the plant volatilome is involved in some critical processes, namely plant-plant interactions, the signaling between symbiotic organisms, the attraction of pollinating insects, a range of biological activities in mammals, and as an endless source of novel drugs and drug leads. This chapter analyses and discusses the most advanced methods of analysis of the plant volatilome. PMID:22893295

  11. Advanced Analysis Methods in High Energy Physics

    SciTech Connect

    Pushpalatha C. Bhat

    2001-10-03

    During the coming decade, high energy physics experiments at the Fermilab Tevatron and around the globe will use very sophisticated equipment to record unprecedented amounts of data in the hope of making major discoveries that may unravel some of Nature's deepest mysteries. The discovery of the Higgs boson and signals of new physics may be around the corner. The use of advanced analysis techniques will be crucial in achieving these goals. The author discusses some of the novel methods of analysis that could prove to be particularly valuable for finding evidence of any new physics, for improving precision measurements and for exploring parameter spaces of theoretical models.

  12. Understanding Online Teacher Best Practices: A Thematic Analysis to Improve Learning

    ERIC Educational Resources Information Center

    Corry, Michael; Ianacone, Robert; Stella, Julie

    2014-01-01

    The purpose of this study was to examine brick-and-mortar and online teacher best practice themes using thematic analysis and a newly developed theory-based analytic process entitled Synthesized Thematic Analysis Criteria (STAC). The STAC was developed to facilitate the meaningful thematic analysis of research-based best practices of K-12…

  13. Spectroscopic Chemical Analysis Methods and Apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F.; Reid, Ray D.

    2012-01-01

    This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and the ideal wavelengths and sources needed for this analysis. It employs deep ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three achieved goals for this innovation are to reduce the size (under 20 L), reduce the weight [under 100 lb (45 kg)], and reduce the power consumption (under 100 W). This method can be used in microscope or macroscope configurations to provide measurement of Raman and/or native fluorescence emission spectra either by point-by-point measurement, or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations, the method is to produce incoherent radiation, while in other implementations it produces laser radiation. In some variations, this object is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. This instrument irradiates a sample with deep UV radiation, and then uses an improved filter for separating wavelengths to be detected. This provides a multi-stage analysis of the sample.
To avoid the difficulties related to producing deep UV semiconductor sources, a pumping approach has been developed that uses ballistic electron beam injection directly into the active region of a wide bandgap semiconductor material.

  14. Methods to enhance compost practices as an alternative to waste disposal

    SciTech Connect

    Stuckey, H.T.; Hudak, P.F.

    1998-12-31

    Creating practices that are ecologically friendly, economically profitable, and ethically sound is a concept that is slowly beginning to unfold in modern society. In developing such practices, the authors challenge long-lived human behavior patterns and environmental management practices. In this paper, they trace the history of human waste production, describe problems associated with such waste, and explore regional coping mechanisms. Composting projects in north central Texas demonstrate new methods for waste disposal. The authors studied projects conducted by municipalities, schools, agricultural organizations, and individual households. These efforts were examined within the context of regional and statewide solid waste plans. They conclude that: (1) regional composting in north central Texas will substantially reduce the waste stream entering landfills; (2) public education is paramount to establishing alternative waste disposal practices; and (3) new practices for compost will catalyze widespread and efficient production.

  15. Comparison of analysis methods for airway quantification

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.

    2012-03-01

    Diseased airways have been known for several years as a possible contributing factor to airflow limitation in Chronic Obstructive Pulmonary Diseases (COPD). Quantification of disease severity through the evaluation of airway dimensions - wall thickness and lumen diameter - has gained increased attention, thanks to the availability of multi-slice computed tomography (CT). Novel approaches have focused on automated methods of measurement as a faster and more objective means than the visual assessment routinely employed in the clinic. Since the Full-Width Half-Maximum (FWHM) method of airway measurement was introduced two decades ago [1], several new techniques for quantifying airways have been detailed in the literature, but no approach has truly become a standard for such analysis. Our own research group has presented two alternative approaches for determining airway dimensions, one involving a minimum path and the other active contours [2, 3]. With an increasing number of techniques dedicated to the same goal, we decided to take a step back and analyze the differences among these methods. We consequently put to the test our two methods of analysis and the FWHM approach. We first measured a set of 5 airways from a phantom of known dimensions. Then we compared measurements from the three methods to those of two independent readers, performed on 35 airways in 5 patients. We elaborate on the differences of each approach and suggest conclusions as to which could be considered the best one.
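    The FWHM measurement referenced above is straightforward to sketch for a one-dimensional intensity profile: find the peak, then locate the two half-maximum crossings by linear interpolation. A minimal illustration on a synthetic profile (not the phantom data used in the study):

```python
def fwhm(profile):
    """Full-Width Half-Maximum of a 1-D intensity profile (e.g., a ray cast
    across an airway wall in a CT image). Assumes a single peak and that the
    profile drops below half-maximum on both sides."""
    peak, base = max(profile), min(profile)
    half = base + (peak - base) / 2.0
    i_peak = profile.index(peak)

    def crossing(start, step):
        i = start
        while profile[i + step] >= half:
            i += step
        # Interpolate between the last sample at/above half-max (a)
        # and the first sample below it (b).
        a, b = profile[i], profile[i + step]
        return i + step * (a - half) / (a - b)

    return crossing(i_peak, +1) - crossing(i_peak, -1)

# Synthetic wall profile: a symmetric triangular peak.
profile = [0, 0, 1, 2, 3, 4, 3, 2, 1, 0, 0]
print(fwhm(profile))   # half-max level 2 is crossed at indices 3 and 7: 4.0
```

    Real implementations measure such profiles along rays perpendicular to the airway centerline and must additionally handle noise and adjacent structures.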

  16. Forum discussion on probabilistic structural analysis methods

    SciTech Connect

    Rodriguez, E.A.; Girrens, S.P.

    2000-10-01

    The use of Probabilistic Structural Analysis Methods (PSAM) has received much attention over the past several decades due in part to enhanced reliability theories, computational capabilities, and efficient algorithms. The need for this development was already present and waiting at the doorstep. Automotive design and manufacturing has been greatly enhanced because of PSAM and reliability methods, including reliability-based optimization. This demand was also present in the US Department of Energy (DOE) weapons laboratories in support of the overarching national security responsibility of maintaining the nation's nuclear stockpile in a safe and reliable state.

  17. Measurement methods for human exposure analysis.

    PubMed Central

    Lioy, P J

    1995-01-01

    The general methods used to complete measurements of human exposures are identified and illustrations are provided for the cases of indirect and direct methods used for exposure analysis. The application of the techniques for external measurements of exposure, microenvironmental and personal monitors, are placed in the context of the need to test hypotheses concerning the biological effects of concern. The linkage of external measurements to measurements made in biological fluids is explored for a suite of contaminants. This information is placed in the context of the scientific framework used to conduct exposure assessment. Examples are taken from research on volatile organics and for a large scale problem: hazardous waste sites. PMID:7635110

  18. Female genital cutting and other intra-vaginal practices: implications for TwoDay Method use.

    PubMed

    Aksel, Sarp; Sinai, Irit; Yee, Kimberly Aumack

    2012-09-01

    This report examines the implications of female genital cutting and other intra-vaginal practices for offering the TwoDay Method® of family planning. This fertility awareness-based method relies on the identification of cervicovaginal secretions to identify the fertile window. Female genital cutting and traditional vaginal practices, such as the use of desiccants, may affect the presence or absence of secretions and therefore the woman’s perception of her fertility. These issues and their implications for service delivery of the method are discussed. PMID:23016158

  19. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    SciTech Connect

    Ronald Laurids Boring

    2010-11-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  20. Thermal Analysis of AC Contactor Using Thermal Network Finite Difference Analysis Method

    NASA Astrophysics Data System (ADS)

    Niu, Chunping; Chen, Degui; Li, Xingwen; Geng, Yingsan

    To predict the thermal behavior of switchgear quickly, the Thermal Network Finite Difference Analysis method (TNFDA) is adopted in thermal analysis of AC contactor in the paper. The thermal network model is built with nodes, thermal resistors and heat generators, and it is solved using finite difference method (FDM). The main circuit and the control system are connected by thermal resistors network, which solves the problem of multi-sources interaction in the application of TNFDA. The temperature of conducting wires is calculated according to the heat transfer process and the fundamental equations of thermal conduction. It provides a method to solve the problem of boundary conditions in applying the TNFDA. The comparison between the results of TNFDA and measurements shows the feasibility and practicability of the method.
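    A lumped thermal network of the kind described (nodes joined by thermal resistances, with heat generators, solved by finite differences) can be sketched as follows. The three-node topology and all parameter values are invented for illustration, not taken from the paper.

```python
def solve_thermal_network(resistors, sources, t_ambient=20.0, n_nodes=3,
                          dt=0.01, steps=20000, capacity=1.0):
    """Explicit finite-difference solution of a lumped thermal network.
    resistors: dict {(i, j): R} in K/W, with node -1 denoting ambient.
    sources:   dict {i: Q} heat generation in W at node i.
    Steps the node temperatures forward in time until steady state."""
    temps = [t_ambient] * n_nodes
    for _ in range(steps):
        flows = [sources.get(i, 0.0) for i in range(n_nodes)]
        for (i, j), r in resistors.items():
            ti = temps[i] if i >= 0 else t_ambient
            tj = temps[j] if j >= 0 else t_ambient
            q = (ti - tj) / r          # heat flow from node i to node j (W)
            if i >= 0:
                flows[i] -= q
            if j >= 0:
                flows[j] += q
        temps = [t + dt * f / capacity for t, f in zip(temps, flows)]
    return temps

# Hypothetical 3-node model: node 0 (contact, 5 W source), node 1 (terminal),
# node 2 (coil, 2 W source); node -1 is the ambient air.
net = {(0, 1): 2.0, (1, 2): 3.0, (1, -1): 1.0, (2, -1): 4.0}
temps = solve_thermal_network(net, {0: 5.0, 2: 2.0})
print([round(t, 2) for t in temps])
```

    At steady state the node heat balances give T0 = 35.375 °C, T1 = 25.375 °C, T2 = 26.5 °C for these made-up parameters, which the time stepping converges to; the resistor network coupling the main circuit and the control system works the same way, just with more nodes.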

  1. Best Practices in Teaching Statistics and Research Methods in the Behavioral Sciences [with CD-ROM

    ERIC Educational Resources Information Center

    Dunn, Dana S., Ed.; Smith, Randolph A., Ed.; Beins, Barney, Ed.

    2007-01-01

    This book provides a showcase for "best practices" in teaching statistics and research methods in two- and four-year colleges and universities. A helpful resource for teaching introductory, intermediate, and advanced statistics and/or methods, the book features coverage of: (1) ways to integrate these courses; (2) how to promote ethical conduct;…

  3. A Survey of Functional Behavior Assessment Methods Used by Behavior Analysts in Practice

    ERIC Educational Resources Information Center

    Oliver, Anthony C.; Pratt, Leigh A.; Normand, Matthew P.

    2015-01-01

    To gather information about the functional behavior assessment (FBA) methods behavior analysts use in practice, we sent a web-based survey to 12,431 behavior analysts certified by the Behavior Analyst Certification Board. Ultimately, 724 surveys were returned, with the results suggesting that most respondents regularly use FBA methods, especially…

  5. BAROS METHOD CRITICAL ANALYSIS (BARIATRIC ANALYSIS AND REPORTING SYSTEM)

    PubMed Central

    NICARETA, Jean Ricardo; de FREITAS, Alexandre Coutinho Teixeira; NICARETA, Sheyla Maris; NICARETA, Cleiton; CAMPOS, Antonio Carlos Ligocki; NASSIF, Paulo Afonso Nunes; MARCHESINI, João Batista

    2015-01-01

    Introduction: Although it has received several criticisms, BAROS is still considered the most effective method for the global assessment of the surgical treatment of morbid obesity, yet it needs to be updated. Objective: Critical analysis of the constitution and method of BAROS. Method: BAROS was searched as a heading in a literature review using data from the main bariatric surgery journals up to 2009. Results: 121 papers containing criticisms of the BAROS constitution and methodology were found and assessed. The instrument has some failures, and few studies report results on its use, although it is still considered a standard method. Several authors who used it found imperfections in its methodology and suggested changes aimed at improving its acceptance, showing the need to develop new methods to qualify bariatric surgery results. Conclusion: The BAROS constitution has failures and its methodology needs to be updated. PMID:26537280

  6. Finite Volume Methods: Foundation and Analysis

    NASA Technical Reports Server (NTRS)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semiconductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the satisfaction of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
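
    Two of the building blocks named above (a monotone upwind flux and the minmod slope limiter) can be illustrated with a short sketch. Everything below is our own minimal example for the scalar advection equation u_t + a u_x = 0 on a periodic mesh, not code from the article:

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter: zero at extrema, the smaller-magnitude slope otherwise."""
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def fv_step(u, dx, dt, a=1.0):
    """One MUSCL-type finite volume step for u_t + a*u_x = 0 (a > 0, periodic mesh)."""
    # limited slope in each cell, from forward and backward differences
    s = minmod(np.roll(u, -1) - u, u - np.roll(u, 1)) / dx
    # left-side reconstructed state at each right interface
    u_face = u + 0.5 * dx * s
    flux = a * u_face                       # monotone upwind flux for a > 0
    return u - dt / dx * (flux - np.roll(flux, 1))

# Advect a square pulse one step (CFL = a*dt/dx = 0.4).
u0 = np.zeros(50)
u0[10:20] = 1.0
u1 = fv_step(u0, dx=0.1, dt=0.04)
```

The telescoping flux differences make the update locally conservative by construction, which is the property the article emphasizes.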

  7. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)

    2013-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  8. Multicultural Issues in School Psychology Practice: A Critical Analysis

    ERIC Educational Resources Information Center

    Ortiz, Samuel O.

    2006-01-01

    Once thought of largely as a sideline issue, multiculturalism is fast becoming a major topic on the central stage of psychology and practice. That cultural factors permeate the whole of psychological foundations and influence the manner in which the very scope of practice is shaped is undeniable. The rapidly changing face of the U.S. population…

  9. Situational Analysis: Centerless Systems and Human Service Practices

    ERIC Educational Resources Information Center

    Newbury, Janet

    2011-01-01

    Bronfenbrenner's ecological model is a conceptual framework that continues to contribute to human service practices. In the current article, the author describes the possibilities for practice made intelligible by drawing from this framework. She then explores White's "Web of Praxis" model as an important extension of this approach, and proceeds…

  10. Researching "Practiced Language Policies": Insights from Conversation Analysis

    ERIC Educational Resources Information Center

    Bonacina-Pugh, Florence

    2012-01-01

    In language policy research, "policy" has traditionally been conceptualised as a notion separate from that of "practice". In fact, language practices were usually analysed with a view to evaluating whether a policy is being implemented or resisted. Recently, however, Spolsky in ("Language policy". Cambridge University Press, Cambridge, 2004;…

  11. Best practices: applying management analysis of excellence to immunization.

    PubMed

    Wishner, Amy; Aronson, Jerold; Kohrt, Alan; Norton, Gary

    2005-01-01

    The authors applied business management tools to analyze and promote excellence and to evaluate differences between average and above-average immunization performers in private practices. The authors conducted a pilot study of 10 private practices in Pennsylvania using tools common in management to assess practices' organizational climate and managerial style. Authoritative and coaching styles of physician leaders were common to both groups. Managerial styles that emphasized higher levels of clarity and responsibility were evident in the large practices, and rewards and flexibility styles were higher in the small above-average practices. The findings of this pilot study match results seen in high performers in other industries. The study concludes that the authoritative style appears to have the most impact on performance, which has interesting implications for training and behavior change to improve immunization rates, alongside traditional medical interventions. PMID:15921143

  12. An unconventional method of quantitative microstructural analysis

    SciTech Connect

    Rastani, M.

    1995-06-01

    The experiment described here introduces a simple methodology that could replace the time-consuming and expensive conventional methods of metallographic and quantitative analysis of the effect of thermal treatment on microstructure. The method is ideal for the microstructural evaluation of tungsten filaments and other wire samples, such as copper wire, which can be conveniently coiled. Ten such samples were heat treated by ohmic resistance at temperatures expected to span the recrystallization range. After treatment, the samples were evaluated in the elastic recovery test, and a normalized elastic recovery factor was defined in terms of the measured deflections. Experiments showed that the elastic recovery factor depends on the degree of recrystallization; in other words, this factor can be used to determine the fraction of unrecrystallized material. Because the elastic recovery method examines the whole filament rather than just one section through it, as in the metallographic method, it measures the degree of recrystallization more accurately. The method also requires considerably less time and cost than the conventional method.

  13. Data Analysis Methods for Library Marketing

    NASA Astrophysics Data System (ADS)

    Minami, Toshiro; Kim, Eunja

    Our society is rapidly changing into an information society, in which the needs and requests of people for information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information, and to achieve this role libraries have to know the profiles of their patrons. The aim of library marketing is to develop methods for doing so based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. Then we demonstrate its usefulness through some examples of analysis methods applied to the circulation records of Kyushu University and Guacheon Library, and some implications obtained from the results of these methods. Our research is a first step toward a future in which library marketing is an indispensable tool.
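
    As a toy illustration of the kind of circulation-record analysis described, the sketch below counts borrowing frequency per category; the records, category names, and helper function are hypothetical, not data from Kyushu University or Guacheon Library:

```python
from collections import Counter

# Hypothetical circulation records: (patron_id, book_category).
records = [
    ("p1", "databases"), ("p1", "statistics"), ("p2", "databases"),
    ("p3", "databases"), ("p2", "networks"), ("p3", "statistics"),
]

def category_profile(records):
    """Borrowing frequency per category -- a minimal 'library marketing' statistic."""
    return Counter(cat for _, cat in records)

profile = category_profile(records)
```

A real analysis would segment such counts by patron group and time to build the patron profiles the abstract mentions.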

  14. Review of Computational Stirling Analysis Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal-to-electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in current designs could be better understood. However, these engines are difficult to instrument, so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of understanding of Stirling losses may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra Hi-Fi technique, is presented in detail.

  15. Optical methods for the analysis of dermatopharmacokinetics

    NASA Astrophysics Data System (ADS)

    Lademann, Juergen; Weigmann, Hans-Juergen; von Pelchrzim, R.; Sterry, Wolfram

    2002-07-01

    The method of tape stripping in combination with spectroscopic measurements is a simple and noninvasive method for the analysis of the dermatopharmacokinetics of cosmetic products and topically applied drugs. The absorbance at 430 nm was used to characterize the amount of corneocytes on the tape strips, and was compared to the increase in weight of the tapes after removing them from the skin surface. The penetration profiles of two UV filter substances used in sunscreens were determined. The combined method of tape stripping and spectroscopic measurements can also be used to investigate the dermatopharmacokinetics of topically applied drugs passing through the skin. Differences in the penetration profiles of the steroid compound clobetasol, applied at the same concentration in different formulations on the skin, are presented.

  16. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution.

    PubMed

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R

    2015-10-01

    This work is intended to explain the challenges of the fingerprints based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns, but it is not able to suggest specific sources. Not all positive matrix factorization (PMF) determined sources are distinguishable due to the variability of source fingerprints; however, they constitute useful suggestions for inputs to a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). PMID:26208321
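
    The non-Bayesian core of a chemical mass balance can be sketched in a few lines: a measured compound profile is modeled as a linear combination of source fingerprints, and the contributions are recovered by least squares. The fingerprint matrix below is made up for illustration; the Bayesian variant in the abstract additionally models measurement error and fingerprint variability:

```python
import numpy as np

# Hypothetical source fingerprints (columns: traffic, coke oven, coal combustion)
# giving the fractional abundance of four PAH compounds in each source profile.
F = np.array([
    [0.40, 0.10, 0.05],
    [0.30, 0.20, 0.15],
    [0.20, 0.30, 0.30],
    [0.10, 0.40, 0.50],
])

def cmb_contributions(F, c):
    """Ordinary least-squares chemical mass balance: solve F @ s ~= c for the
    source contributions s given measured compound concentrations c."""
    s, *_ = np.linalg.lstsq(F, c, rcond=None)
    return s

# Synthesize a "sediment sample" from known contributions, then recover them.
true_s = np.array([5.0, 2.0, 1.0])
measured = F @ true_s
est = cmb_contributions(F, measured)
shares = est / est.sum()   # fractional source apportionment
```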

  17. Qualitative Analysis of Common Definitions for Core Advanced Pharmacy Practice Experiences

    PubMed Central

    Danielson, Jennifer; Weber, Stanley S.

    2014-01-01

    Objective. To determine how colleges and schools of pharmacy interpreted the Accreditation Council for Pharmacy Education’s (ACPE’s) Standards 2007 definitions for core advanced pharmacy practice experiences (APPEs), and how they differentiated community and institutional practice activities for introductory pharmacy practice experiences (IPPEs) and APPEs. Methods. A cross-sectional, qualitative, thematic analysis was done of survey data obtained from experiential education directors in US colleges and schools of pharmacy. Open-ended responses to invited descriptions of the 4 core APPEs were analyzed using grounded theory to determine common themes. Type of college or school of pharmacy (private vs public) and size of program were compared. Results. Seventy-one schools (72%) with active APPE programs at the time of the survey responded. Lack of strong frequent themes describing specific activities for the acute care/general medicine core APPE indicated that most respondents agreed on the setting (hospital or inpatient) but the student experience remained highly variable. Themes were relatively consistent between public and private institutions, but there were differences across programs of varying size. Conclusion. Inconsistencies existed in how colleges and schools of pharmacy defined the core APPEs as required by ACPE. More specific descriptions of core APPEs would help to standardize the core practice experiences across institutions and provide an opportunity for quality benchmarking. PMID:24954931

  18. Method of analysis and quality-assurance practices for determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry at the U.S. Geological Survey California District Organic Chemistry Laboratory, 1996-99

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Baker, Lucian M.; Kuivila, Kathryn M.

    2000-01-01

    A method of analysis and quality-assurance practices were developed to study the fate and transport of pesticides in the San Francisco Bay-Estuary by the U.S. Geological Survey. Water samples were filtered to remove suspended-particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide and the pesticides were eluted with three cartridge volumes of hexane:diethyl ether (1:1) solution. The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for pesticides ranged from 0.002 to 0.025 microgram per liter for 1-liter samples. Recoveries ranged from 44 to 140 percent for 25 pesticides in samples of organic-free reagent water and Sacramento-San Joaquin Delta and Suisun Bay water fortified at 0.05 and 0.50 microgram per liter. The estimated holding time for pesticides after extraction on C-8 solid-phase extraction cartridges ranged from 10 to 257 days.

  19. A Fourier method for the analysis of exponential decay curves.

    PubMed

    Provencher, S W

    1976-01-01

    A method based on the Fourier convolution theorem is developed for the analysis of data composed of random noise, plus an unknown constant "base line," plus a sum of (or an integral over a continuous spectrum of) exponential decay functions. The Fourier method's usual serious practical limitation of needing high accuracy data over a very wide range is eliminated by the introduction of convergence parameters and a Gaussian taper window. A computer program is described for the analysis of discrete spectra, where the data involves only a sum of exponentials. The program is completely automatic in that the only necessary inputs are the raw data (not necessarily in equal intervals of time); no potentially biased initial guesses concerning either the number or the values of the components are needed. The outputs include the number of components, the amplitudes and time constants together with their estimated errors, and a spectral plot of the solution. The limiting resolving power of the method is studied by analyzing a wide range of simulated two-, three-, and four-component data. The results seem to indicate that the method is applicable over a considerably wider range of conditions than nonlinear least squares or the method of moments. PMID:1244888
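
    A compact way to see the inverse problem this abstract addresses (recovering the components of a sum of exponentials) is Prony's method, which is not Provencher's Fourier approach but solves the same noise-free two-component case exactly. All names and the demo data below are ours:

```python
import numpy as np

def prony_two_exp(y):
    """Fit y_k ~= a1*r1**k + a2*r2**k by Prony's method (assumes two real,
    distinct decay ratios; equally spaced samples)."""
    # A two-exponential sum satisfies the recurrence y_k = p1*y_{k-1} + p2*y_{k-2}.
    A = np.column_stack([y[1:-1], y[:-2]])
    p1, p2 = np.linalg.lstsq(A, y[2:], rcond=None)[0]
    r = np.roots([1.0, -p1, -p2]).real             # per-sample decay ratios
    V = np.vander(r, N=len(y), increasing=True).T  # V[k, i] = r[i]**k
    a = np.linalg.lstsq(V, y, rcond=None)[0]       # amplitudes
    return a, r

# Noise-free demo: amplitudes (2, 1), decay ratios (0.9, 0.6).
k = np.arange(20)
a_fit, r_fit = prony_two_exp(2.0 * 0.9**k + 1.0 * 0.6**k)
```

Like the Fourier method in the abstract, this avoids biased initial guesses; unlike it, it degrades quickly with noise, which is exactly the limitation Provencher's approach targets.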

  20. Degradation of learned skills: Effectiveness of practice methods on visual approach and landing skill retention

    NASA Technical Reports Server (NTRS)

    Sitterley, T. E.; Zaitzeff, L. P.; Berge, W. A.

    1972-01-01

    Flight control and procedural task skill degradation, and the effectiveness of retraining methods were evaluated for a simulated space vehicle approach and landing under instrument and visual flight conditions. Fifteen experienced pilots were trained and then tested after 4 months either without the benefits of practice or with static rehearsal, dynamic rehearsal or with dynamic warmup practice. Performance on both the flight control and procedure tasks degraded significantly after 4 months. The rehearsal methods effectively countered procedure task skill degradation, while dynamic rehearsal or a combination of static rehearsal and dynamic warmup practice was required for the flight control tasks. The quality of the retraining methods appeared to be primarily dependent on the efficiency of visual cue reinforcement.

  1. A numerical method for the stress analysis of stiffened-shell structures under nonuniform temperature distributions

    NASA Technical Reports Server (NTRS)

    Heldenfels, Richard R

    1951-01-01

    A numerical method is presented for the stress analysis of stiffened-shell structures of arbitrary cross section under nonuniform temperature distributions. The method is based on a previously published procedure that is extended to include temperature effects and multicell construction. The application of the method to practical problems is discussed and an illustrative analysis is presented of a two-cell box beam under the combined action of vertical loads and a nonuniform temperature distribution.

  2. Analysis of structural perturbations in systems via cost decomposition methods

    NASA Technical Reports Server (NTRS)

    Skelton, R. E.

    1983-01-01

    It has long been common practice to analyze linear dynamic systems by decomposing the total response in terms of individual contributions which are easier to analyze. Examples of this philosophy include the expansion of transfer functions using: (1) the superposition principle, (2) residue theory and partial fraction expansions, (3) Markov parameters, Hankel matrices, and (4) regular and singular perturbations. This paper summarizes a new and different kind of expansion designed to decompose the norm of the response vector rather than the response vector itself. This is referred to as "cost decomposition" of the system. The notable advantages of this type of decomposition are: (a) easy application to multi-input, multi-output systems, (b) natural compatibility with Linear Quadratic Gaussian theory, (c) applicability to the analysis of more general types of structural perturbations involving inputs, outputs, states, and parameters. Property (c) makes the method suitable for problems in model reduction, measurement/actuator selection, and sensitivity analysis.

  3. Foundational methods for model verification and uncertainty analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Jakeman, A. J.; Croke, B. F.; Guillaume, J. H.; Jakeman, J. D.; Shin, M.

    2013-12-01

    Before embarking on formal methods of uncertainty analysis that may entail unnecessarily restrictive assumptions and sophisticated treatment, prudence dictates exploring one's data, model candidates and applicable objective functions with a mixture of methods as a first step. It seems that there are several foundational methods that warrant more attention in practice and that there is scope for the development of new ones. Ensuing results from a selection of foundational methods may well inform the choice of formal methods and assumptions, or suffice in themselves as an effective appreciation of uncertainty. Through the case of four lumped rainfall-runoff models of varying complexity from several watersheds we illustrate that there are valuable methods, many of them already in open source software, others we have recently developed, which can be invoked to yield valuable insights into model veracity and uncertainty. We show results of using methods of global sensitivity analysis that help: determine whether insensitive parameters impact on predictions and therefore cannot be fixed; and identify which combinations of objective function, dataset and model structure allow insensitive parameters to be estimated. We apply response surface and polynomial chaos methods to yield knowledge of the models' response surfaces and parameter interactions, thereby informing model redesign. A new approach to model structure discrimination is presented based on Pareto methods and cross-validation. It reveals which model structures are acceptable in the sense that they are non-dominated by other structures across calibration and validation periods and across catchments according to specified performance criteria. Finally we present and demonstrate a falsification approach that shows the value of examining scenarios of model structures and parameters to identify any change that might have a specified effect on a prediction.
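
    A minimal sketch of one-at-a-time sensitivity screening, in the spirit of the frugal global sensitivity methods mentioned above (a true Morris design would average elementary effects over random trajectories). The toy model and parameter roles are invented for illustration:

```python
import numpy as np

def elementary_effects(model, x0, delta=1e-2):
    """One-at-a-time elementary effects of `model` around the base point x0."""
    base = model(x0)
    effects = []
    for i in range(len(x0)):
        x = x0.copy()
        x[i] += delta
        effects.append((model(x) - base) / delta)
    return np.array(effects)

# Toy rainfall-runoff-like model: the output depends strongly on p[0], weakly
# on p[1], and not at all on p[2] (an "insensitive" parameter that therefore
# cannot be estimated from this output).
def toy_model(p):
    return 10.0 * p[0] + 0.1 * p[1] ** 2 + 0.0 * p[2]

ee = elementary_effects(toy_model, np.array([1.0, 1.0, 1.0]))
```

Screening like this costs one model run per parameter, which is what makes such diagnostics "foundational" before committing to expensive formal uncertainty analysis.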

  4. A high-efficiency aerothermoelastic analysis method

    NASA Astrophysics Data System (ADS)

    Wan, ZhiQiang; Wang, YaoKun; Liu, YunZhen; Yang, Chao

    2014-06-01

    In this paper, a high-efficiency aerothermoelastic analysis method based on unified hypersonic lifting surface theory is established. The method adopts a two-way coupling form that couples the structure, aerodynamic force, aerodynamic heating, and heat conduction. The aerodynamic force is first calculated based on unified hypersonic lifting surface theory, and the Eckert reference temperature method is then used to solve the temperature field, where the transient heat conduction is solved using Fourier's law and the modal method is used for the aeroelastic correction. Finally, flutter is analyzed based on the p-k method. The aerothermoelastic behavior of a typical hypersonic low-aspect-ratio wing is then analyzed, and the results indicate the following: (1) the combined effects of the aerodynamic load and thermal load deform the wing, and these deformations increase with the flexibility, size, and flight time of the hypersonic aircraft; (2) the effect of heat accumulation should be noted, and therefore the trajectory parameters should be considered in the design of hypersonic flight vehicles to avoid hazardous conditions such as flutter.

  5. Application of Stacking Technique in ANA: Method and Practice with PKU Seismological Array

    NASA Astrophysics Data System (ADS)

    Liu, J.; Tang, Y.; Ning, J.; Chen, Y. J.

    2010-12-01

    Cross correlation of ambient noise records is now routinely used to obtain dispersion curves and then perform seismic tomography; however, little attention has been paid to array techniques. We present a spatial-stacking method to obtain high resolution dispersion curves and demonstrate it with observation data from the PKU seismological array. Empirical Green functions are generally obtained by correlation between two stations, and dispersion curves are then obtained from frequency-time analysis (FTAN). The popular way to obtain high resolution dispersion curves is to use long time records. At the same time, to extract a useful signal, the distance between the two stations must be at least 3 times the longest wavelength, so we need both long time records and appropriately spaced stations. We use a new method, spatial stacking, which allows shorter observation periods and utilizes observations from a group of closely distributed stations to obtain fine dispersion curves. We correlate the observations of every station in the group with those of a far station, and then stack them together. However, owing to the dispersion characteristics of Rayleigh waves, we cannot simply stack them unless the stations in the group lie on a circle centered on the far station. We therefore apply an anti-dispersion correction to the observations of every station in the array before stacking. We tested the method using theoretical seismic surface wave records, both with and without noise, computed with qseis06 by Rongjiang Wang. For three synthetic stations (spaced 1 degree apart) with the same underground structure and no noise, the center station had the same dispersion with and without spatial stacking. When noise was added to the theoretical records, the dispersion curves obtained by our method were much closer to the noise-free dispersion curve than the contaminated ones, showing that the method improves the resolution of the dispersion curve. We then tested the method on real data from the PKU array, whose station interval is about 10 km, together with permanent IRIS stations more than 200 km from the array. First, comparing the stacked correlation results of three consecutive stations with the unstacked ones, we find that the former yield better dispersion-curve resolution. Second, comparing the stacked results with the center station's traditional one-year correlation, we find that the two fit very well.
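
    The core idea, correlating noisy records that share a coherent delayed signal and stacking over many windows to raise the signal-to-noise ratio, can be sketched as follows; the trace length, lag, window count, and noise level are arbitrary illustrative choices, not parameters from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def xcorr(a, b):
    """Full cross-correlation of two equal-length traces."""
    return np.correlate(a, b, mode="full")

# Two "stations" record the same random source, offset by `lag` samples and
# buried in independent noise. Stacking per-window correlations reinforces
# the coherent lag while the noise averages down.
n, lag, windows = 256, 10, 200
stack = np.zeros(2 * n - 1)
for _ in range(windows):
    src = rng.normal(size=n + lag)
    a = src[:n] + 0.5 * rng.normal(size=n)            # station A
    b = src[lag:lag + n] + 0.5 * rng.normal(size=n)   # station B, shifted by `lag`
    stack += xcorr(a, b)
stack /= windows

peak = np.argmax(stack) - (n - 1)   # lag of the stacked correlation peak
```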

  6. Portraits of Practice: A Cross-Case Analysis of Two First-Grade Teachers and Their Grouping Practices

    ERIC Educational Resources Information Center

    Maloch, Beth; Worthy, Jo; Hampton, Angela; Jordan, Michelle; Hungerford-Kresser, Holly; Semingson, Peggy

    2013-01-01

    This interpretive study provides a cross-case analysis of the literacy instruction of two first-grade teachers, with a particular focus on their grouping practices. One key finding was the way in which these teachers drew upon a district-advocated approach for instruction--an approach to guided reading articulated by Fountas and Pinnell (1996) in…

  7. A practical algorithm for static analysis of parallel programs

    SciTech Connect

    McDowell, C.E. )

    1989-06-01

    One approach to analyzing the behavior of a concurrent program requires determining the reachable program states. A program state consists of a set of task states, the values of shared variables used for synchronization, and local variables that derive their values directly from synchronization operations. However, the number of reachable states rises exponentially with the number of tasks and becomes intractable for many concurrent programs. A variation of this approach merges a set of related states into a single virtual state. Using this approach, the analysis of concurrent programs becomes feasible because the number of virtual states is often orders of magnitude smaller than the number of reachable states. This paper presents a method for determining the virtual states that describe the reachable program states, and the resulting reduction in the number of states is analyzed. The algorithms given have been implemented in a static program analyzer for multitasking Fortran, and the results obtained are discussed.
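
    Plain reachable-state enumeration, whose exponential blow-up motivates the virtual-state merging described above, can be sketched for a tiny two-task lock protocol; the program and state encoding are our own illustration, not the paper's algorithm:

```python
from collections import deque

# Tiny two-task program: each task does [acquire, work, release] on one lock.
# A state is (pc1, pc2, lock_holder): pc 0=before acquire, 1=in critical
# section, 2=before release, 3=done; lock_holder 0 means the lock is free.

def _advance(state, task, holder):
    pc1, pc2, _ = state
    return (pc1 + 1, pc2, holder) if task == 1 else (pc1, pc2 + 1, holder)

def step(state):
    pc1, pc2, holder = state
    succs = []
    for task, pc in ((1, pc1), (2, pc2)):
        if pc == 0 and holder == 0:                       # acquire if lock free
            succs.append(_advance(state, task, holder=task))
        elif pc == 1:                                     # work inside section
            succs.append(_advance(state, task, holder=holder))
        elif pc == 2:                                     # release the lock
            succs.append(_advance(state, task, holder=0))
    return succs

def reachable(start=(0, 0, 0)):
    """Breadth-first enumeration of all reachable program states."""
    seen, frontier = {start}, deque([start])
    while frontier:
        for t in step(frontier.popleft()):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return seen

states = reachable()
```

Even here the reachable set is far smaller than the naive product of local states; merging related states into virtual states pushes that reduction much further for realistic programs.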

  8. Simple gas chromatographic method for furfural analysis.

    PubMed

    Gaspar, Elvira M S M; Lopes, João F

    2009-04-01

    A new, simple, gas chromatographic method was developed for the direct analysis of 5-hydroxymethylfurfural (5-HMF), 2-furfural (2-F) and 5-methylfurfural (5-MF) in liquid and water soluble foods, using direct immersion SPME coupled to GC-FID and/or GC-TOF-MS. The fiber (DVB/CAR/PDMS) conditions were optimized: pH effect, temperature, adsorption and desorption times. The method is simple and accurate (RSD<8%), showed good recoveries (77-107%) and good limits of detection (GC-FID: 1.37 microgL(-1) for 2-F, 8.96 microgL(-1) for 5-MF, 6.52 microgL(-1) for 5-HMF; GC-TOF-MS: 0.3, 1.2 and 0.9 ngmL(-1) for 2-F, 5-MF and 5-HMF, respectively). It was applied to different commercial food matrices: honey, white, demerara, brown and yellow table sugars, and white and red balsamic vinegars. This one-step, sensitive and direct method for the analysis of furfurals will contribute to characterise and quantify their presence in the human diet. PMID:18976770

  9. Characterization of polarization-independent phase modulation method for practical plug and play quantum cryptography

    NASA Astrophysics Data System (ADS)

    Kwon, Osung; Lee, Min-Soo; Woo, Min Ki; Park, Byung Kwon; Kim, Il Young; Kim, Yong-Su; Han, Sang-Wook; Moon, Sung

    2015-12-01

    We characterized a polarization-independent phase modulation method, called double phase modulation, for a practical plug and play quantum key distribution (QKD) system. After investigating the theoretical background, we applied the method to a practical QKD system and characterized its performance by comparing single phase modulation (SPM) with double phase modulation. Consequently, we obtained repeatable and accurate phase modulation, confirmed by high visibility single photon interference even for input signals with arbitrary polarization. Further, the results show that only 80% of the bias voltage required for single phase modulation is needed to obtain the target amount of phase modulation.
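
    The "high visibility" criterion can be quantified with the standard fringe visibility V = (Imax - Imin) / (Imax + Imin) of a phase scan; the interferogram below is synthetic, not data from the QKD system:

```python
import numpy as np

def visibility(intensities):
    """Fringe visibility V = (Imax - Imin) / (Imax + Imin) of a phase scan."""
    i_max, i_min = intensities.max(), intensities.min()
    return (i_max - i_min) / (i_max + i_min)

# Ideal interferogram with visibility 0.98 (illustrative values only).
phase = np.linspace(0.0, 2.0 * np.pi, 101)
fringe = 0.5 * (1.0 + 0.98 * np.cos(phase))
v = visibility(fringe)
```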

  10. Finite element methods for integrated aerodynamic heating analysis

    NASA Technical Reports Server (NTRS)

    Peraire, J.

    1990-01-01

    Over the past few years finite element based procedures for the solution of high speed viscous compressible flows were developed. The objective of this research is to build upon the finite element concepts which have already been demonstrated and to develop these ideas to produce a method which is applicable to the solution of large scale practical problems. The problems of interest range from three dimensional full vehicle Euler simulations to local analysis of three-dimensional viscous laminar flow. Transient Euler flow simulations involving moving bodies are also to be included. An important feature of the research is to be the coupling of the flow solution methods with thermal/structural modeling techniques to provide an integrated fluid/thermal/structural modeling capability. The progress made towards achieving these goals during the first twelve month period of the research is presented.

  11. Practical considerations for volumetric wear analysis of explanted hip arthroplasties

    PubMed Central

    Langton, D. J.; Sidaginamale, R. P.; Holland, J. P.; Deehan, D.; Joyce, T. J.; Nargol, A. V. F.; Meek, R. D.; Lord, J. K.

    2014-01-01

    Objectives Wear debris released from bearing surfaces has been shown to provoke negative immune responses in the recipient. Excessive wear has been linked to early failure of prostheses. Analysis using coordinate measuring machines (CMMs) can provide estimates of total volumetric material loss of explanted prostheses and can help to understand device failure. The accuracy of volumetric testing has been debated, with some investigators stating that only protocols involving hundreds of thousands of measurement points are sufficient. We looked to examine this assumption and to apply the findings to the clinical arena. Methods We examined the effects on the calculated material loss from a ceramic femoral head when different CMM scanning parameters were used. Calculated wear volumes were compared with gold standard gravimetric tests in a blinded study. Results Various scanning parameters including point pitch, maximum point to point distance, the number of scanning contours or the total number of points had no clinically relevant effect on volumetric wear calculations. Gravimetric testing showed that material loss can be calculated to provide clinically relevant degrees of accuracy. Conclusions Prosthetic surfaces can be analysed accurately and rapidly with currently available technologies. Given these results, we believe that routine analysis of explanted hip components would be a feasible and logical extension to National Joint Registries. Cite this article: Bone Joint Res 2014;3:60–8. PMID:24627327
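
    One simple way to turn CMM radial deviations into a volumetric wear estimate is to integrate the deviations over the measured solid angle; the grid density, nominal radius, and deviation field below are illustrative assumptions, not the authors' protocol:

```python
import numpy as np

def wear_volume(theta, phi, dev, r=14.0):
    """Approximate material loss (mm^3) from radial deviations dev[i, j] (mm)
    measured on a sphere of nominal radius r (mm) at polar angles theta[i] and
    azimuths phi[j], using the patch approximation dV ~= dev * r^2 * sin(theta)
    * dtheta * dphi on a uniform grid."""
    dtheta = theta[1] - theta[0]
    dphi = phi[1] - phi[0]
    T, _ = np.meshgrid(theta, phi, indexing="ij")
    return float(np.sum(dev * r**2 * np.sin(T) * dtheta * dphi))

# Sanity check: a uniform 10-micron deviation over the whole sphere should
# integrate to approximately 4*pi*r^2 * 0.010.
theta = np.linspace(0.0, np.pi, 200)
phi = np.linspace(0.0, 2.0 * np.pi, 400)
dev = np.full((200, 400), 0.010)
v_est = wear_volume(theta, phi, dev)
```

This kind of closed-form sanity check mirrors the paper's finding that modest scanning densities already give clinically relevant accuracy.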

  12. Practical post-calibration uncertainty analysis: Yucca Mountain, Nevada, USA

    NASA Astrophysics Data System (ADS)

    James, S. C.; Doherty, J.; Eddebbarh, A.

    2009-12-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is “calibrated.” Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the US’s proposed site for disposal of high-level radioactive waste. Both of these types of uncertainty analysis are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction’s uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This paper applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. Furthermore, Monte Carlo simulations confirm that hydrogeologic units thought to be flow barriers have probability distributions skewed toward lower permeabilities.
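
    The linear analysis summarized above rests on first-order error propagation: with observation sensitivities J, observation error covariance Q, and prediction sensitivities y, the linearised prediction variance is y^T (J^T Q^-1 J)^-1 y. The sketch below works out this algebra for a hypothetical two-parameter model with independent observations; it illustrates the formula only and is not the Yucca Mountain model or the software used in the study.

```python
def prediction_variance(J, obs_var, y):
    """First-order (linear) predictive uncertainty for a 2-parameter model.

    J       : list of [dh_i/dp1, dh_i/dp2] observation sensitivities
    obs_var : list of observation error variances (diagonal Q assumed)
    y       : [dz/dp1, dz/dp2] sensitivity of the prediction z to the parameters

    Returns y^T (J^T Q^-1 J)^-1 y, the linearised variance of the
    prediction given the information in the observations.
    """
    # Normal matrix N = J^T Q^-1 J (2x2, symmetric)
    n11 = n12 = n22 = 0.0
    for (j1, j2), v in zip(J, obs_var):
        w = 1.0 / v
        n11 += w * j1 * j1
        n12 += w * j1 * j2
        n22 += w * j2 * j2
    det = n11 * n22 - n12 * n12
    # Inverse of N = posterior parameter covariance
    c11, c12, c22 = n22 / det, -n12 / det, n11 / det
    return (y[0] * (c11 * y[0] + c12 * y[1])
            + y[1] * (c12 * y[0] + c22 * y[1]))

# Two independent observations, each informative about one parameter:
J = [[1.0, 0.0], [0.0, 1.0]]
var = prediction_variance(J, [0.04, 0.09], [1.0, 1.0])
print(var)  # ~ 0.13 (= 0.04 + 0.09)
```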

  13. A survey of functional behavior assessment methods used by behavior analysts in practice.

    PubMed

    Oliver, Anthony C; Pratt, Leigh A; Normand, Matthew P

    2015-12-01

    To gather information about the functional behavior assessment (FBA) methods behavior analysts use in practice, we sent a web-based survey to 12,431 behavior analysts certified by the Behavior Analyst Certification Board. Ultimately, 724 surveys were returned, with the results suggesting that most respondents regularly use FBA methods, especially descriptive assessments. Moreover, the data suggest that the majority of students are being formally taught about the various FBA methods and that educators are emphasizing the range of FBA methods in their teaching. However, less than half of the respondents reported using functional analyses in practice, although many considered descriptive assessments and functional analyses to be the most useful FBA methods. Most respondents reported using informant and descriptive assessments more frequently than functional analyses, and a majority of respondents indicated that they "never" or "almost never" used functional analyses to identify the function of behavior. PMID:26411336

  14. Practical analysis of specificity-determining residues in protein families.

    PubMed

    Chagoyen, Mónica; García-Martín, Juan A; Pazos, Florencio

    2016-03-01

    Determining the residues that are important for the molecular activity of a protein is a topic of broad interest in biomedicine and biotechnology. This knowledge can help in understanding the protein's molecular mechanism as well as in fine-tuning its natural function, potentially with biotechnological or therapeutic implications. Some protein residues are essential for the function common to all members of a family of proteins, while others explain the particular specificities of certain subfamilies (such as binding to different substrates or cofactors, or distinct binding affinities). Owing to the difficulty of determining them experimentally, a number of computational methods have been developed to detect these functional residues, generally known as 'specificity-determining positions' (or SDPs), from a collection of homologous protein sequences. These methods are mature enough to be used routinely by molecular biologists to direct experiments aimed at gaining insight into the functional specificity of a family of proteins and eventually modifying it. In this review, we summarize some of the recent discoveries achieved through computational SDP identification in a number of relevant protein families, as well as the main approaches and software tools available to perform this type of analysis. PMID:26141829

  15. New approaches to fertility awareness-based methods: incorporating the Standard Days and TwoDay Methods into practice.

    PubMed

    Germano, Elaine; Jennings, Victoria

    2006-01-01

    Helping clients select and use appropriate family planning methods is a basic component of midwifery care. Many women prefer nonhormonal, nondevice methods, and may be interested in methods that involve understanding their natural fertility. Two new fertility awareness-based methods, the Standard Days Method and the TwoDay Method, meet the need for effective, easy-to-provide, easy-to-use approaches. The Standard Days Method is appropriate for women with most menstrual cycles between 26 and 32 days long. Women using this method are taught to avoid unprotected intercourse on potentially fertile days 8 through 19 of their cycles to prevent pregnancy. They use CycleBeads, a color-coded string of beads representing the menstrual cycle, to monitor their cycle days and cycle lengths. The Standard Days Method is more than 95% effective with correct use. The TwoDay Method is based on the presence or absence of cervical secretions to identify fertile days. To use this method, women are taught to note each day whether they have secretions. If they had secretions on the current day or the previous day, they consider themselves fertile. The TwoDay Method is 96% effective with correct use. Both methods fit well into midwifery practice. PMID:17081938
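
    The decision rules of both methods are simple enough to state as code. The sketch below encodes the fertile window of days 8 through 19 (Standard Days Method) and the today-or-yesterday secretions rule (TwoDay Method) exactly as described above; it is an illustration of the published rules, not clinical software.

```python
def standard_days_fertile(cycle_day):
    """Standard Days Method: days 8-19 of the cycle are treated as
    potentially fertile (for women whose cycles run 26-32 days)."""
    return 8 <= cycle_day <= 19

def two_day_fertile(secretions_today, secretions_yesterday):
    """TwoDay Method: the day is considered fertile if cervical
    secretions were noticed today or yesterday."""
    return secretions_today or secretions_yesterday

print(standard_days_fertile(7))     # False: before the fertile window
print(standard_days_fertile(8))     # True: first potentially fertile day
print(two_day_fertile(False, True)) # True: secretions noticed yesterday
```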

  16. Thermal Analysis Methods for Aerobraking Heating

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.; Gasbarre, Joseph F.; Dec, John A.

    2005-01-01

    As NASA begins exploration of other planets, a method of non-propulsively slowing vehicles at the planet, aerobraking, may become a valuable technique for managing vehicle design mass and propellant. An example of this is Mars Reconnaissance Orbiter (MRO), which will launch in late 2005 and reach Mars in March of 2006. In order to save propellant, MRO will use aerobraking to modify the initial orbit at Mars. The spacecraft will dip into the atmosphere briefly on each orbit, and during the drag pass, the atmospheric drag on the spacecraft will slow it, thus lowering the orbit apoapsis. The largest area on the spacecraft, and that most affected by the heat generated during the aerobraking process, is the solar arrays. A thermal analysis of the solar arrays was conducted at NASA Langley, to simulate their performance throughout the entire roughly 6-month period of aerobraking. Several interesting methods were used to make this analysis more rapid and robust. Two separate models were built for this analysis, one in Thermal Desktop for radiation and orbital heating analysis, and one in MSC.Patran for thermal analysis. The results from the radiation model were mapped in an automated fashion to the Patran thermal model that was used to analyze the thermal behavior during the drag pass. A high degree of automation in file manipulation as well as other methods for reducing run time were employed, since toward the end of the aerobraking period the orbit period is short, and in order to support flight operations the runs must be computed rapidly. All heating within the Patran Thermal model was combined in one section of logic, such that data mapped from the radiation model and aeroheating model, as well as skin temperature effects on the aeroheating and surface radiation, could be incorporated easily. This approach calculates the aeroheating at any given node, based on its position and temperature as well as the density and velocity at that trajectory point. Run times on several different processors, computer hard drives, and operating systems (Windows versus Linux) were evaluated.
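
    The per-node aeroheating logic described above can be sketched with a generic free-molecular heating estimate of the form q = 0.5*alpha*rho*v^3 balanced against surface re-radiation. The coefficient values and the simple energy balance below are illustrative assumptions, not the actual MRO aeroheating model or its correlations.

```python
STEFAN_BOLTZMANN = 5.670e-8  # W/(m^2 K^4)

def net_aeroheating(rho, v, T_surf, accommodation=0.9, emissivity=0.8):
    """Net heat flux (W/m^2) on a surface element during a drag pass.

    Incident free-molecular heating is estimated as 0.5*alpha*rho*v^3
    (a common aerobraking approximation; the accommodation and
    emissivity values here are illustrative, not MRO's actual model),
    less the flux re-radiated at the current surface temperature.
    """
    q_in = 0.5 * accommodation * rho * v**3
    q_out = emissivity * STEFAN_BOLTZMANN * T_surf**4
    return q_in - q_out

# Representative Mars aerobraking pass: rho ~ 6e-8 kg/m^3, v ~ 3.4 km/s
q = net_aeroheating(6e-8, 3400.0, 300.0)
print(q)  # ~ 693.8 W/m^2 net into the panel
```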

  17. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  18. Apparatus and method for fluid analysis

    DOEpatents

    Wilson, Bary W.; Peters, Timothy J.; Shepard, Chester L.; Reeves, James H.

    2004-11-02

    The present invention is an apparatus and method for analyzing a fluid used in a machine or in an industrial process line. The apparatus has at least one meter placed proximate the machine or process line and in contact with the machine or process fluid for measuring at least one parameter related to the fluid. The at least one parameter is a standard laboratory analysis parameter. The at least one meter includes but is not limited to viscometer, element meter, optical meter, particulate meter, and combinations thereof.

  19. Apparatus And Method For Fluid Analysis

    DOEpatents

    Wilson, Bary W.; Peters, Timothy J.; Shepard, Chester L.; Reeves, James H.

    2003-05-13

    The present invention is an apparatus and method for analyzing a fluid used in a machine or in an industrial process line. The apparatus has at least one meter placed proximate the machine or process line and in contact with the machine or process fluid for measuring at least one parameter related to the fluid. The at least one parameter is a standard laboratory analysis parameter. The at least one meter includes but is not limited to viscometer, element meter, optical meter, particulate meter, and combinations thereof.

  20. Scanning methods applied to bitemark analysis

    NASA Astrophysics Data System (ADS)

    Bush, Peter J.; Bush, Mary A.

    2010-06-01

    The 2009 National Academy of Sciences report on forensics focused criticism on pattern evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified based on dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutia and to investigate replication of detail in human skin. Results indicated that, as a viscoelastic substrate, skin effectively reduces the resolution of measurement of dental detail. The conclusions indicate that individualization statements should be made with caution.

  1. Method and apparatus for simultaneous spectroelectrochemical analysis

    DOEpatents

    Chatterjee, Sayandev; Bryan, Samuel A; Schroll, Cynthia A; Heineman, William R

    2013-11-19

    An apparatus and method of simultaneous spectroelectrochemical analysis is disclosed. A transparent surface is provided. An analyte solution on the transparent surface is contacted with a working electrode and at least one other electrode. Light from a light source is focused on either a surface of the working electrode or the analyte solution. The light reflected from either the surface of the working electrode or the analyte solution is detected. The potential of the working electrode is adjusted, and spectroscopic changes of the analyte solution that occur with changes in thermodynamic potentials are monitored.

  2. Numerical analysis method for linear induction machines.

    NASA Technical Reports Server (NTRS)

    Elliott, D. G.

    1972-01-01

    A numerical analysis method has been developed for linear induction machines such as liquid metal MHD pumps and generators and linear motors. Arbitrary phase currents or voltages can be specified and the moving conductor can have arbitrary velocity and conductivity variations from point to point. The moving conductor is divided into a mesh and coefficients are calculated for the voltage induced at each mesh point by unit current at every other mesh point. Combining the coefficients with the mesh resistances yields a set of simultaneous equations which are solved for the unknown currents.
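
    Combining the induced-voltage coefficients with the mesh resistances, as described above, yields a dense linear system for the unknown currents, which can be solved by standard elimination. The sketch below shows a generic Gaussian-elimination solver applied to a hypothetical two-mesh system; the coefficient values are illustrative, and a real induction-machine analysis would involve a much larger, complex-valued system.

```python
def solve_linear(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.

    In the mesh formulation described above, A combines the mesh
    resistances with the induced-voltage coefficients, b holds the
    applied voltages, and x is the vector of unknown mesh currents.
    """
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix
    for col in range(n):
        # Partial pivoting for numerical stability
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

# Hypothetical 2-mesh system: diagonal resistances plus mutual coupling.
A = [[2.0, 1.0],
     [1.0, 3.0]]
currents = solve_linear(A, [5.0, 10.0])
print(currents)  # [1.0, 3.0]
```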

  3. Text analysis devices, articles of manufacture, and text analysis methods

    DOEpatents

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

    2015-03-31

    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes a display configured to depict visible images, and processing circuitry coupled with the display and wherein the processing circuitry is configured to access a first vector of a text item and which comprises a plurality of components, to access a second vector of the text item and which comprises a plurality of components, to weight the components of the first vector providing a plurality of weighted values, to weight the components of the second vector providing a plurality of weighted values, and to combine the weighted values of the first vector with the weighted values of the second vector to provide a third vector.
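
    The claimed vector combination is straightforward: weight the components of each vector, then sum component-wise to form the third vector. A minimal sketch with hypothetical weights and values (the patent abstract does not prescribe a particular weighting scheme):

```python
def combine_weighted(v1, w1, v2, w2):
    """Weight the components of two vector representations of a text
    item, then combine the weighted values component-wise into a third
    vector (the operation described in the claims above)."""
    weighted1 = [x * w for x, w in zip(v1, w1)]
    weighted2 = [x * w for x, w in zip(v2, w2)]
    return [a + b for a, b in zip(weighted1, weighted2)]

# Hypothetical per-component weights for two feature vectors:
v3 = combine_weighted([1.0, 2.0], [0.5, 0.5], [4.0, 6.0], [0.25, 0.25])
print(v3)  # [1.5, 2.5]
```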

  4. Internet Practices of Certified Rehabilitation Counselors and Analysis of Guidelines for Ethical Internet Practices

    ERIC Educational Resources Information Center

    Lehmann, Ilana S.; Crimando, William

    2011-01-01

    The Internet has become an integral part of the practice of rehabilitation counseling. To identify potential ethical issues regarding the use of the Internet by counselors, two studies were conducted. In Study 1, we surveyed a national sample of rehabilitation counselors regarding their use of technology in their work and home settings. Results…

  5. Selective spectroscopic methods for water analysis

    SciTech Connect

    Vaidya, B.

    1997-06-24

    This dissertation explores the development of several types of spectroscopic methods for the analysis of water. Methods for the determination of some of the most important properties of water, such as pH, metal ion content, and chemical oxygen demand, are investigated in detail. This report contains a general introduction to the subject and the conclusions. Four chapters and an appendix have been processed separately. They are: chromogenic and fluorogenic crown ether compounds for the selective extraction and determination of Hg(II); selective determination of cadmium in water using a chromogenic crown ether in a mixed micellar solution; reduction of chloride interference in chemical oxygen demand determination without using mercury salts; structural orientation patterns for a series of anthraquinone sulfonates adsorbed at an aminophenol thiolate monolayer chemisorbed at gold; and the role of chemically modified surfaces in the construction of miniaturized analytical instrumentation.

  6. Effectiveness of a Motivation and Practical Skills Development Methods on the Oral Hygiene of Orphans Children in Kaunas, Lithuania

    PubMed Central

    Narbutaite, Julija

    2015-01-01

    ABSTRACT Objectives The aim of this study was to evaluate the effect of motivation and practical skills development methods on the oral hygiene of orphans. Material and Methods Sixty-eight orphans aged between 7 and 17 years from two orphanages in Kaunas were divided into two groups: practical application group and motivation group. Children were clinically examined by determining their oral hygiene status using the Silness-Löe plaque index. A questionnaire was used to estimate oral hygiene knowledge and practices at baseline and after 3 months. Statistical analysis included: Chi-square test (χ2), Fisher’s exact test, Student’s t-test, nonparametric Mann-Whitney test, Spearman’s rho correlation coefficient and Kappa coefficient. Results All children had plaque on at least one tooth in both groups: motivation 1.14 (SD 0.51), practical application 1.08 (SD 0.4) (P = 0.58). Girls in both groups showed significantly better oral hygiene than boys (P < 0.001). After the 3-month educational program, oral hygiene status improved significantly in both groups: 0.4 (SD 0.35) (P < 0.001). Significantly better oral hygiene was determined in the practical application group, 0.19 (SD 0.27), in comparison with the motivation group, 0.55 (SD 0.32) (P < 0.001). Comparison of the first and second questionnaire surveys showed a statistically significant decline in soft drink use in both groups (P = 0.004). Conclusions Educational programs are effective in improving oral hygiene, especially when they’re based on practical skills training. PMID:26539284

  7. Putting social impact assessment to the test as a method for implementing responsible tourism practice

    SciTech Connect

    McCombes, Lucy; Vanclay, Frank; Evers, Yvette

    2015-11-15

    The discourse on the social impacts of tourism needs to shift from the current descriptive critique of tourism to considering what can be done in actual practice to embed the management of tourism's social impacts into the existing planning, product development and operational processes of tourism businesses. A pragmatic approach for designing research methodologies, social management systems and initial actions, which is shaped by the real world operational constraints and existing systems used in the tourism industry, is needed. Our pilot study with a small Bulgarian travel company put social impact assessment (SIA) to the test to see if it could provide this desired approach and assist in implementing responsible tourism development practice, especially in small tourism businesses. Our findings showed that our adapted SIA method has value as a practical method for embedding a responsible tourism approach. While there were some challenges, SIA proved to be effective in assisting the staff of our test case tourism business to better understand their social impacts on their local communities and to identify actions to take. - Highlights: • Pragmatic approach is needed for the responsible management of social impacts of tourism. • Our adapted Social Impact Assessment (SIA) method has value as a practical method. • SIA can be embedded into tourism businesses’ existing ‘ways of doing things’. • We identified challenges and ways to improve our method to better suit the small tourism business context.

  8. Common Goals for the Science and Practice of Behavior Analysis: A Response to Critchfield

    ERIC Educational Resources Information Center

    Schneider, Susan M.

    2012-01-01

    In his scholarly and thoughtful article, "Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis," Critchfield (2011) discussed the science-practice frictions to be expected in any professional organization that attempts to combine these interests. He suggested that the Association for Behavior Analysis

  9. Translating Evidence Into Practice via Social Media: A Mixed-Methods Study

    PubMed Central

    Tunnecliff, Jacqueline; Morgan, Prue; Gaida, Jamie E; Clearihan, Lyn; Sadasivan, Sivalal; Davies, David; Ganesh, Shankar; Mohanty, Patitapaban; Weiner, John; Reynolds, John; Ilic, Dragan

    2015-01-01

    Background Approximately 80% of research evidence relevant to clinical practice never reaches the clinicians delivering patient care. A key barrier for the translation of evidence into practice is the limited time and skills clinicians have to find and appraise emerging evidence. Social media may provide a bridge between health researchers and health service providers. Objective The aim of this study was to determine the efficacy of social media as an educational medium to effectively translate emerging research evidence into clinical practice. Methods The study used a mixed-methods approach. Evidence-based practice points were delivered via social media platforms. The primary outcomes of attitude, knowledge, and behavior change were assessed using a preintervention/postintervention evaluation, with qualitative data gathered to contextualize the findings. Results Data were obtained from 317 clinicians from multiple health disciplines, predominantly from the United Kingdom, Australia, the United States, India, and Malaysia. The participants reported an overall improvement in attitudes toward social media for professional development (P<.001). The knowledge evaluation demonstrated a significant increase in knowledge after the training (P<.001). The majority of respondents (136/194, 70.1%) indicated that the education they had received via social media had changed the way they practice, or intended to practice. Similarly, a large proportion of respondents (135/193, 69.9%) indicated that the education they had received via social media had increased their use of research evidence within their clinical practice. Conclusions Social media may be an effective educational medium for improving knowledge of health professionals, fostering their use of research evidence, and changing their clinical behaviors by translating new research evidence into clinical practice. PMID:26503129

  10. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in...

  11. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 1 2011-04-01 2011-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in...

  12. A Simplified Mesh Deformation Method Using Commercial Structural Analysis Software

    NASA Technical Reports Server (NTRS)

    Hsu, Su-Yuen; Chang, Chau-Lyan; Samareh, Jamshid

    2004-01-01

    Mesh deformation in response to redefined or moving aerodynamic surface geometries is a frequently encountered task in many applications. Most existing methods are either mathematically too complex or computationally too expensive for use in practical design and optimization. We propose a simplified mesh deformation method based on linear elastic finite element analyses that can be easily implemented by using commercially available structural analysis software. Using a prescribed displacement at the mesh boundaries, a simple structural analysis is constructed based on a spatially varying Young's modulus to move the entire mesh in accordance with the surface geometry redefinitions. A variety of surface movements, such as translation, rotation, or incremental surface reshaping that often takes place in an optimization procedure, may be handled by the present method. We describe the numerical formulation and implementation using the NASTRAN software in this paper. The use of commercial software bypasses tedious reimplementation and takes advantage of the computational efficiency offered by the vendor. A two-dimensional airfoil mesh and a three-dimensional aircraft mesh were used as test cases to demonstrate the effectiveness of the proposed method. Euler and Navier-Stokes calculations were performed for the deformed two-dimensional meshes.
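
    A one-dimensional analogue conveys the core idea: treat each mesh cell as a linear spring whose stiffness plays the role of the local Young's modulus, prescribe the boundary displacement, and let interior nodes settle by force balance. Cells given higher stiffness (e.g. near a viscous boundary layer) absorb less of the deformation. This sketch is a simplified illustration, not the paper's NASTRAN-based formulation.

```python
def deform_mesh_1d(stiffness, boundary_disp):
    """Displace the nodes of a 1-D mesh in response to a prescribed
    boundary displacement, treating each cell as a linear spring
    (a 1-D analogue of the elasticity-based deformation above).

    stiffness     : spring constant of each cell (higher = stiffer,
                    i.e. a larger local Young's modulus)
    boundary_disp : displacement applied at node 0; the far end is fixed

    For springs in series, force balance means the displacement drops
    in proportion to accumulated compliance (1/k), so stiff cells near
    the moving surface deform least and keep their shape.
    """
    compliances = [1.0 / k for k in stiffness]
    total = sum(compliances)
    disp = [boundary_disp]
    acc = 0.0
    for c in compliances:
        acc += c
        disp.append(boundary_disp * (1.0 - acc / total))
    return disp

# Stiff cell next to the moving surface, softer cells farther away:
print(deform_mesh_1d([10.0, 1.0, 1.0], 1.0))
```

Note how the node adjacent to the stiff first cell moves almost as much as the boundary itself, preserving the near-surface cell.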

  13. Analysis of nonstandard and home-made explosives and post-blast residues in forensic practice

    NASA Astrophysics Data System (ADS)

    Kotrlý, Marek; Turková, Ivana

    2014-05-01

    Nonstandard and home-made explosives may constitute a considerable threat as well as a potential material for terrorist activities. Mobile analytical devices, particularly Raman and FTIR spectrometers, are used for the initial detection. Various sorts of phlegmatizers (moderants) used to decrease the sensitivity of explosives were tested; some kinds of low-viscosity lubricants yielded very good results. If the character of the substance allows it, phlegmatized samples are taken in the amount of approx. 0.3 g for laboratory analysis. Various separation methods and methods of concentration of samples from post-blast scenes were tested. A wide range of methods is used for the laboratory analysis. XRD techniques, capable of direct phase identification of crystalline substances, notably in mixtures, have proved highly effective in practice for inorganic and organic phases. SEM-EDS/WDS methods are standardly employed for the inorganic phase. In analysing post-blast residues, techniques allowing analysis at the level of individual particles, rather than the overall composition of a mixed sample, are very important.

  14. A Mixed Methods Content Analysis of the Research Literature in Science Education

    ERIC Educational Resources Information Center

    Schram, Asta B.

    2014-01-01

    In recent years, more and more researchers in science education have been turning to the practice of combining qualitative and quantitative methods in the same study. This approach of using mixed methods creates possibilities to study the various issues that science educators encounter in more depth. In this content analysis, I evaluated 18…

  15. Stirling Analysis Comparison of Commercial Versus High-Order Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako

    2005-01-01

    Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model, although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.

  16. Stirling Analysis Comparison of Commercial vs. High-Order Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako

    2007-01-01

    Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.

  17. Impact of pedagogical method on Brazilian dental students' waste management practice.

    PubMed

    Victorelli, Gabriela; Flório, Flávia Martão; Ramacciato, Juliana Cama; Motta, Rogério Heládio Lopes; de Souza Fonseca Silva, Almenara

    2014-11-01

    The purpose of this study was to conduct a qualitative analysis of waste management practices among a group of Brazilian dental students (n=64) before and after implementing two different pedagogical methods: 1) the students attended a two-hour lecture based on World Health Organization standards; and 2) the students applied the lessons learned in an organized group setting aimed toward raising their awareness about socioenvironmental issues related to waste. All eligible students participated, and the students' learning was evaluated through their answers to a series of essay questions, which were quantitatively measured. Afterwards, the impact of the pedagogical approaches was compared by means of qualitative categorization of wastes generated in clinical activities. Waste categorization was performed for a period of eight consecutive days, both before and thirty days after the pedagogical strategies. In the written evaluation, 80 to 90 percent of the students' answers were correct. The qualitative assessment revealed a high frequency of incorrect waste disposal with a significant increase of incorrect disposal inside general and infectious waste containers (p<0.05). Although the students' theoretical learning improved, it was not enough to change behaviors established by cultural values or to encourage the students to adequately segregate and package waste material. PMID:25362694

  18. Practical method for evaluating the visibility of moire patterns for CRT design

    NASA Astrophysics Data System (ADS)

    Shiramatsu, Naoki; Tanigawa, Masashi; Iwata, Shuji

    1995-04-01

    The high resolution CRT displays used for computer monitors and high performance TVs often produce a pattern of bright and dark stripes on the screen called a moire pattern. The elimination of the moire is an important consideration in CRT design. The objective of this study is to provide a practical method for estimating and evaluating a moire pattern, taking into account its visibility to human vision. On the basis of a mathematical model of moire generation, precise values of the period and intensity of a moire are calculated from the actual data of the electron beam profile and the transmittance distribution of the apertures of the shadow mask. The visibility of the moire is evaluated by plotting the calculation results on the contrast-period plane, which consists of visible and invisible moire pattern regions based on experimental results of psychological tests. Not only fundamental design parameters such as the shadow mask pitch and the scanning line pitch but also details of the electron beam profile such as distortion or asymmetry can be examined. In addition to the analysis, image simulation of a moire using image memory is also available.
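
    The starting point of such an analysis is the beat period of two superimposed periodic structures: the moire spatial frequency is the difference of the two component frequencies. The sketch below computes this standard relation for hypothetical mask and scan-line pitches; the paper's full model additionally uses measured beam profiles and mask transmittance to obtain the moire contrast.

```python
def moire_period(pitch_a, pitch_b):
    """Spatial period of the beat (moire) pattern produced by two
    superimposed parallel periodic structures, e.g. the shadow-mask
    pitch and the scanning-line pitch of a CRT.

    The beat frequency is the difference of the two spatial
    frequencies, so the moire period is 1/|1/pa - 1/pb|. Equal
    pitches produce no moire (infinite period).
    """
    fa, fb = 1.0 / pitch_a, 1.0 / pitch_b
    if fa == fb:
        return float("inf")
    return 1.0 / abs(fa - fb)

# A hypothetical 0.25 mm mask pitch beating against a 0.26 mm line pitch:
print(moire_period(0.25, 0.26))  # ~ 6.5 mm
```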

  19. Moving environmental DNA methods from concept to practice for monitoring aquatic macroorganisms

    USGS Publications Warehouse

    Goldberg, Caren S.; Strickler, Katherine M.; Pilliod, David S.

    2015-01-01

    The discovery that macroorganisms can be detected from their environmental DNA (eDNA) in aquatic systems has immense potential for the conservation of biological diversity. This special issue contains 11 papers that review and advance the field of eDNA detection of vertebrates and other macroorganisms, including studies of eDNA production, transport, and degradation; sample collection and processing to maximize detection rates; and applications of eDNA for conservation using citizen scientists. This body of work is an important contribution to the ongoing efforts to take eDNA detection of macroorganisms from technical breakthrough to established, reliable method that can be used in survey, monitoring, and research applications worldwide. While the rapid advances in this field are remarkable, important challenges remain, including consensus on best practices for collection and analysis, understanding of eDNA diffusion and transport, and avoidance of inhibition in sample collection and processing. Nonetheless, as demonstrated in this special issue, eDNA techniques for research and monitoring are beginning to realize their potential for contributing to the conservation of biodiversity globally.

  20. Practical Implementation of New Particle Tracking Method to the Real Field of Groundwater Flow and Transport

    PubMed Central

    Suk, Heejun

    2012-01-01

    In articles published in 2009 and 2010, Suk and Yeh reported the development of an accurate and efficient particle tracking algorithm for simulating path lines under complicated unsteady flow conditions, using a range of element types within multidimensional finite element meshes. Here two examples, an aquifer storage and recovery (ASR) example and a landfill leachate migration example, are examined to enhance the practical implementation of the proposed particle tracking method, known as Suk's method, in a real field of groundwater flow and transport. Results obtained by Suk's method are compared with those obtained by Pollock's method. Suk's method produces superior tracking accuracy, which suggests that it can describe various advection-dominated transport problems in a real field more accurately than existing popular particle tracking methods, such as Pollock's method. To illustrate the wide practical applicability of Suk's method to random-walk particle tracking (RWPT), the original RWPT has been modified to incorporate Suk's method. Performance of the modified RWPT using Suk's method is compared with the original RWPT scheme by examining the concentration distributions obtained by each under complicated transient flow systems. PMID:22476629
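For context on the comparison above: Pollock's method computes, cell by cell, a semi-analytical travel time assuming velocity varies linearly across the cell. A one-dimensional sketch of that travel-time step, assuming positive flow (this is illustrative of Pollock's scheme, not Suk's algorithm):

```python
import math

def pollock_exit_time(x, x1, dx, v1, v2):
    """Travel time from position x to the downstream face of a cell
    [x1, x1 + dx], with face velocities v1 and v2 and Pollock's linear
    interpolation v(x) = v1 + A * (x - x1). Assumes positive flow."""
    A = (v2 - v1) / dx                 # velocity gradient within the cell
    vp = v1 + A * (x - x1)             # velocity at the particle position
    if abs(A) < 1e-14:                 # uniform velocity: linear motion
        return (x1 + dx - x) / vp
    ve = v2                            # velocity at the downstream face
    return math.log(ve / vp) / A       # t = (1/A) * ln(ve / vp)
```

In a multidimensional code the same expression is evaluated per coordinate direction, and the smallest positive exit time selects the exit face.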

  1. Mixed-methods research in pharmacy practice: basics and beyond (part 1).

    PubMed

    Hadi, Muhammad Abdul; Alldred, David Phillip; Closs, S José; Briggs, Michelle

    2013-10-01

    This is the first of two papers which explore the use of mixed-methods research in pharmacy practice. In an era of evidence-based medicine and policy, high-quality research evidence is essential for the development of effective pharmacist-led services. Over the past decade, the use of mixed-methods research has become increasingly common in healthcare, although to date its use has been relatively limited in pharmacy practice research. In this article, the basic concepts of mixed-methods research including its definition, typologies and advantages in relation to pharmacy practice research are discussed. Mixed-methods research brings together qualitative and quantitative methodologies within a single study to answer or understand a research problem. There are a number of mixed-methods designs available, but the selection of an appropriate design must always be dictated by the research question. Importantly, mixed-methods research should not be seen as a 'tool' to collect qualitative and quantitative data, rather there should be some degree of 'integration' between the two data sets. If conducted appropriately, mixed-methods research has the potential to generate quality research evidence by combining strengths and overcoming the respective limitations of qualitative and quantitative methodologies. PMID:23418918

  2. Methods of analysis of protein crystal images

    NASA Astrophysics Data System (ADS)

    Zuk, William M.; Ward, Keith B.

    1991-03-01

    Several protein crystallization techniques, including the vapor diffusion method, lend themselves well to automation techniques. Up until the present time, automation techniques have been restricted to setting up crystallization experiments, and procedures to monitor and analyze the experiments have not been developed. These procedures require additional hardware for video monitoring of crystallization chambers and automatic recognition of protein crystals. An automated image acquisition and analysis system makes use of both image processing routines and pattern recognition procedures. In order to design and implement such a system, we are presently developing algorithms which can recognize and locate protein crystals in video images of crystallization droplets. Images of crystallization experiments are acquired and digitized, and analyses of the droplet images are conducted on the microcomputer which also acts as a host in our laboratory robotics system. We describe here our current progress in designing the image analysis system, including the development of appropriate pattern recognition methods. In addition, the usefulness of various pattern recognition schemes for monitoring the progress of crystallization is explored.
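As a toy illustration of the kind of recognition step described (not the authors' algorithms), candidate bright objects in a digitized droplet image can be located by thresholding and connected-component labeling:

```python
from collections import deque

def count_bright_regions(img, threshold):
    """Count 4-connected regions of pixels above `threshold` in a 2-D
    grayscale image (list of rows) -- a crude stand-in for locating
    candidate crystals in a digitized droplet image."""
    rows, cols = len(img), len(img[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = 0
    for r in range(rows):
        for c in range(cols):
            if img[r][c] > threshold and not seen[r][c]:
                regions += 1                      # new bright object found
                seen[r][c] = True
                q = deque([(r, c)])
                while q:                          # flood-fill its pixels
                    i, j = q.popleft()
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and img[ni][nj] > threshold
                                and not seen[ni][nj]):
                            seen[ni][nj] = True
                            q.append((ni, nj))
    return regions
```

Real droplet images additionally need droplet-boundary masking, illumination correction, and shape criteria to separate crystals from amorphous precipitate; this sketch shows only the labeling step.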

  3. Advances in quantitative electroencephalogram analysis methods.

    PubMed

    Thakor, Nitish V; Tong, Shanbao

    2004-01-01

    Quantitative electroencephalogram (qEEG) plays a significant role in EEG-based clinical diagnosis and studies of brain function. In past decades, various qEEG methods have been extensively studied. This article provides a detailed review of the advances in this field. qEEG methods are generally classified into linear and nonlinear approaches. The traditional qEEG approach is based on spectrum analysis, which hypothesizes that the EEG is a stationary process. EEG signals are nonstationary and nonlinear, especially in some pathological conditions. Various time-frequency representations and time-dependent measures have been proposed to address those transient and irregular events in EEG. With regard to the nonlinearity of EEG, higher order statistics and chaotic measures have been put forward. In characterizing the interactions across the cerebral cortex, an information theory-based measure such as mutual information is applied. To improve the spatial resolution, qEEG analysis has also been combined with medical imaging technology (e.g., CT, MR, and PET). With these advances, qEEG plays a very important role in basic research and clinical studies of brain injury, neurological disorders, epilepsy, sleep studies and consciousness, and brain function. PMID:15255777
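A minimal example of the traditional spectral approach mentioned above is relative band power from a periodogram. The sketch below uses a plain O(n²) DFT for clarity and illustrative band limits (8-13 Hz alpha); it is not any specific clinical toolchain:

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Fraction of total (non-DC) periodogram power of `signal`,
    sampled at fs Hz, that falls in the band [f_lo, f_hi] Hz."""
    n = len(signal)
    total = band = 0.0
    for k in range(1, n // 2):          # skip DC, ignore Nyquist bin
        f = k * fs / n                  # frequency of bin k
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        p = re * re + im * im           # periodogram power at bin k
        total += p
        if f_lo <= f <= f_hi:
            band += p
    return band / total if total else 0.0
```

A pure 10 Hz sine yields a relative alpha-band (8-13 Hz) power near 1.0; practical qEEG pipelines would instead use windowed, averaged spectra (e.g. Welch's method) for variance reduction.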

  4. The practice patterns of second trimester fetal ultrasonography: A questionnaire survey and an analysis of checklists

    PubMed Central

    Park, Hyun Soo; Hong, Joon Seok; Seol, Hyun-Joo; Hwang, Han Sung; Kim, Kunwoo; Ko, Hyun Sun; Kwak, Dong-Wook; Oh, Soo-young; Kim, Moon Young; Kim, Sa Jin

    2015-01-01

    Objective To analyze practice patterns and checklists of second trimester ultrasonography, and to investigate management plans when soft markers are detected, among Korean Society of Ultrasound in Obstetrics and Gynecology (KSUOG) members. Methods An internet-based self-administered questionnaire survey was designed, and KSUOG members were invited to participate. Checklists of second trimester ultrasonography were also requested. The questionnaire asked about general practice patterns of second trimester ultrasonography and management schemes for soft markers. In the checklist analysis, the number of items was counted and compared with those recommended by other medical societies. Results A total of 101 members responded. Eighty-seven percent routinely recommended second trimester fetal anatomic surveillance. Most (91.1%) performed it between 20+0 and 23+6 weeks of gestation. Written informed consent was obtained by 15.8% of respondents. Nearly 60% recommended genetic counseling when multiple soft markers and/or advanced maternal age were found. Similar tendencies were found in the management of individual soft markers; however, practice patterns were very diverse and sometimes conflicting. Forty-eight checklists were analyzed with respect to the number and content of their items. The median item number was 46.5 (range, 17 to 109). Of 49 items of checklists recommended by the International Society of Ultrasound in Obstetrics and Gynecology and/or the American Congress of Obstetricians and Gynecologists, 14 items (28.6%) were found in less than 50% of the checklists analyzed in this study. Conclusion General practice patterns were broadly similar among KSUOG members, although some were conflicting, and the checklists covered a very wide spectrum; there is a need for standardization of both the practice patterns and the checklists of second trimester ultrasonography. PMID:26623407

  5. Cleanup standards and pathways analysis methods

    SciTech Connect

    Devgun, J.S.

    1993-09-01

    Remediation of a radioactively contaminated site requires that certain regulatory criteria be met before the site can be released for unrestricted future use. Since the ultimate objective of remediation is to protect public health and safety, residual radioactivity levels remaining at a site after cleanup must be below certain preset limits or meet acceptable dose or risk criteria. Release of a decontaminated site requires proof that the radiological data obtained from the site meet the regulatory criteria for such a release. Typically, release criteria consist of a composite of acceptance limits that depend on the radionuclides, the media in which they are present, and federal and local regulations. In recent years, the US Department of Energy (DOE) has developed a pathways analysis model to determine site-specific soil activity concentration guidelines for radionuclides that do not have established generic acceptance limits. The DOE pathways analysis computer code (developed by Argonne National Laboratory for the DOE) is called RESRAD (Gilbert et al. 1989). Similar efforts have been initiated by the US Nuclear Regulatory Commission (NRC) to develop and use dose-related criteria based on generic pathways analyses rather than simplistic numerical limits on residual radioactivity. The focus of this paper is radionuclide-contaminated soil. Cleanup standards are reviewed, pathways analysis methods are described, and an example is presented in which RESRAD was used to derive cleanup guidelines.
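The basic arithmetic of a pathways-derived soil guideline can be sketched as follows. The pathway names and dose conversion factors below are hypothetical placeholders, not RESRAD outputs; RESRAD computes such factors from site-specific parameters:

```python
def soil_guideline(dose_limit, pathway_dcfs):
    """Soil concentration guideline (e.g. pCi/g) from an annual dose
    limit (e.g. mrem/yr) and per-pathway dose conversion factors
    (dose per unit soil concentration), summed over all pathways."""
    total_dcf = sum(pathway_dcfs.values())
    return dose_limit / total_dcf

# Hypothetical pathway factors in (mrem/yr) per (pCi/g):
guideline = soil_guideline(25.0, {"external": 0.8,
                                  "inhalation": 0.05,
                                  "ingestion": 0.15})
```

With a 25 mrem/yr limit and a summed factor of 1.0, the guideline is 25 pCi/g; the real code also treats time-dependent pathways, daughter in-growth, and site hydrology.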

  6. Chapter 11. Community analysis-based methods

    SciTech Connect

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.
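Community profile cross-comparisons of the kind described typically start from a pairwise dissimilarity matrix fed to multivariate ordination or clustering. One common choice (one of several options, not prescribed by the chapter) is Bray-Curtis dissimilarity between abundance profiles:

```python
def bray_curtis(u, v):
    """Bray-Curtis dissimilarity between two community abundance
    profiles of equal length: 0 = identical, 1 = no shared taxa."""
    num = sum(abs(a - b) for a, b in zip(u, v))   # summed abundance differences
    den = sum(a + b for a, b in zip(u, v))        # total abundance in both
    return num / den if den else 0.0
```

Applied to, say, TRFLP peak-area profiles from two water samples, low values suggest similar source communities; a full MST analysis would compute this for all sample pairs and ordinate the resulting matrix (e.g. by NMDS).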

  7. Green analytical method development for statin analysis.

    PubMed

    Assassi, Amira Louiza; Roy, Claude-Eric; Perovitch, Philippe; Auzerie, Jack; Hamon, Tiphaine; Gaudin, Karen

    2015-02-01

    A green analytical chemistry method was developed for pravastatin, fluvastatin and atorvastatin analysis. An HPLC/DAD method using an ethanol-based mobile phase was studied on octadecyl-grafted silica with various graftings and related column parameters such as particle size, core-shell and monolithic formats. Retention, efficiency and detector linearity were optimized. Even for columns with particle sizes under 2 μm, the benefit of maintaining efficiency over a large range of flow rates was not obtained with the ethanol-based mobile phase, in contrast to an acetonitrile-based one. Therefore, the strategy of shortening the analysis by increasing the flow rate reduced efficiency with the ethanol-based mobile phase. An ODS-AQ YMC column, 50 mm × 4.6 mm, 3 μm, was selected as the best compromise between analysis time, statin separation, and efficiency. HPLC conditions were 1 mL/min, ethanol/formic acid (pH 2.5, 25 mM) (50:50, v/v), thermostated at 40°C. To reduce solvent consumption during sample preparation, a concentration of 0.5 mg/mL of each statin was found to be the highest that preserved detector linearity. These conditions were validated for each statin for content determination in highly concentrated hydro-alcoholic solutions. Solubility higher than 100 mg/mL was found for pravastatin and fluvastatin, whereas for atorvastatin calcium salt the maximum concentration was 2 mg/mL for hydro-alcoholic binary mixtures between 35% and 55% ethanol in water. Using atorvastatin instead of its calcium salt improved solubility. Highly concentrated solutions of statins offer a potential fluid for per Buccal Per-Mucous(®) administration, with the advantages of rapid and easy passage of drugs. PMID:25582487
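The efficiency and separation criteria balanced above are conventionally quantified by the theoretical plate number and peak resolution. A sketch using the standard half-height formulas (the numbers in the test are illustrative, not the paper's data):

```python
def plate_number(t_r, w_half):
    """Column efficiency N from retention time and peak width at half
    height (same time units): N = 5.54 * (t_r / w_half)^2."""
    return 5.54 * (t_r / w_half) ** 2

def resolution(t1, w1, t2, w2):
    """Resolution of two adjacent peaks from retention times and
    half-height widths: Rs = 1.18 * (t2 - t1) / (w1 + w2)."""
    return 1.18 * (t2 - t1) / (w1 + w2)
```

Increasing flow rate shortens t_r but, as the abstract notes for ethanol-based phases, broadens peaks relative to t_r, so N and Rs both fall; the selected column is the compromise between those trends.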

  8. A concise method for mine soils analysis

    SciTech Connect

    Winkler, S.; Wildeman, T.; Robinson, R.; Herron, J.

    1999-07-01

    A large number of abandoned hard rock mines exist in Colorado and other mountain west states, many on public property. Public pressure and resulting policy changes have become a driving force in the reclamation of these sites. Two of the key reclamation issues at these sites are the occurrence of acid-forming materials (AFMs) in mine soils and acid mine drainage (AMD) issuing from mine adits. An AMD treatment system design project for the Forest Queen mine in Colorado's San Juan mountains raised the need for a simple, usable method for the analysis of mine land soils, both for suitability as a construction material and to determine the AFM content and potential for acid release. The authors have developed a simple, stepwise, go/no-go test for the analysis of mine soils. Samples were collected from a variety of sites in the Silverton, CO area and subjected to three tiers of tests, including: paste pH, Eh, and a 10% HCl fizz test; then total digestion in HNO{sub 3}/HCl, neutralization potential, exposure to meteoric water, and the toxicity characteristic leaching procedure (TCLP). All elemental analyses were performed with an inductively-coupled plasma (ICP) spectrometer. Elimination of samples via the first two testing tiers left two remaining samples, which were subsequently subjected to column and sequential batch tests, with further elemental analysis by ICP. Based on these tests, one sample was chosen as suitable for use as a construction material for the Forest Queen treatment system basins. Further simplification, and testing on two pairs of independent soil samples, has resulted in a final analytical method suitable for general use.
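The stepwise go/no-go idea can be sketched as a tiered screening function. The thresholds below (paste pH 5.0, neutralization-to-acid-potential ratio 3) are illustrative placeholders commonly seen in acid-base accounting, not the authors' calibrated criteria:

```python
def tiered_screen(paste_ph, neut_potential, acid_potential):
    """Tiered go/no-go screen for a mine soil sample.
    Tier 1: paste pH; Tier 2: neutralization potential (NP) vs.
    acid-generating potential (AP). Thresholds are illustrative."""
    if paste_ph < 5.0:
        return "no-go: acid-forming"          # fails tier 1 outright
    if acid_potential > 0 and neut_potential / acid_potential < 3.0:
        return "further testing"              # ambiguous: column/batch tests
    return "candidate construction material"  # passes the screen
```

Samples returned as "further testing" would proceed to the column and sequential batch tests described above before any use decision.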

  9. International Commercial Remote Sensing Practices and Policies: A Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Stryker, Timothy

    In recent years, there has been much discussion about U.S. commercial remote sensing policies and how effectively they address U.S. national security, foreign policy, commercial, and public interests. This paper will provide an overview of U.S. commercial remote sensing laws, regulations, and policies, and describe recent NOAA initiatives. It will also address related foreign practices, and the overall legal context for trade and investment in this critical industry. Licensing and Regulation. The 1992 Land Remote Sensing Policy Act ("the Act"), and the 1994 policy on Foreign Access to Remote Sensing Space Capabilities (known as Presidential Decision Directive-23, or PDD-23) put into place an ambitious legal and policy framework for the U.S. Government's licensing of privately-owned, high-resolution satellite systems. Under the Act, the Secretary of Commerce licenses the operations of private U.S. remote sensing satellite systems, in consultation with the Secretaries of Defense, State, and Interior. PDD-23 provided further details concerning the operation of advanced systems, as well as criteria for the export of turnkey systems and/or components. In July 2000, pursuant to the authority delegated to it by the Secretary of Commerce, NOAA issued new regulations for the industry. License conditions require operators to: preserve the national security and observe the international obligations of the United States; maintain positive control of spacecraft operations; maintain a tasking record in conjunction with other record-keeping requirements; provide U.S. Government access to and use of data when required for national security or foreign policy purposes; provide for U.S. Government review of all significant foreign agreements; obtain U.S. 
Government approval for any encryption devices used; make available unenhanced data to a "sensed state" as soon as such data are available and on reasonable cost terms and conditions; make available unenhanced data as requested by the U.S. Government Archive; and, obtain a priori U.S. Government approval of all plans and procedures to deal with safe disposition of the satellite. Further information on NOAA's regulations and NOAA's licensing program is available at www.licensing.noaa.gov. Monitoring and Enforcement NOAA's enforcement mission is focused on the legislative mandate which states that the Secretary of Commerce has a continuing obligation to ensure that licensed imaging systems are operated lawfully to preserve the national security and foreign policies of the United States. NOAA has constructed an end-to-end monitoring and compliance program to review the activities of licensed companies. This program includes a pre- launch review, an operational baseline audit, and an annual comprehensive national security audit. If at any time there is suspicion or concern that a system is being operated unlawfully, a no-notice inspection may be initiated. setbacks, three U.S. companies are now operational, with more firms expected to become so in the future. While NOAA does not disclose specific systems capabilities for proprietary reasons, its current licensing resolution thresholds for general commercial availability are as follows: 0.5 meter Ground Sample Distance (GSD) for panchromatic systems, 2 meter GSD for multi-spectral systems, 3 meter Impulse Response (IPR) for Synthetic Aperture Radar systems, and 20 meter GSD for hyperspectral systems (with certain 8-meter hyperspectral derived products also licensed for commercial distribution). These thresholds are subject to change based upon foreign availability and other considerations. 
It should also be noted that license applications are reviewed and granted on a case-by-case basis, pursuant to each system's technology and concept of operations. In 2001, NOAA, along with the Department of Commerce's International Trade Administration, commissioned a study by the RAND Corporation to assess the risks faced by the U.S. commercial remote sensing satellite industry. In commissioning this study, NOAA's goal was to better understand the role that U.S. Government policies and regulations have in shaping the prospects for emerging commercial remote sensing satellite firms. The study assessed the risks against broader trends in the larger U.S. remote sensing industry and geospatial technology and effective policy implementation. The Department of Commerce is working with NOAA licensees to identify foreign actions which could restrict market access by U.S. firms, and seeking to provide a "level playing field" for U.S. service providers. The Department of Commerce has dedicated new resources to its licensing activities. In Fiscal Year 2002, the Department obtained $1.2 million in funding to support the NOAA program, through staff, equipment, technical support, constituent outreach, and market and policy studies. To better understand the market and make more well-informed licensing decisions, NOAA is participating in a broad-based market study effort under the direction of the American Society for Photogrammetry and Remote Sensing (ASPRS) and NASA's Commercial Remote Sensing Program. This study is providing long-term analysis of the commercial remote sensing industry. It is being supported by interviews with industry and government experts, a web-based survey, and a thorough review and analysis of related literature. The project should more clearly determine future remote sensing needs and requirements, and maximize the industry's baselines, standards, and socio-economic potential. 
NOAA, through its participation in this study, has gained important new insights into the status and future trends of this industry. The study's initial findings estimate 2001 industry revenue at $2 billion, growing at 13% per year, to an approximate level of $6 billion in 2010 (in constant, calendar year 2000 dollars). Currently, across all sectors, the most active market segments are national/global security, mapping/geography, and civil government. The U.S. licensing framework and regulations have provided for appropriate measures for monitoring and compliance. This approach provides a valuable framework for companies, investors, customers, and foreign partners. The clearly-defined ground rules are designed to facilitate full private sector competition, innovation, and domestic and international market development. International market development remains a key issue for the U.S. Government and for U.S. industry in general. NOAA has learned of some interest by foreign governments in promulgating new laws and regulations to address this growing industry. However, to date, most governments have yet to publicize new commercial remote sensing laws or regulations. In some instances, data policies for commercial remote sensing have been developed, but only in the context of government-owned and operated systems, or private systems in which a government is the controlling shareholder. Other than some initial consultations and limited agreements between supplier nations, there has to date been little overall international coordination of commercial remote sensing policies and practices. The result has been an uncertain and non-uniform international business environment, which can cause difficulties for all commercial remote sensing operators. Related international market distortions inhibit the maturation of the industry and the normalization of business practices. 
This situation may make it more difficult for key stakeholders to make decisions on investments, purchases, regulatory affairs, and international partnerships. To put this growing industry on a more level footing, there should be further coordination

  10. A Practical Test Method for Mode I Fracture Toughness of Adhesive Joints with Dissimilar Substrates

    SciTech Connect

    Boeman, R.G.; Erdman, D.L.; Klett, L.B.; Lomax, R.D.

    1999-09-27

    A practical test method for determining the mode I fracture toughness of adhesive joints with dissimilar substrates will be discussed. The test method is based on the familiar Double Cantilever Beam (DCB) specimen geometry, but overcomes limitations in existing techniques that preclude their use when testing joints with dissimilar substrates. The test method is applicable to adhesive joints where the two bonded substrates have different flexural rigidities due to geometric and/or material considerations. Two specific features discussed are the use of backing beams to prevent substrate damage and a compliance matching scheme to achieve symmetric loading conditions. The procedure is demonstrated on a modified DCB specimen comprised of SRIM composite and thin-section, e-coat steel substrates bonded with an epoxy adhesive. Results indicate that the test method provides a practical means of characterizing the mode I fracture toughness of joints with dissimilar substrates.
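For a symmetric DCB with identical substrates, simple beam theory gives the familiar baseline that such dissimilar-substrate tests generalize. A sketch of that textbook relation (SI units; this is not the modified compliance-matched analysis of the paper):

```python
def g1_dcb_beam_theory(P, a, E, b, h):
    """Mode I strain energy release rate (J/m^2) of a symmetric DCB by
    simple beam theory: G_I = 12 * P^2 * a^2 / (E * b^2 * h^3), with
    load P (N), crack length a (m), modulus E (Pa), width b (m), and
    arm thickness h (m). Neglects shear and root-rotation corrections."""
    return 12.0 * P**2 * a**2 / (E * b**2 * h**3)
```

With dissimilar substrates the two arms have unequal flexural rigidities, which is why the paper resorts to backing beams and compliance matching rather than this closed form.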

  11. Acetone preservation: a practical technique for molecular analysis.

    PubMed

    Fukatsu, T

    1999-11-01

    In attempts to establish a convenient and reliable method for field collection and archival preservation of insects and their endosymbiotic microorganisms for molecular analysis, acetone, ethanol, and other organic solvents were tested for DNA preservability of the pea aphid Acyrthosiphon pisum and its intracellular symbiotic bacterium Buchnera sp. After 6 months' storage, not only the band of high-molecular-size DNA but also the bands of rRNA were well preserved in acetone, ethanol, 2-propanol, diethyl ether and ethyl acetate. Polymerase chain reaction (PCR) assays confirmed that the DNA of both the insects and their symbionts was well preserved in these solvents. In contrast, methanol and chloroform showed poor DNA preservability. When water-containing series of acetone and ethanol were examined for DNA preservability, acetone was apparently more robust against water contamination than ethanol. Considering that most biological materials contain high amounts of water, acetone may be a more recommendable preservative for DNA analysis than ethanol which has been widely used for this purpose. The DNA of various insects could be preserved in acetone at room temperature in good condition for several years. In addition to the DNA of the host insects, the DNA of their endosymbionts, including Buchnera and other mycetocyte symbionts, Wolbachia, and gut bacteria, was amplified by PCR after several years of acetone storage. The RNA and protein of the pea aphid and its endosymbiont were also preserved for several years in acetone. After 2 years' storage in acetone, proteins of A. pisum could be analysed by sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE) and immunoblotting, and the endosymbiotic bacteria were successfully detected by immunohistochemistry and in situ hybridization on the tissue sections. PMID:10620236

  12. Analysis of methods. [information systems evolution environment

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason, the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment, a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose of the analysis of the modeling methods included in this document is to examine these methods with the goal of including them in an integrated development support environment. To accomplish this, and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. 
Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity Relationship, Data Flow Diagrams, and Structure Charts, for inclusion in an integrated development support environment.

  13. Self-Assessment Methods in Writing Instruction: A Conceptual Framework, Successful Practices and Essential Strategies

    ERIC Educational Resources Information Center

    Nielsen, Kristen

    2014-01-01

    Student writing achievement is essential to lifelong learner success, but supporting writing can be challenging for teachers. Several large-scale analyses of publications on writing have called for further study of instructional methods, as the current literature does not sufficiently address the need to support best teaching practices.…

  14. The Delphi Method: An Approach for Facilitating Evidence Based Practice in Athletic Training

    ERIC Educational Resources Information Center

    Sandrey, Michelle A.; Bulger, Sean M.

    2008-01-01

    Objective: The growing importance of evidence based practice in athletic training is necessitating academics and clinicians to be able to make judgments about the quality or lack of the body of research evidence and peer-reviewed standards pertaining to clinical questions. To assist in the judgment process, consensus methods, namely brainstorming,…

  15. What Informs Practice and What Is Valued in Corporate Instructional Design? A Mixed Methods Study

    ERIC Educational Resources Information Center

    Thompson-Sellers, Ingrid N.

    2012-01-01

    This study used a two-phased explanatory mixed-methods design to explore in-depth what factors are perceived by Instructional Design and Technology (IDT) professionals as impacting instructional design practice, how these factors are valued in the field, and what differences in perspectives exist between IDT managers and non-managers. For phase 1…

  16. Short-Term, High Intensity Reading Practice Methods for Upward Bound Students: An Appraisal.

    ERIC Educational Resources Information Center

    Burley, JoAnne E.

    1980-01-01

    Reports on a study comparing four reading practice methods (sustained silent reading, programed textbooks, programed cassette tapes, and programed skill development kits) used with Upward Bound students. Sustained silent reading was found to be significantly more successful in improving educationally deprived students' literal and inferential…

  17. Econometric Methods for Causal Evaluation of Education Policies and Practices: A Non-Technical Guide

    ERIC Educational Resources Information Center

    Schlotter, Martin; Schwerdt, Guido; Woessmann, Ludger

    2011-01-01

    Education policy-makers and practitioners want to know which policies and practices can best achieve their goals. But research that can inform evidence-based policy often requires complex methods to distinguish causation from accidental association. Avoiding econometric jargon and technical detail, this paper explains the main idea and intuition…

  18. The Delphi Method: An Approach for Facilitating Evidence Based Practice in Athletic Training

    ERIC Educational Resources Information Center

    Sandrey, Michelle A.; Bulger, Sean M.

    2008-01-01

    Objective: The growing importance of evidence based practice in athletic training is necessitating academics and clinicians to be able to make judgments about the quality or lack of the body of research evidence and peer-reviewed standards pertaining to clinical questions. To assist in the judgment process, consensus methods, namely brainstorming,…

  19. Passive Sampling Methods for Contaminated Sediments: Practical Guidance for Selection, Calibration, and Implementation

    EPA Science Inventory

    This article provides practical guidance on the use of passive sampling methods(PSMs) that target the freely dissolved concentration (Cfree) for improved exposure assessment of hydrophobic organic chemicals in sediments. Primary considerations for selecting a PSM for a specific a...

  20. Connecting Practice, Theory and Method: Supporting Professional Doctoral Students in Developing Conceptual Frameworks

    ERIC Educational Resources Information Center

    Kumar, Swapna; Antonenko, Pavlo

    2014-01-01

    From an instrumental view, conceptual frameworks that are carefully assembled from existing literature in Educational Technology and related disciplines can help students structure all aspects of inquiry. In this article we detail how the development of a conceptual framework that connects theory, practice and method is scaffolded and facilitated…

  1. Pharmacogenetics: practices and opportunities for study design and data analysis.

    PubMed

    Flynn, Aiden A

    2011-10-01

    Pharmacogenetics (PGx) is increasingly used as a way to target treatment to patients who are most likely to benefit. To date, PGx has shown clinical significance across a few applications but widespread use has been limited by the need for further technical, methodological and practical advances and for educating clinical researchers on the value of PGx. Here, I describe the current scope of PGx research, including recent contributions to prospective study design. A case study is included to demonstrate the limitations of current practice and to describe some practical steps for improving the chances of identifying genetic effects. Additionally, I describe some opportunities for the integration and application of disparate data sources in exploratory PGx research. PMID:21875683

  2. A new method for designing dual foil electron beam forming systems. II. Feasibility of practical implementation of the method

    NASA Astrophysics Data System (ADS)

    Adrich, Przemysław

    2016-05-01

    In Part I of this work a new method for designing dual foil electron beam forming systems was introduced. In this method, an optimal configuration of the dual foil system is found by means of a systematic, automated scan of system performance as a function of its parameters. At each point of the scan, the Monte Carlo method is used to calculate the off-axis dose profile in water, taking into account the detailed and complete geometry of the system. The new method, while computationally intensive, minimizes the involvement of the designer. In this Part II paper, the feasibility of practical implementation of the new method is demonstrated. For this, prototype software tools were developed and applied to solve a real-life design problem. It is demonstrated that system optimization can be completed within a few hours using rather moderate computing resources. It is also demonstrated that, perhaps for the first time, the designer can gain deep insight into system behavior, such that the construction can be simultaneously optimized with respect to a number of functional characteristics besides the flatness of the off-axis dose profile. In the presented example, the system is optimized with respect to both the flatness of the off-axis dose profile and the beam transmission. A number of practical issues related to application of the new method, as well as its possible extensions, are discussed.

  3. The Frankfurt Patient Safety Climate Questionnaire for General Practices (FraSiK): analysis of psychometric properties.

    PubMed

    Hoffmann, Barbara; Domanska, Olga Maria; Albay, Zeycan; Mueller, Vera; Guethlin, Corina; Thomas, Eric J; Gerlach, Ferdinand M

    2011-09-01

    BACKGROUND Safety culture has been identified as having a major impact on how safety is managed in healthcare. However, it has not received much attention in general practices. Hence, no instrument yet exists to assess safety climate-the measurable artefact of safety culture-in this setting. This study aims to evaluate psychometric properties of a newly developed safety climate questionnaire for use in German general practices. METHODS The existing Safety Attitudes Questionnaire, Ambulatory Version, was considerably modified and enhanced in order to be applicable in general practice. After pilot tests and its application in a random sample of 400 German practices, a first psychometric analysis led to modifications in several items. A further psychometric analysis was conducted with an additional sample of 60 practices and a response rate of 97.08%. Exploratory factor analysis with orthogonal varimax rotation was carried out and the internal consistency of the identified factors was calculated. RESULTS Nine factors emerged, representing a wide range of dimensions associated with safety culture: teamwork climate, error management, safety of clinical processes, perception of causes of errors, job satisfaction, safety of office structure, receptiveness to healthcare assistants and patients, staff perception of management, and quality and safety of medical care. Internal consistency of factors is moderate to good. CONCLUSIONS This study demonstrates the development of a patient safety climate instrument. The questionnaire displays established features of safety climate and additionally contains features that might be specific to small-scale general practices. PMID:21571753
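
The internal consistency the authors report per factor is conventionally measured with Cronbach's alpha. As an illustration (not the authors' code; the toy scores are invented), a minimal sketch in Python with NumPy:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    n_items = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return n_items / (n_items - 1) * (1 - item_vars.sum() / total_var)

# Toy data: 5 respondents answering 3 items of one hypothetical factor (1-5 scale).
scores = np.array([
    [4, 5, 4],
    [3, 3, 2],
    [5, 5, 5],
    [2, 2, 3],
    [4, 4, 4],
])
print(round(cronbach_alpha(scores), 3))  # → 0.939
```

Values above roughly 0.7 are usually read as acceptable internal consistency, which matches the "moderate to good" phrasing in the abstract.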

  4. Analysis Method for Quantifying Vehicle Design Goals

    NASA Technical Reports Server (NTRS)

    Fimognari, Peter; Eskridge, Richard; Martin, Adam; Lee, Michael

    2007-01-01

    A document discusses a method for using Design Structure Matrices (DSM), coupled with high-level tools representing important life-cycle parameters, to comprehensively conceptualize a flight/ground space transportation system design by dealing with such variables as performance, up-front costs, downstream operations costs, and reliability. This approach also weighs operational approaches based on their effect on upstream design variables so that it is possible to readily, yet defensively, establish linkages between operations and these upstream variables. To avoid the large range of problems that have defeated previous methods of dealing with the complex problems of transportation design, and to cut down the inefficient use of resources, the method described in the document identifies those areas that are of sufficient promise and that provide a higher grade of analysis for those issues, as well as the linkages at issue between operations and other factors. Ultimately, the system is designed to save resources and time, and allows for the evolution of operable space transportation system technology, and design and conceptual system approach targets.
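
A Design Structure Matrix records which design variables depend on which others. A minimal, hypothetical sketch (the variable names are illustrative, not taken from the NASA document) showing how a DSM can be scanned for a feasible evaluation order, and how coupled blocks would surface:

```python
# dsm[row] = set of variables that row depends on (a sparse DSM).
dsm = {
    "performance":     {"mass", "propulsion"},
    "up_front_cost":   {"propulsion"},
    "operations_cost": {"reliability", "performance"},
    "reliability":     {"propulsion"},
    "mass":            set(),
    "propulsion":      {"mass"},
}

def evaluation_order(dsm):
    """Kahn-style topological sort: the order in which variables can be resolved."""
    remaining = {k: set(v) for k, v in dsm.items()}
    order = []
    while remaining:
        ready = [k for k, deps in remaining.items() if not deps]
        if not ready:  # leftover nodes form mutually coupled blocks
            raise ValueError(f"coupled variables: {sorted(remaining)}")
        for k in sorted(ready):
            order.append(k)
            del remaining[k]
        for deps in remaining.values():
            deps.difference_update(ready)
    return order

print(evaluation_order(dsm))
```

In a real vehicle DSM the interesting cases are exactly the coupled blocks this sketch raises an error for: they mark the operations/design linkages that must be iterated rather than solved in sequence.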

  5. Nonstationary Hydrological Frequency Analysis: Theoretical Methods and Application Challenges

    NASA Astrophysics Data System (ADS)

    Xiong, L.

    2014-12-01

    Because of its great implications for the design and operation of hydraulic structures under changing environments (whether climate change or anthropogenic change), nonstationary hydrological frequency analysis has become essential. Two important methodological achievements have been made. Without adhering to the consistency assumption of traditional hydrological frequency analysis, the time-varying probability distribution of any hydrological variable can be established by linking the distribution parameters to covariates such as time or physical variables, with the help of powerful tools like the Generalized Additive Model of Location, Scale and Shape (GAMLSS). With the help of copulas, multivariate nonstationary hydrological frequency analysis has also become feasible. However, applying nonstationary hydrological frequency formulae to the design and operation of hydraulic structures under changing environments still faces many challenges in practice. First, formulae with time as the covariate can only be extrapolated for a very short period beyond the latest observation time, because such formulae are not physically constrained and the extrapolated outcomes could be unrealistic. There are two physically reasonable alternatives for changing environments: one is to directly link the quantiles or the distribution parameters to measurable physical factors, and the other is to use derived probability distributions based on hydrological processes. However, both methods carry a certain degree of uncertainty. For the design and operation of hydraulic structures under changing environments, it is recommended that the design results of both stationary and nonstationary methods be presented together and compared with each other, to help us understand the potential risks of each method.
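
As a rough illustration of the time-as-covariate approach the abstract cautions about, the sketch below fits a Gumbel distribution whose location parameter varies linearly with time, by maximum likelihood on synthetic annual maxima. SciPy is assumed; this is not GAMLSS itself, only the simplest nonstationary special case.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
years = np.arange(60)
# Synthetic annual maxima with a linear trend in the Gumbel location parameter.
true_loc = 100.0 + 0.8 * years
flows = true_loc + rng.gumbel(0.0, 15.0, size=years.size)

def neg_log_lik(theta):
    a, b, log_scale = theta
    loc, scale = a + b * years, np.exp(log_scale)
    z = (flows - loc) / scale
    # Gumbel negative log-likelihood: sum of log(scale) + z + exp(-z).
    return np.sum(np.log(scale) + z + np.exp(-z))

res = minimize(neg_log_lik, x0=[flows.mean(), 0.0, np.log(flows.std())],
               method="Nelder-Mead")
a_hat, b_hat, _ = res.x
print(f"fitted trend in location: {b_hat:.2f} per year")
```

The fitted trend is only trustworthy inside the observation window, which is exactly the extrapolation caveat the abstract raises.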

  6. Entropy Methods For Univariate Distributions in Decision Analysis

    NASA Astrophysics Data System (ADS)

    Abbas, Ali E.

    2003-03-01

    One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice, however, decision-makers cannot readily provide moments for the distribution, and are much more comfortable providing information about the fractiles of the distribution of interest or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds, and provide an interpretation for the shape of the maximum entropy distribution subject to fractile constraints (FMED). We also discuss the limitations of the FMED, namely that it is discontinuous and flat over each fractile interval. We present a heuristic approximation to a distribution when, in addition to its fractiles, we also know it is continuous, and work through full examples to illustrate the approach.
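
The flat, discontinuous character of the FMED follows from the fact that, between elicited fractiles, the maximum entropy density is constant. A small sketch (the elicited values below are invented for illustration):

```python
# Assumed elicited fractiles: (value, cumulative probability) pairs,
# with known lower and upper bounds at p = 0 and p = 1.
fractiles = [(0.0, 0.0), (10.0, 0.25), (14.0, 0.50), (20.0, 0.75), (40.0, 1.0)]

# The max-entropy density matching these constraints is constant on each
# interval: f = (p_hi - p_lo) / (x_hi - x_lo), hence flat and discontinuous.
density = [
    (x0, x1, (p1 - p0) / (x1 - x0))
    for (x0, p0), (x1, p1) in zip(fractiles, fractiles[1:])
]
for x0, x1, f in density:
    print(f"[{x0:5.1f}, {x1:5.1f}): f = {f:.4f}")
```

Narrow fractile intervals get tall flat density steps and wide intervals get low ones, which is the jagged shape the paper's heuristic smooths when continuity is also known.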

  7. Influence of analysis methods on interpretation of hazard maps.

    PubMed

    Koehler, Kirsten A; Peters, Thomas M

    2013-06-01

    Exposure or hazard mapping is becoming increasingly popular among industrial hygienists. Direct-reading instruments used for hazard-mapping data collection are steadily increasing in reliability and portability while decreasing in cost. Exposure measurements made with these instruments generally require no laboratory analysis, although hazard mapping can be a time-consuming process. To inform decision making by industrial hygienists and management, it is crucial that the maps generated from mapping data be as accurate and representative as possible. Currently, it is unclear how many sampling locations are necessary to produce a representative hazard map. As such, researchers typically collect as many points as can be sampled in several hours, and interpolation methods are used to produce higher-resolution maps. We have reanalyzed hazard-mapping data sets from three industrial settings to determine which interpolation methods yield the most accurate results. The goal is to provide practicing industrial hygienists with practical guidelines for generating accurate hazard maps with 'off-the-shelf' mapping software. Visually verifying the fit of the variogram model is crucial for accurate interpolation. Exponential and spherical variogram models performed better than Gaussian models. It was also necessary to diverge from some of the default interpolation parameters, such as the number of bins used for the experimental variogram and whether or not to allow for a nugget effect, to achieve reasonable interpolation accuracy for some data sets. PMID:23258453
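
For reference, the three variogram model families compared in the abstract can be sketched as follows, using the common convention that the model rises from a nugget toward a sill over an effective range (exact parameterisations vary between software packages, and the nugget discontinuity at h = 0 is ignored in this sketch):

```python
import numpy as np

def spherical(h, nugget, sill, rng_):
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
    return np.where(h < rng_, g, sill)   # reaches the sill exactly at the range

def exponential(h, nugget, sill, rng_):
    return nugget + (sill - nugget) * (1 - np.exp(-3 * np.asarray(h) / rng_))

def gaussian(h, nugget, sill, rng_):
    # Near-parabolic at the origin: very smooth, which can over-smooth maps.
    return nugget + (sill - nugget) * (1 - np.exp(-3 * (np.asarray(h) / rng_) ** 2))

h = np.array([0.0, 5.0, 10.0])
print(spherical(h, 0.1, 1.0, 10.0))
print(gaussian(h, 0.1, 1.0, 10.0))
```

The Gaussian model's flat behaviour near h = 0 is one plausible reason it underperformed here: it implies unrealistically smooth short-range variation in the hazard field.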

  8. Optimisation of Lime-Soda process parameters for reduction of hardness in aqua-hatchery practices using Taguchi methods.

    PubMed

    Yavalkar, S P; Bhole, A G; Babu, P V Vijay; Prakash, Chandra

    2012-04-01

    This paper presents the optimisation of Lime-Soda process parameters for the reduction of hardness in aqua-hatchery practices in the context of M. rosenbergii. Fresh water used in the development of fisheries needs to be of suitable quality, and the lack of desirable quality in available fresh water is generally the confronting restraint. On the Indian subcontinent, groundwater is the only source of raw water; it has a varying degree of hardness and is thus unsuitable for fresh water prawn hatchery practices (M. rosenbergii). In order to make use of hard water in the context of aqua-hatchery, the Lime-Soda process has been recommended. The effects of the various process parameters, such as lime, soda ash, and detention time, on hardness reduction need to be examined. This paper determines the parameter settings for the CIFE well water, which is quite hard, using the Taguchi experimental design method. Taguchi orthogonal arrays, the signal-to-noise ratio, and analysis of variance (ANOVA) have been applied to determine the chemical dosages and to analyse their effect on hardness reduction. Tests carried out with optimal levels of the Lime-Soda process parameters confirmed the efficacy of the Taguchi optimisation method. Emphasis has been placed on optimising the chemical doses required to reduce total hardness using the Taguchi method and ANOVA, to suit the available raw water quality for aqua-hatchery practices, especially for the fresh water prawn M. rosenbergii. PMID:24749379
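
The core Taguchi computation, ranking factor settings by signal-to-noise ratio, can be sketched for a smaller-is-better response such as residual hardness. The factor levels and measurements below are invented for illustration, not taken from the study:

```python
import math

# Hypothetical two-factor design (lime and soda ash dose), with replicate
# residual-hardness measurements (mg/L) per run; smaller hardness is better.
runs = {
    ("lime_lo", "soda_lo"): [220.0, 235.0],
    ("lime_lo", "soda_hi"): [150.0, 160.0],
    ("lime_hi", "soda_lo"): [130.0, 125.0],
    ("lime_hi", "soda_hi"): [ 90.0,  95.0],
}

def sn_smaller_is_better(ys):
    """Taguchi S/N ratio (dB) for a smaller-is-better response."""
    return -10.0 * math.log10(sum(y * y for y in ys) / len(ys))

ratios = {k: sn_smaller_is_better(v) for k, v in runs.items()}
best = max(ratios, key=ratios.get)   # highest S/N = preferred setting
print(best, round(ratios[best], 2))
```

In the full method, the per-run S/N ratios are then averaged per factor level and fed into ANOVA to judge which factor dominates the hardness reduction.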

  9. Multi-Spacecraft Turbulence Analysis Methods

    NASA Astrophysics Data System (ADS)

    Horbury, Tim S.; Osman, Kareem T.

    Turbulence is ubiquitous in space plasmas, from the solar wind to supernova remnants, and on scales from the electron gyroradius to interstellar separations. Turbulence is responsible for transporting energy across space and between scales and plays a key role in plasma heating, particle acceleration and thermalisation downstream of shocks. Just as with other plasma processes such as shocks or reconnection, turbulence results in complex, structured and time-varying behaviour which is hard to measure with a single spacecraft. However, turbulence is a particularly hard phenomenon to study because it is usually broadband in nature: it covers many scales simultaneously. One must therefore use techniques to extract information on multiple scales in order to quantify plasma turbulence and its effects. The Cluster orbit takes the spacecraft through turbulent regions with a range of characteristics: the solar wind, magnetosheath, cusp and magnetosphere. In each, the nature of the turbulence (strongly driven or fully evolved; dominated by kinetic effects or largely on fluid scales), as well as characteristics of the medium (thermalised or not; high or low plasma beta; sub- or super-Alfvénic) mean that particular techniques are better suited to the analysis of Cluster data in different locations. In this chapter, we consider a range of methods and how they are best applied to these different regions. Perhaps the most studied turbulent space plasma environment is the solar wind; see Bruno and Carbone [2005] and Goldstein et al. [2005] for recent reviews. This is the case for a number of reasons: it is scientifically important for cosmic ray and solar energetic particle scattering and propagation, for example.
However, perhaps the most significant motivations for studying solar wind turbulence are pragmatic: large volumes of high quality measurements are available; the stability of the solar wind on scales of hours makes it possible to identify statistically stationary intervals to analyse; and, most important of all, the solar wind speed, V_SW, is much higher than the local MHD wave speeds. This means that a spacecraft time series is essentially a "snapshot" spatial sample of the plasma along the flow direction, so we can consider measurements at a set of times t_i to be at a set of locations in the plasma given by x_i = V_SW t_i. This approximation, known as Taylor's hypothesis, greatly simplifies the analysis of the data. In contrast, in the magnetosheath the flow speed is lower than the wave speed and therefore temporal changes at the spacecraft are due to a complex combination of the plasma moving over the spacecraft and the turbulent fluctuations propagating in the plasma frame. This is also the case for ion and electron kinetic scale turbulence in the solar wind, and it dramatically complicates the analysis of the data. As a result, the application of multi-spacecraft techniques such as k filtering to Cluster data (see Chapter 5), which make it possible to disentangle the effects of flow and wave propagation, has probably resulted in the greatest increase in our understanding of magnetosheath turbulence rather than of the solar wind. We can therefore summarise the key advantages for plasma turbulence analysis of multi-spacecraft data sets such as those from Cluster, compared to single spacecraft data. Multiple sampling points allow us to measure how the turbulence varies in many directions, and on a range of scales, simultaneously, enabling the study of anisotropy in ways that have not previously been possible.
They also allow us to distinguish between the motion of fluctuations in the plasma and motion of the plasma itself, enabling the study of turbulence in highly disturbed environments such as the magnetosheath. A number of authors have studied turbulence with Cluster data, using different techniques, the choice of which is motivated by the characteristics of the plasma environment in which they are interested. The complexity of both the Cluster data and the problem of turbulence meant that progress early in the mission was rather limited, although in the last few years several key results have been obtained and it is now a rapidly evolving topic. At this point, it is worth noting briefly the scope of this chapter: we discuss multi-spacecraft Cluster results and methods regarding turbulence at fluid, ion and electron scales, with the emphasis on the methods more than the physical significance of the results, but we do not consider more wave-like phenomena such as those in the foreshock. This is an entirely artificial distinction, both in terms of the physics and the analysis methods. Nevertheless, this chapter is intended to be largely self-contained and we refer the reader to other chapters in this book for more information about these related topics. We also stress that this chapter is not in any way intended to be an introduction to, or overview of, the analysis and theory of space plasma turbulence, or even of Cluster results in general: instead, references to review articles are provided where appropriate. Belmont et al. [2006] discussed the application of k filtering to turbulence studies in much greater depth than is presented here and we refer the reader to that paper for more details.
Single-spacecraft analysis of Cluster data is revealing important information about turbulent anisotropy [e.g., Mangeney et al., 2006; Lacombe et al., 2006], dissipation processes [e.g., Bale et al., 2005] and even evidence for reconnection triggered by turbulence [e.g., Retino et al., 2007] but again, we do not discuss these results further here: our emphasis is on multi-spacecraft analysis methods. After fifty years of spacecraft measurements of turbulent space plasmas, many significant questions remain unanswered. Perhaps the three most important, both for our fundamental understanding of plasma turbulence as a process and for quantifying its large scale effects, are: anisotropy due to the presence of a background magnetic field; the nature of the dissipation process; and the origin of the spatial inhomogeneity known as intermittency. All three of these issues have been addressed using Cluster data. We discuss each briefly here in order to provide the context for the methods and results presented in later sections.
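
Taylor's hypothesis, as described above, lets a single-spacecraft time series be treated as a spatial cut through the plasma. A minimal sketch on synthetic data (NumPy assumed; the cadence, speed, and series are invented) maps temporal lags to spatial separations and computes a second-order structure function:

```python
import numpy as np

rng = np.random.default_rng(1)
dt, v_sw = 1.0, 400.0                      # sample cadence [s], solar wind speed [km/s]
b = np.cumsum(rng.standard_normal(4096))   # synthetic magnetic field component

# Taylor's hypothesis: a sample at time t_i maps to position x_i = V_SW * t_i,
# so a temporal lag tau corresponds to a spatial separation V_SW * tau.
def structure_function(series, lag, order=2):
    diffs = series[lag:] - series[:-lag]
    return np.mean(np.abs(diffs) ** order)

for lag in (1, 4, 16, 64):
    print(f"r = {v_sw * lag * dt:8.0f} km   S2 = {structure_function(b, lag):.1f}")
```

The scaling of S2 with separation r is one of the standard single-spacecraft diagnostics; the multi-spacecraft methods of this chapter are needed precisely where this time-to-space mapping breaks down, as in the magnetosheath.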

  10. 252Cf-source-driven neutron noise analysis method

    SciTech Connect

    Mihalczo, J.T.; King, W.T.; Blakeman, E.D.

    1985-01-01

    The 252Cf-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor k_eff has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank typical of a fuel processing or reprocessing plant, the k_eff values were satisfactorily determined for values between 0.92 and 0.5 using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. Further development of the method will require experiments oriented toward particular applications, including dynamic experiments, and the development of theoretical methods to predict the experimental observables.

  11. Survey of Sterile Admixture Practices in Canadian Hospital Pharmacies: Part 1. Methods and Results

    PubMed Central

    Warner, Travis; Nishi, Cesilia; Checkowski, Ryan; Hall, Kevin W.

    2009-01-01

    Background: The 1996 Guidelines for Preparation of Sterile Products in Pharmacies of the Canadian Society of Hospital Pharmacists (CSHP) represent the current standard of practice for sterile compounding in Canada. However, these guidelines are practice recommendations, not enforceable standards. Previous surveys of sterile compounding practices have shown that actual practice deviates markedly from voluntary practice recommendations. In 2004, the United States Pharmacopeia (USP) published its “General Chapter <797> Pharmaceutical Compounding—Sterile Preparations”, which set a more rigorous and enforceable standard for sterile compounding in the United States. Objectives: To assess sterile compounding practices in Canadian hospital pharmacies and to compare them with current CSHP recommendations and USP chapter <797> standards. Methods: An online survey, based on previous studies of sterile compounding practices, the CSHP guidelines, and the chapter <797> standards, was created and distributed to 193 Canadian hospital pharmacies. Results: A total of 133 pharmacies completed at least part of the survey, for a response rate of 68.9%. All respondents reported the preparation of sterile products. Various degrees of deviation from the practice recommendations were noted for virtually all areas of the CSHP guidelines and the USP standards. Low levels of compliance were most notable in the areas of facilities and equipment, process validation, and product testing. Availability in the central pharmacy of a clean room facility meeting or exceeding the criteria of International Organization for Standardization (ISO) class 8 is a requirement of the chapter <797> standards, but more than 40% of responding pharmacies reported that they did not have such a facility. Higher levels of compliance were noted for policies and procedures, garbing requirements, aseptic technique, and handling of hazardous products. 
Part 1 of this series reports the survey methods and results relating to policies, personnel, raw materials, storage and handling, facilities and equipment, and garments. Part 2 will report results relating to preparation of aseptic products, expiry dating, labelling, process validation, product testing and release, documentation, records, and disposal of hazardous pharmaceuticals. It will also highlight some of the key areas where there is considerable opportunity for improvement. Conclusion: This survey identified numerous deficiencies in sterile compounding practices in Canadian hospital pharmacies. Awareness of these deficiencies may create an impetus for critical assessment and improvements in practice. PMID:22478875

  12. PRACTICAL SENSITIVITY AND UNCERTAINTY ANALYSIS TECHNIQUES APPLIED TO AGRICULTURAL SYSTEMS MODELS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We present a practical evaluation framework for analysis of two complex, process-based agricultural system models, WEPP and RZWQM. The evaluation framework combines sensitivity analysis and the uncertainty analysis techniques of first order error analysis (FOA) and Monte Carlo simulation with Latin ...
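
The two uncertainty techniques named in the abstract can be contrasted on a toy model: first order error analysis (FOA) propagates input variances through local derivatives, while Monte Carlo samples the inputs and looks at the output spread directly. Plain random sampling stands in here for Latin hypercube sampling, and the model form and parameter names are illustrative, not from WEPP or RZWQM:

```python
import numpy as np

# Toy response: a yield-like quantity as a nonlinear function of two
# uncertain inputs (purely illustrative).
def yield_model(k, slope):
    return k * slope ** 1.5

k0, slope0 = 2.0, 0.10           # nominal values
sd_k, sd_slope = 0.2, 0.01       # input standard deviations

# FOA: propagate input variances through finite-difference local gradients.
eps = 1e-6
dk = (yield_model(k0 + eps, slope0) - yield_model(k0, slope0)) / eps
ds = (yield_model(k0, slope0 + eps) - yield_model(k0, slope0)) / eps
foa_sd = np.sqrt((dk * sd_k) ** 2 + (ds * sd_slope) ** 2)

# Monte Carlo: sample the inputs and measure the output spread directly.
rng = np.random.default_rng(42)
samples = yield_model(rng.normal(k0, sd_k, 20000),
                      rng.normal(slope0, sd_slope, 20000))
print(f"FOA sd = {foa_sd:.4f}   MC sd = {samples.std(ddof=1):.4f}")
```

For a mildly nonlinear model the two estimates nearly agree; it is when they diverge that the cheap FOA result stops being trustworthy and full sampling is warranted, which is the trade-off a combined evaluation framework exploits.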

  13. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  14. A Sociological Analysis of Science Curriculum and Pedagogic Practices

    ERIC Educational Resources Information Center

    Alves, Vanda; Morais, Ana M.

    2012-01-01

    The study analyses the extent to which the sociological message transmitted by the teachers' pedagogic practice recontextualizes the official pedagogic discourse of the natural sciences curriculum for a Portuguese middle school. Theoretically, the study is based on theories of psychology (e.g. Vygotsky), epistemology (e.g. Ziman) and sociology,…

  15. Honesty in Critically Reflective Essays: An Analysis of Student Practice

    ERIC Educational Resources Information Center

    Maloney, Stephen; Tai, Joanna Hong-Meng; Lo, Kristin; Molloy, Elizabeth; Ilic, Dragan

    2013-01-01

    In health professional education, reflective practice is seen as a potential means for self-improvement from everyday clinical encounters. This study aims to examine the level of student honesty in critical reflection, and barriers and facilitators for students engaging in honest reflection. Third year physiotherapy students, completing summative…

  16. Mentoring Beginning Teachers in Secondary Schools: An Analysis of Practice

    ERIC Educational Resources Information Center

    Harrison, Jennifer; Dymoke, Sue; Pell, Tony

    2006-01-01

    The conditions that promote best practice in the mentoring of beginning teachers in secondary schools are explored in this paper in relation to the experiential model of learning put forward by Kolb [(1984). "Experiential learning: Experience as the source of learning and development." New York: Prentice-Hall]. The underpinning processes of this…

  17. Professional Learning in Rural Practice: A Sociomaterial Analysis

    ERIC Educational Resources Information Center

    Slade, Bonnie

    2013-01-01

    Purpose: This paper aims to examine the professional learning of rural police officers. Design/methodology/approach: This qualitative case study involved interviews and focus groups with 34 police officers in Northern Scotland. The interviews and focus groups were transcribed and analysed, drawing on practice-based and sociomaterial learning…

  18. A Sociological Analysis of Science Curriculum and Pedagogic Practices

    ERIC Educational Resources Information Center

    Alves, Vanda; Morais, Ana M.

    2012-01-01

    The study analyses the extent to which the sociological message transmitted by the teachers' pedagogic practice recontextualizes the official pedagogic discourse of the natural sciences curriculum for a Portuguese middle school. Theoretically, the study is based on theories of psychology (e.g. Vygotsky), epistemology (e.g. Ziman) and sociology,…

  19. A Preliminary Analysis of Early Rhythm and Blues Musical Practices.

    ERIC Educational Resources Information Center

    Meadows, Eddie S.

    1983-01-01

    Presents background information on the evolution of rhythm and blues (R & B) from the 1940s to the 1960s: the origin and naming of selected R & B groups, role of instruments in R & B orchestras, soloist/group vocal practices, and the role that independent record labels played in artists' successes and failures. (Author/ML)

  20. Incorporating Computer-Aided Language Sample Analysis into Clinical Practice

    ERIC Educational Resources Information Center

    Price, Lisa Hammett; Hendricks, Sean; Cook, Colleen

    2010-01-01

    Purpose: During the evaluation of language abilities, the needs of the child are best served when multiple types and sources of data are included in the evaluation process. Current educational policies and practice guidelines further dictate the use of authentic assessment data to inform diagnosis and treatment planning. Language sampling and…

  1. Incorporating Computer-Aided Language Sample Analysis into Clinical Practice

    ERIC Educational Resources Information Center

    Price, Lisa Hammett; Hendricks, Sean; Cook, Colleen

    2010-01-01

    Purpose: During the evaluation of language abilities, the needs of the child are best served when multiple types and sources of data are included in the evaluation process. Current educational policies and practice guidelines further dictate the use of authentic assessment data to inform diagnosis and treatment planning. Language sampling and…

  2. An Analysis of Teacher Practices with Toddlers during Social Conflicts

    ERIC Educational Resources Information Center

    Gloeckler, Lissy R.; Cassell, Jennifer M.; Malkus, Amy J.

    2014-01-01

    Employing a quasi-experimental design, this pilot study on teacher practices with toddlers during social conflicts was conducted in the southeastern USA. Four child-care classrooms, teachers (n?=?8) and children (n?=?51) were assessed with the Classroom Assessment Scoring System -- Toddler [CLASS-Toddler; La Paro, K., Hamre, B. K., & Pianta,…

  4. Honesty in Critically Reflective Essays: An Analysis of Student Practice

    ERIC Educational Resources Information Center

    Maloney, Stephen; Tai, Joanna Hong-Meng; Lo, Kristin; Molloy, Elizabeth; Ilic, Dragan

    2013-01-01

    In health professional education, reflective practice is seen as a potential means for self-improvement from everyday clinical encounters. This study aims to examine the level of student honesty in critical reflection, and barriers and facilitators for students engaging in honest reflection. Third year physiotherapy students, completing summative

  5. Conceptualizing Alberta District Leadership Practices: A Cross-Case Analysis

    ERIC Educational Resources Information Center

    Bedard, George J.; Mombourquette, Carmen P.

    2015-01-01

    We interviewed 45 district-level staff, principals, and trustees in two high-performing Alberta school districts and one rapidly improving district. We asked interviewees to detail the what and the how of key leadership practices used to promote and sustain student achievement, and how these had changed over the last five to ten years. The cross-case findings are…

  7. Practical guidelines to select and scale earthquake records for nonlinear response history analysis of structures

    USGS Publications Warehouse

    Kalkan, Erol; Chopra, Anil K.

    2010-01-01

    Earthquake engineering practice is increasingly using nonlinear response history analysis (RHA) to demonstrate the performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. Presented herein is a modal-pushover-based scaling (MPS) method to scale ground motions for use in nonlinear RHA of buildings and bridges. In the MPS method, the ground motions are scaled to match (to a specified tolerance) a target value of the inelastic deformation of the first-"mode" inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-"mode" pushover analysis. Appropriate for first-"mode" dominated structures, this approach is extended to structures with significant contributions of higher modes by considering the elastic deformation of the second-"mode" SDF system in selecting a subset of the scaled ground motions. Based on results presented for two bridges, covering single- and multi-span "ordinary standard" bridge types, and six buildings, covering low-, mid-, and tall building types in California, the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
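    The core of the idea, scaling a record so that the deformation of an SDF system matches a target value, can be sketched for the elastic case. The following is a minimal sketch, not the MPS procedure itself (which uses the inelastic first-mode SDF system and iterates to a tolerance); the toy record, period, and target deformation are all hypothetical:

```python
import math

def sdf_peak_deformation(accel, dt, period, damping=0.05):
    """Peak deformation of a *linear* SDF oscillator under a ground
    acceleration record, integrated by central differences (zero initial
    displacement and velocity assumed)."""
    w = 2 * math.pi / period
    m, c, k = 1.0, 2 * damping * w, w * w
    a0 = m / dt**2 + c / (2 * dt)
    a1 = k - 2 * m / dt**2
    a2 = m / dt**2 - c / (2 * dt)
    u_prev = u = 0.0
    peak = 0.0
    for ag in accel:
        # central-difference step for m*u'' + c*u' + k*u = -m*ag
        u_next = (-m * ag - a1 * u - a2 * u_prev) / a0
        u_prev, u = u, u_next
        peak = max(peak, abs(u))
    return peak

# Hypothetical toy record (a 1.5 Hz sine) and target deformation; because
# this simplified system is linear, a single scaling step suffices:
dt, period = 0.01, 1.0
accel = [0.3 * math.sin(2 * math.pi * 1.5 * n * dt) for n in range(1000)]
target = 0.05
scale = target / sdf_peak_deformation(accel, dt, period)
```

    For the inelastic SDF system of the actual method, response no longer scales linearly with the record, which is why MPS iterates the scale factor until the target deformation is matched within tolerance.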

  8. A Comparative Analysis of Ethnomedicinal Practices for Treating Gastrointestinal Disorders Used by Communities Living in Three National Parks (Korea)

    PubMed Central

    Kim, Hyun; Song, Mi-Jang; Brian, Heldenbrand; Choi, Kyoungho

    2014-01-01

    The purpose of this study is to comparatively analyze the ethnomedicinal practices for gastrointestinal disorders within communities in Jirisan National Park, Gayasan National Park, and Hallasan National Park of Korea. Data were collected through participant observations and in-depth interviews with semistructured questionnaires. Comparative analysis was carried out using the informant consensus factor, fidelity level, and internetwork analysis. A total of 490 ethnomedicinal practices recorded from the communities were classified into 110 families, 176 genera, and 220 species, including plants, animals, fungi, and algae. The informant consensus factor values in the disorder categories were highest for enteritis and gastralgia (1.0), followed by indigestion (0.94), constipation (0.93), and abdominal pain and gastroenteric trouble (0.92). In terms of fidelity levels, 71 plant species showed fidelity levels of 100%. In the internetwork analysis between disorders and medicinal species, the practices grouped centrally into four categories: indigestion, diarrhea, abdominal pain, and gastroenteric trouble. The comparative analysis methods used here will contribute to preserving orally transmitted ethnomedicinal knowledge, and the use of internetwork analysis as an analytical tool provides informative maps of the connections between gastrointestinal disorders and medicinal species. PMID:25202330

  9. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
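    With attacker-effort weights on the edges, identifying a high-risk attack path reduces to a shortest-path search over the attack graph. A minimal sketch of that reduction (this is an illustration, not the patented tool; the attack states and effort weights below are hypothetical):

```python
import heapq

def min_effort_path(graph, start, goal):
    """Dijkstra over an attack graph whose edge weights model attacker
    effort; the cheapest start->goal path is a high-risk attack path."""
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            # reconstruct the path by walking predecessors back to start
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return d, path[::-1]
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, effort in graph.get(node, {}).items():
            nd = d + effort
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    return float("inf"), []

# Hypothetical attack states and effort weights:
graph = {
    "outside":    {"web_server": 2, "vpn": 5},
    "web_server": {"db_host": 4},
    "vpn":        {"db_host": 1},
    "db_host":    {"domain_admin": 3},
}
effort, path = min_effort_path(graph, "outside", "domain_admin")
```

    In practice the weights could instead encode likelihood of success or time to succeed, and near-optimal ("epsilon optimal") paths rather than only the single cheapest one would be collected as candidates for countermeasures.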

  10. Research methods to change clinical practice for patients with rare cancers.

    PubMed

    Billingham, Lucinda; Malottki, Kinga; Steven, Neil

    2016-02-01

    Rare cancers are a growing group as a result of reclassification of common cancers by molecular markers. There is therefore an increasing need to identify methods to assess interventions that are sufficiently robust to potentially affect clinical practice in this setting. Methods advocated for clinical trials in rare diseases are not necessarily applicable in rare cancers. This Series paper describes research methods that are relevant for rare cancers in relation to the range of incidence levels. Strategies that maximise recruitment, minimise sample size, or maximise the usefulness of the evidence could enable the application of conventional clinical trial design to rare cancer populations. Alternative designs that address specific challenges for rare cancers with the aim of potentially changing clinical practice include Bayesian designs, uncontrolled n-of-1 trials, and umbrella and basket trials. Pragmatic solutions must be sought to enable some level of evidence-based health care for patients with rare cancers. PMID:26868356

  11. Gap analysis: Concepts, methods, and recent results

    USGS Publications Warehouse

    Jennings, M.D.

    2000-01-01

    Rapid progress is being made in the conceptual, technical, and organizational requirements for generating synoptic multi-scale views of the earth's surface and its biological content. Using the spatially comprehensive data that are now available, researchers, land managers, and land-use planners can, for the first time, quantitatively place landscape units - from general categories such as 'Forests' or 'Cold-Deciduous Shrubland Formation' to more specific categories such as 'Picea glauca-Abies balsamea-Populus spp. Forest Alliance' - in their large-area contexts. The National Gap Analysis Program (GAP) has developed the technical and organizational capabilities necessary for the regular production and analysis of such information. This paper provides a brief overview of concepts and methods as well as some recent results from the GAP projects. Clearly, new frameworks for biogeographic information and organizational cooperation are needed if we are to have any hope of documenting the full range of species occurrences and ecological processes in ways meaningful to their management. The GAP experience provides one model for achieving these new frameworks.

  12. Inverse Langmuir method for oligonucleotide microarray analysis

    PubMed Central

    Mulders, Geert CWM; Barkema, Gerard T; Carlon, Enrico

    2009-01-01

    Background An algorithm for the analysis of Affymetrix Genechips is presented. This algorithm, referred to as the Inverse Langmuir Method (ILM), estimates the binding of transcripts to complementary probes using DNA/RNA hybridization free energies, and the hybridization between partially complementary transcripts in solution using RNA/RNA free energies. The balance between these two competing reactions allows for the translation of background-subtracted intensities into transcript concentrations. Results To validate the ILM, it is applied to publicly available microarray data from a multi-lab comparison study. Here, microarray experiments are performed on samples which deviate only in few genes. The log2 fold change between these two samples, as obtained from RT-PCR experiments, agrees well with the log2 fold change as obtained with the ILM, indicating that the ILM determines changes in the expression level accurately. We also show that the ILM allows for the identification of outlying probes, as it yields independent concentration estimates per probe. Conclusion The ILM is robust and offers an interesting alternative to purely statistical algorithms for microarray data analysis. PMID:19232092
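    The translation of background-subtracted intensities into transcript concentrations rests on inverting a Langmuir-type binding isotherm. A simplified sketch of that inversion is shown below; the saturation level and half-saturation constant are hypothetical placeholders, whereas the actual ILM derives the per-probe parameters from DNA/RNA and RNA/RNA hybridization free energies:

```python
def langmuir_intensity(c, A, K):
    """Langmuir isotherm: probe intensity saturates at A as transcript
    concentration c grows; K is the half-saturation constant."""
    return A * c / (c + K)

def inverse_langmuir(intensity, A, K):
    """Invert the isotherm to recover a concentration estimate from a
    background-subtracted intensity (simplified sketch of the ILM idea)."""
    if not 0 <= intensity < A:
        raise ValueError("intensity must lie below the saturation level A")
    return K * intensity / (A - intensity)

A, K = 10_000.0, 50.0   # hypothetical saturation level and constant
c_true = 12.5
I = langmuir_intensity(c_true, A, K)
c_est = inverse_langmuir(I, A, K)   # recovers c_true up to rounding
```

    Because each probe yields its own concentration estimate, probes whose estimates disagree strongly with the rest of the probe set can be flagged as outliers, as the abstract notes.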

  13. Method and apparatus for frequency spectrum analysis

    NASA Technical Reports Server (NTRS)

    Cole, Steven W. (Inventor)

    1992-01-01

    A method for frequency spectrum analysis of an unknown signal in real time is discussed. The method is based upon integration of 1-bit samples of signal voltage amplitude corresponding to sine or cosine phases of a controlled center-frequency clock which is changed after each integration interval to sweep the frequency range of interest in steps. Integration of samples during each interval is carried out over a number of cycles of the center-frequency clock spanning a number of cycles of the input signal to be analyzed. The invention may be used to detect the frequencies of at least two signals simultaneously. By adding a reference signal of known frequency and voltage amplitude to the two signals for parallel processing in the same way, but in a different channel sampled at the known frequency and phases of the reference signal, the absolute voltage amplitudes of the other two signals may be determined: the sine and cosine integrals of each channel are squared and summed to obtain relative power measurements in all three channels, and from the known voltage amplitude of the reference signal an absolute voltage measurement for each of the other two signals is obtained by multiplying the known voltage of the reference signal by the ratio of that signal's relative power to the relative power of the reference signal.
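    The integrate-and-square scheme can be illustrated in software. The following is a toy sketch of the idea, not the patented hardware design; the sample rate, test frequencies, and signal are all assumed for the example:

```python
import math

def one_bit_power(signal_fn, f_test, fs, n_samples):
    """Integrate 1-bit (sign) samples of the signal against the sine and
    cosine phases of a test-frequency clock; the sum of the squared
    integrals is a relative power measure at that frequency."""
    i_sum = q_sum = 0
    for n in range(n_samples):
        t = n / fs
        s = 1 if signal_fn(t) >= 0 else -1          # 1-bit input sample
        ph = 2 * math.pi * f_test * t
        i_sum += s * (1 if math.sin(ph) >= 0 else -1)  # sine phase
        q_sum += s * (1 if math.cos(ph) >= 0 else -1)  # cosine phase
    return i_sum ** 2 + q_sum ** 2

# Sweep a few frequency steps over a 440 Hz test tone of unknown phase:
fs, n = 10_000.0, 2000
sig = lambda t: math.sin(2 * math.pi * 440.0 * t + 0.3)
powers = {f: one_bit_power(sig, f, fs, n) for f in (220.0, 440.0, 880.0)}
best = max(powers, key=powers.get)
```

    The frequency step matching the input dominates the power measure; in the patented scheme a third channel carrying a reference tone of known amplitude then converts these relative powers into absolute voltages.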

  14. Integrated force method versus displacement method for finite element analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Berke, Laszlo; Gallagher, Richard H.

    1990-01-01

    A novel formulation termed the integrated force method (IFM) has been developed in recent years for analyzing structures. In this method all the internal forces are taken as independent variables, and the system equilibrium equations (EE's) are integrated with the global compatibility conditions (CC's) to form the governing set of equations. In IFM the CC's are obtained from the strain formulation of St. Venant, and no choices of redundant load systems have to be made, in contrast to the standard force method (SFM). This property of IFM allows the generation of the governing equation to be automated straightforwardly, as it is in the popular stiffness method (SM). In this report IFM and SM are compared relative to the structure of their respective equations, their conditioning, required solution methods, overall computational requirements, and convergence properties as these factors influence the accuracy of the results. Overall, this new version of the force method produces more accurate results than the stiffness method for comparable computational cost.

  15. Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis

    PubMed Central

    Critchfield, Thomas S

    2011-01-01

    Neither practitioners nor scientists appear to be fully satisfied with the world's largest behavior-analytic membership organization. Each community appears to believe that initiatives that serve the other will undermine the association's capacity to serve their own needs. Historical examples suggest that such discord is predicted when practitioners and scientists cohabit the same association. This is true because all professional associations exist to address guild interests, and practice and science are different professions with different guild interests. No association, therefore, can succeed in being all things to all people. The solution is to assure that practice and science communities are well served by separate professional associations. I comment briefly on how this outcome might be promoted. PMID:22532750

  16. STUDY ON PRACTICAL ULTRASONIC INSPECTION METHOD FOR FATIGUE CRACKS IN STEEL ORTHOTROPIC DECKPLATES

    NASA Astrophysics Data System (ADS)

    Murakoshi, Jun; Takahashi, Minoru; Koike, Mitsuhiro; Kimura, Tomonori

    Fatigue cracks have recently been reported at the weld root of deckplate-U rib connections in orthotropic steel deck bridges on heavy-traffic routes. These cracks propagate toward the upper surface of the deckplate, which makes them difficult to detect during visual inspection. With the purpose of developing a reliable and practical non-destructive inspection method for these cracks, this paper discusses an ultrasonic testing method using SV waves generated by a critical-angle beam probe. A reliable technique for sensitivity calibration is proposed. Based on ultrasonic testing of fatigue-crack specimens and a damaged deckplate on an actual bridge, the applicability of the proposed ultrasonic inspection method was confirmed.

  17. A practical method to determine the heating and cooling curves of x-ray tube assemblies

    SciTech Connect

    Bottaro, M.; Moralles, M.; Viana, V.; Donatiello, G. L.; Silva, E. P.

    2007-10-15

    A practical method to determine the heating and cooling curves of x-ray tube assemblies with rotating-anode x-ray tubes is proposed. Available procedures to obtain these curves, as described in the literature, are performed during operation of the equipment, and their precision depends on knowledge of the total energy applied to the system. In the present work we describe procedures which use a calorimetric system and do not require operation of the x-ray equipment. The method was applied successfully to an x-ray tube assembly that was under test in our laboratory.

  18. Meaning and challenges in the practice of multiple therapeutic massage modalities: a combined methods study

    PubMed Central

    2011-01-01

    Background Therapeutic massage and bodywork (TMB) practitioners are predominantly trained in programs that are not uniformly standardized, and in variable combinations of therapies. To date no studies have explored this variability in training and how this affects clinical practice. Methods Combined methods, consisting of a quantitative, population-based survey and qualitative interviews with practitioners trained in multiple therapies, were used to explore the training and practice of TMB practitioners in Alberta, Canada. Results Of the 5242 distributed surveys, 791 were returned (15.1%). Practitioners were predominantly female (91.7%), worked in a range of environments, primarily private (44.4%) and home clinics (35.4%), and were not significantly different from other surveyed massage therapist populations. Seventy-seven distinct TMB therapies were identified. Most practitioners were trained in two or more therapies (94.4%), with a median of 8 and range of 40 therapies. Training programs varied widely in number and type of TMB components, training length, or both. Nineteen interviews were conducted. Participants described highly variable training backgrounds, resulting in practitioners learning unique combinations of therapy techniques. All practitioners reported providing individualized patient treatment based on a responsive feedback process throughout practice that they described as being critical to appropriately address the needs of patients. They also felt that research treatment protocols were different from clinical practice because researchers do not usually sufficiently acknowledge the individualized nature of TMB care provision. Conclusions The training received, the number of therapies trained in, and the practice descriptors of TMB practitioners are all highly variable. In addition, clinical experience and continuing education may further alter or enhance treatment techniques. 
Practitioners individualize each patient's treatment through a highly adaptive process. Therefore, treatment provision is likely unique to each practitioner. These results may be of interest to researchers considering similar practice issues in other professions. The use of a combined-methods design effectively captured this complexity of TMB practice. TMB research needs to consider research approaches that can capture or adapt to the individualized nature of practice. PMID:21929823

  19. Structural and practical identifiability analysis of S-system.

    PubMed

    Zhan, Choujun; Li, Benjamin Yee Shing; Yeung, Lam Fat

    2015-12-01

    In the field of systems biology, biological reaction networks are usually modelled by ordinary differential equations. A sub-class, the S-system representation, is a widely used form of modelling. Existing S-system identification techniques assume that the system itself is always structurally identifiable. However, due to practical limitations, biological reaction networks are often only partially measured. In addition, the captured data cover only a limited trajectory, and can therefore be considered only a local snapshot of the system responses with respect to the complete set of state trajectories over the entire state space. Hence the estimated model can reflect only partial system dynamics and may not be unique. To improve the identification quality, the structural and practical identifiability of the S-system are studied. The S-system is shown to be identifiable under a set of assumptions. An application to the yeast fermentation pathway was then conducted. Two case studies were chosen: the first is based on a larger set of state trajectories and the second on a smaller one. By expanding the dataset to span a relatively larger state space, the uncertainty of the estimated system can be reduced. The results indicated that the initial concentration is related to practical identifiability. PMID:26577163
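    An S-system models each state variable as a difference of two power-law terms, dx_i/dt = alpha_i * prod_j x_j^g_ij - beta_i * prod_j x_j^h_ij. A minimal simulation sketch follows, using a hypothetical two-variable system (not the yeast fermentation pathway from the paper); identification would fit the alpha, beta, G, H parameters to measured trajectories like the one generated here:

```python
def s_system_rhs(x, alpha, beta, G, H):
    """S-system right-hand side:
    dx_i/dt = alpha_i * prod_j x_j**G[i][j] - beta_i * prod_j x_j**H[i][j]."""
    n = len(x)
    dx = []
    for i in range(n):
        prod_g = prod_h = 1.0
        for j in range(n):
            prod_g *= x[j] ** G[i][j]
            prod_h *= x[j] ** H[i][j]
        dx.append(alpha[i] * prod_g - beta[i] * prod_h)
    return dx

def euler_simulate(x0, alpha, beta, G, H, dt=1e-3, steps=10_000):
    """Forward-Euler trajectory of the S-system from initial state x0."""
    x = list(x0)
    traj = [list(x)]
    for _ in range(steps):
        dx = s_system_rhs(x, alpha, beta, G, H)
        x = [xi + dt * dxi for xi, dxi in zip(x, dx)]
        traj.append(list(x))
    return traj

# Hypothetical two-species cascade; its steady state is x1 = x2 = 2:
alpha, beta = [2.0, 1.0], [1.0, 1.0]
G = [[0.0, -0.5], [1.0, 0.0]]
H = [[0.5, 0.0], [0.0, 1.0]]
traj = euler_simulate([1.0, 1.0], alpha, beta, G, H)
```

    The abstract's identifiability point shows up directly in such experiments: if the trajectory covers only a small region of state space, many different (alpha, beta, G, H) combinations reproduce it equally well.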

  20. A cross-sectional mixed methods study protocol to generate learning from patient safety incidents reported from general practice

    PubMed Central

    Carson-Stevens, Andrew; Hibbert, Peter; Avery, Anthony; Butlin, Amy; Carter, Ben; Cooper, Alison; Evans, Huw Prosser; Gibson, Russell; Luff, Donna; Makeham, Meredith; McEnhill, Paul; Panesar, Sukhmeet S; Parry, Gareth; Rees, Philippa; Shiels, Emma; Sheikh, Aziz; Ward, Hope Olivia; Williams, Huw; Wood, Fiona; Donaldson, Liam; Edwards, Adrian

    2015-01-01

    Introduction Incident reports contain descriptions of errors and harms that occurred during clinical care delivery. Few observational studies have characterised incidents from general practice, and none of these have been from the England and Wales National Reporting and Learning System. This study aims to describe incidents reported from a general practice care setting. Methods and analysis A general practice patient safety incident classification will be developed to characterise patient safety incidents. A weighted-random sample of 12 500 incidents describing no harm, low harm and moderate harm of patients, and all incidents describing severe harm and death of patients will be classified. Insights from exploratory descriptive statistics and thematic analysis will be combined to identify priority areas for future interventions. Ethics and dissemination The need for ethical approval was waived by the Aneurin Bevan University Health Board research risk review committee given the anonymised nature of data (ABHB R&D Ref number: SA/410/13). The authors will submit the results of the study to relevant journals and undertake national and international oral presentations to researchers, clinicians and policymakers. PMID:26628526

  1. Accommodations for Patients with Disabilities in Primary Care: A Mixed Methods Study of Practice Administrators

    PubMed Central

    Pharr, Jennifer R

    2014-01-01

    Structural barriers that limit access to health care services for people with disabilities have been identified through qualitative studies; however, little is known about how patients with disabilities are accommodated in the clinical setting when a structural barrier is encountered. The purpose of this study was to identify how primary care medical practices in the United States accommodated people with disabilities when a barrier to service was encountered. Primary care practice administrators from a medical management organization were identified through the organization’s website. Sixty-three administrators from across the US participated in this study. Practice administrators reported that patients were examined in their wheelchairs (76%), that parts of the exam were skipped when a barrier was encountered (44%), that patients were asked to bring someone with them (52.4%), or that patients were refused treatment due to an inaccessible clinic (3.2%). These methods of accommodation would not be in compliance with requirements of the Americans with Disabilities Act. There was not a significant difference (p>0.05) in accommodations for patients with disabilities between administrators who could describe the application of the ADA to their clinic and those who could not. Practice administrators need a comprehensive understanding of the array of challenges encountered by patients with disabilities throughout the health care process, and of how best to accommodate patients with disabilities in their practice. PMID:24373261

  2. Comparative analysis of the methods for SADT determination.

    PubMed

    Kossoy, A A; Sheinman, I Ya

    2007-04-11

    The self-accelerating decomposition temperature (SADT) is an important parameter that characterizes thermal safety in the transport of self-reactive substances. A great many articles have been published on various methodological aspects of SADT determination. Nevertheless, several serious problems remain that require further analysis and solution; some of them are considered in this paper. Firstly, the four methods suggested by the United Nations "Recommendations on the Transport of Dangerous Goods" (TDG) are surveyed in order to reveal their features and limitations. The inconsistency between two definitions of SADT is discussed afterwards: one definition is the basis for the US SADT test and the heat accumulation storage test (Dewar test), while the other is used when the Adiabatic storage test or the Isothermal storage test is applied. It is shown that this inconsistency may result in different and, in some cases, unsafe estimates of SADT. Then the applicability of the Dewar test for determination of SADT for solids is considered. It is shown that this test can be applied to solids only in a restricted way, provided that an appropriate scale-up procedure is available. An advanced method based on the theory of regular cooling mode is proposed, which ensures more reliable results from application of the Dewar test. The last part of the paper demonstrates how the kinetics-based simulation method helps in evaluating SADT in those complex but practical cases (in particular, stacks of packagings) where none of the methods recommended by TDG can be used. PMID:16889892

  3. Flutter and Divergence Analysis using the Generalized Aeroelastic Analysis Method

    NASA Technical Reports Server (NTRS)

    Edwards, John W.; Wieseman, Carol D.

    2003-01-01

    The Generalized Aeroelastic Analysis Method (GAAM) is applied to the analysis of three well-studied checkcases: restrained and unrestrained airfoil models, and a wing model. An eigenvalue iteration procedure is used for converging upon roots of the complex stability matrix. For the airfoil models, exact root loci are given which clearly illustrate the nature of the flutter and divergence instabilities. The singularities involved are enumerated, including an additional pole at the origin for the unrestrained airfoil case and the emergence of an additional pole on the positive real axis at the divergence speed for the restrained airfoil case. Inconsistencies and differences among published aeroelastic root loci and the new, exact results are discussed and resolved. The generalization of a Doublet Lattice Method computer code is described and the code is applied to the calculation of root loci for the wing model for incompressible and for subsonic flow conditions. The error introduced in the reduction of the singular integral equation underlying the unsteady lifting surface theory to a linear algebraic equation is discussed. Acknowledging this inherent error, the solutions of the algebraic equation by GAAM are termed 'exact.' The singularities of the problem are discussed and exponential series approximations used in the evaluation of the kernel function shown to introduce a dense collection of poles and zeroes on the negative real axis. Again, inconsistencies and differences among published aeroelastic root loci and the new 'exact' results are discussed and resolved. In all cases, aeroelastic flutter and divergence speeds and frequencies are in good agreement with published results. The GAAM solution procedure allows complete control over Mach number, velocity, density, and complex frequency. 
Thus all points on the computed root loci can be matched-point, consistent solutions without recourse to complex mode tracking logic or dataset interpolation, as in the k and p-k solution methods.

  4. Infant-feeding practices among African American women: social-ecological analysis and implications for practice.

    PubMed

    Reeves, Elizabeth A; Woods-Giscombé, Cheryl L

    2015-05-01

    Despite extensive evidence supporting the health benefits of breastfeeding, significant disparities exist between rates of breastfeeding among African American women and women of other races. Increasing rates of breastfeeding among African American women can contribute to the improved health of the African American population by decreasing rates of infant mortality and disease and by enhancing cognitive development. Additionally, higher rates of breastfeeding among African American women could foster maternal-child bonding and could contribute to stronger families, healthier relationships, and emotionally healthier adults. The purpose of this article is twofold: (a) to use the social-ecological model to explore the personal, socioeconomic, psychosocial, and cultural factors that affect the infant feeding decision-making processes of African American women and (b) to discuss the implications of these findings for clinical practice and research to eliminate current disparities in rates of breastfeeding. PMID:24810518

  5. Comparison and Cost Analysis of Drinking Water Quality Monitoring Requirements versus Practice in Seven Developing Countries

    PubMed Central

    Crocker, Jonny; Bartram, Jamie

    2014-01-01

    Drinking water quality monitoring programs aim to support provision of safe drinking water by informing water quality management. Little evidence or guidance exists on best monitoring practices for low resource settings. Lack of financial, human, and technological resources reduce a country’s ability to monitor water supply. Monitoring activities were characterized in Cambodia, Colombia, India (three states), Jordan, Peru, South Africa, and Uganda according to water sector responsibilities, monitoring approaches, and marginal cost. The seven study countries were selected to represent a range of low resource settings. The focus was on monitoring of microbiological parameters, such as E. coli, coliforms, and H2S-producing microorganisms. Data collection involved qualitative and quantitative methods. Across seven study countries, few distinct approaches to monitoring were observed, and in all but one country all monitoring relied on fixed laboratories for sample analysis. Compliance with monitoring requirements was highest for operational monitoring of large water supplies in urban areas. Sample transport and labor for sample collection and analysis together constitute approximately 75% of marginal costs, which exclude capital costs. There is potential for substantive optimization of monitoring programs by considering field-based testing and by fundamentally reconsidering monitoring approaches for non-piped supplies. This is the first study to look quantitatively at water quality monitoring practices in multiple developing countries. PMID:25046632

  6. Health Education Specialist Practice Analysis 2015 (HESPA 2015): Process and Outcomes.

    PubMed

    McKenzie, James F; Dennis, Dixie; Auld, M Elaine; Lysoby, Linda; Doyle, Eva; Muenzen, Patricia M; Caro, Carla M; Kusorgbor-Narh, Cynthia S

    2016-06-01

    The Health Education Specialist Practice Analysis 2015 (HESPA 2015) was conducted to update and validate the Areas of Responsibilities, Competencies, and Sub-competencies for Entry- and Advanced-Level Health Education Specialists. Two data collection instruments were developed-one was focused on Sub-competencies and the other on knowledge items related to the practice of health education. Instruments were administered to health education specialists (N = 3,152) using online survey methods. A total of 2,508 survey participants used 4-point ordinal scales to rank Sub-competencies by frequency of use and importance. The other 644 participants used the same 4-point frequency scale to rank related knowledge items. Composite scores for Sub-competencies were calculated and subgroup comparisons were conducted that resulted in the validation of 7 Areas of Responsibilities, 36 Competencies, and 258 Sub-competencies. Of the Sub-competencies, 141 were identified as Entry-level, 76 Advanced 1-level, and 41 Advanced 2-level. In addition, 131 knowledge items were verified. The HESPA 2015 findings are compared with the results of the Health Education Job Analysis 2010 and will be useful to those involved in professional preparation, continuing education, and employment of health education specialists. PMID:27107427

  7. Comparison and cost analysis of drinking water quality monitoring requirements versus practice in seven developing countries.

    PubMed

    Crocker, Jonny; Bartram, Jamie

    2014-07-01

    Drinking water quality monitoring programs aim to support provision of safe drinking water by informing water quality management. Little evidence or guidance exists on best monitoring practices for low resource settings. Lack of financial, human, and technological resources reduce a country's ability to monitor water supply. Monitoring activities were characterized in Cambodia, Colombia, India (three states), Jordan, Peru, South Africa, and Uganda according to water sector responsibilities, monitoring approaches, and marginal cost. The seven study countries were selected to represent a range of low resource settings. The focus was on monitoring of microbiological parameters, such as E. coli, coliforms, and H2S-producing microorganisms. Data collection involved qualitative and quantitative methods. Across seven study countries, few distinct approaches to monitoring were observed, and in all but one country all monitoring relied on fixed laboratories for sample analysis. Compliance with monitoring requirements was highest for operational monitoring of large water supplies in urban areas. Sample transport and labor for sample collection and analysis together constitute approximately 75% of marginal costs, which exclude capital costs. There is potential for substantive optimization of monitoring programs by considering field-based testing and by fundamentally reconsidering monitoring approaches for non-piped supplies. This is the first study to look quantitatively at water quality monitoring practices in multiple developing countries. PMID:25046632

  8. Comparing the Effect of Concept Mapping and Conventional Methods on Nursing Students’ Practical Skill Score

    PubMed Central

    Rasoul Zadeh, Nasrin; Sadeghi Gandomani, Hamidreza; Delaram, Masoumeh; Parsa Yekta, Zohre

    2015-01-01

    Background: Development of practical skills has remained a serious and considerable challenge in nursing education. Moreover, newly graduated nurses may have weak practical skills, which can be a threat to patients’ safety. Objectives: The present study was conducted to compare the effect of concept mapping and conventional methods on nursing students’ practical skills. Patients and Methods: This quasi-experimental study was conducted on 70 nursing students randomly assigned into two groups of 35 people. The intervention group was taught through the concept mapping method, while the control group was taught using the conventional method. A two-part instrument was used, including a demographic information form and a checklist for direct observation of procedural skills. Descriptive statistics, chi-square, independent samples t-tests and paired t-tests were used to analyze the data. Results: Before education, no significant differences were observed between the two groups in the three skills of cleaning (P = 0.251), injection (P = 0.185) and sterilizing (P = 0.568). The students’ mean scores increased significantly after the education, and the differences between pre- and post-intervention mean scores were significant in both groups (P < 0.001). However, after education, the mean scores of the intervention group were significantly higher than those of the control group in all three skills (P < 0.001). Conclusions: Concept mapping was superior to conventional skill teaching methods. Its use is suggested in teaching practical courses such as fundamentals of nursing. PMID:26576441

  9. Developing a preliminary ‘never event’ list for general practice using consensus-building methods

    PubMed Central

    de Wet, Carl; O’Donnell, Catherine; Bowie, Paul

    2014-01-01

    Background The ‘never event’ concept has been implemented in many acute hospital settings to help prevent serious patient safety incidents. Benefits include increasing awareness of highly important patient safety risks among the healthcare workforce, promoting proactive implementation of preventive measures, and facilitating incident reporting. Aim To develop a preliminary list of never events for general practice. Design and setting Application of a range of consensus-building methods in Scottish and UK general practices. Method A total of 345 general practice team members suggested potential never events. Next, ‘informed’ staff (n =15) developed criteria for defining never events and applied the criteria to create a list of candidate never events. Finally, UK primary care patient safety ‘experts’ (n = 17) reviewed, refined, and validated a preliminary list via a modified Delphi group and by completing a content validity index exercise. Results There were 721 written suggestions received as potential never events. Thematic categorisation reduced this to 38. Five criteria specific to general practice were developed and applied to produce 11 candidate never events. The expert group endorsed a preliminary list of 10 items with a content validity index (CVI) score of >80%. Conclusion A preliminary list of never events was developed for general practice through practitioner experience and consensus-building methods. This is an important first step to determine the potential value of the never event concept in this setting. It is now intended to undertake further testing of this preliminary list to assess its acceptability, feasibility, and potential usefulness as a safety improvement intervention. PMID:24567655

  10. Searching Usenet for Virtual Communities of Practice: Using Mixed Methods to Identify the Constructs of Wenger's Theory

    ERIC Educational Resources Information Center

    Murillo, Enrique

    2008-01-01

    Introduction: This research set out to determine whether communities of practice can be entirely Internet-based by formally applying Wenger's theoretical framework to Internet collectives. Method: A model of a virtual community of practice was developed which included the constructs Wenger identified in co-located communities of practice: mutual…

  11. Empirical Analysis of Green Supply Chain Management Practices in Indian Automobile Industry

    NASA Astrophysics Data System (ADS)

    Luthra, S.; Garg, D.; Haleem, A.

    2014-04-01

    Environmental sustainability and green environmental issues have an increasing popularity among researchers and supply chain practitioners. An attempt has been made to identify and empirically analyze green supply chain management (GSCM) practices in Indian automobile industry. Six main GSCM practices (having 37 sub practices) and four expected performance outcomes (having 16 performances) have been identified by implementing GSCM practices from literature review. Questionnaire based survey has been made to validate these practices and performance outcomes. 123 complete questionnaires were collected from Indian automobile organizations and used for empirical analysis of GSCM practices in Indian automobile industry. Descriptive statistics have been used to know current implementation status of GSCM practices in Indian automobile industry and multiple regression analysis has been carried out to know the impact on expected organizational performance outcomes by current GSCM practices adopted by Indian automobile industry. The results of study suggested that environmental, economic, social and operational performances improve with the implementation of GSCM practices. This paper may play an important role to understand various GSCM implementation issues and help practicing managers to improve their performances in the supply chain.

  12. Common Goals for the Science and Practice of Behavior Analysis: A Response to Critchfield

    ERIC Educational Resources Information Center

    Schneider, Susan M.

    2012-01-01

    In his scholarly and thoughtful article, "Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis," Critchfield (2011) discussed the science-practice frictions to be expected in any professional organization that attempts to combine these interests. He suggested that the Association for Behavior Analysis…

  13. Introducing and Integrating Gifted Education into an Existing Independent School: An Analysis of Practice

    ERIC Educational Resources Information Center

    McKibben, Stephen

    2013-01-01

    In this analysis of practice, I conduct a combination formative and summative program evaluation of an initiative introduced to serve gifted learners at The Ocean School (TOS), an independent, Pre-K-grade 8 day school located in a rural area of the West Coast. Using the best practices as articulated by the National Association of Gifted Children…

  14. Nursing Faculty Decision Making about Best Practices in Test Construction, Item Analysis, and Revision

    ERIC Educational Resources Information Center

    Killingsworth, Erin Elizabeth

    2013-01-01

    With the widespread use of classroom exams in nursing education there is a great need for research on current practices in nursing education regarding this form of assessment. The purpose of this study was to explore how nursing faculty members make decisions about using best practices in classroom test construction, item analysis, and revision in…

  15. A Novel Method for Dissolved Phosphorus Analysis

    NASA Astrophysics Data System (ADS)

    Berry, J. M.; Spiese, C. E.

    2012-12-01

    High phosphorus loading is a major problem in the Great Lakes watershed. Phosphate enters waterways via both point and non-point sources (e.g., runoff, tile drainage, etc.), promoting eutrophication, and ultimately leading to algal blooms, hypoxia and loss of aquatic life. Quantification of phosphorus loading is typically done using the molybdenum blue method, which is known to have significant drawbacks. The molybdenum blue method requires strict control on time, involves toxic reagents that have limited shelf-life, and is generally unable to accurately measure sub-micromolar concentrations. This study aims to develop a novel reagent that will overcome many of these problems. Ethanolic europium(III) chloride and 8-hydroxyquinoline-5-sulfonic acid (hqs) were combined to form the bis-hqs complex (Eu-hqs). Eu-hqs was synthesized as the dipotassium salt via a simple one-pot procedure. This complex was found to be highly fluorescent (λex = 360 nm, λem = 510 nm) and exhibited a linear response upon addition of monohydrogen phosphate. The linear response ranged from 0.5 - 25 μM HPO42- (15.5 - 775 μg P L-1). It was also determined that Eu-hqs formed a 1:1 complex with phosphate. Maximum fluorescence was found at a pH of 8.50, and few interferences from other ions were found. Shelf-life of the reagent was at least one month, twice as long as most of the molybdenum blue reagent formulations. In the future, field tests will be undertaken in local rivers, lakes, and wetlands to determine the applicability of the complex to real-world analysis.
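    The linear fluorescence response reported above lends itself to a standard calibration-curve workflow: fit intensity against known standards, then invert the fit for unknowns. A minimal sketch of that arithmetic (all readings, and hence the fitted slope and intercept, are hypothetical, not values from the study):

```python
import numpy as np

# Hypothetical calibration standards (uM HPO4^2-) and fluorescence readings
conc = np.array([0.5, 5.0, 10.0, 15.0, 20.0, 25.0])
fluor = np.array([12.0, 98.0, 195.0, 290.0, 388.0, 482.0])

# Least-squares fit of the linear response: fluorescence = slope * conc + intercept
slope, intercept = np.polyfit(conc, fluor, 1)

# Quantify an unknown sample from its fluorescence reading
unknown_fluor = 150.0
unknown_conc = (unknown_fluor - intercept) / slope

# Convert uM phosphate to ug P per liter (molar mass of P ~= 30.97 g/mol);
# the same conversion links the abstract's 0.5-25 uM and 15.5-775 ug P/L ranges
ug_P_per_L = unknown_conc * 30.97
```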

  16. Ad hoc supervision of general practice registrars as a 'community of practice': analysis, interpretation and re-presentation.

    PubMed

    Clement, T; Brown, J; Morrison, J; Nestel, D

    2016-05-01

    General practice registrars in Australia undertake most of their vocational training in accredited general practices. They typically see patients alone from the start of their community-based training and are expected to seek timely ad hoc support from their supervisor. Such ad hoc encounters are a mechanism for ensuring patient safety, but also provide an opportunity for learning and teaching. Wenger's (Communities of practice: learning, meaning, and identity. Cambridge University Press, New York, 1998) social theory of learning ('communities of practice') guided a secondary analysis of audio-recordings of ad hoc encounters. Data from one encounter is re-presented as an extended sequence to maintain congruence with the theoretical perspective and enhance vicariousness. An interpretive commentary communicates key features of Wenger's theory and highlights the researchers' interpretations. We argue that one encounter can reveal universal understandings of clinical supervision and that the process of naturalistic generalisation allows readers to transfer others' experiences to their own contexts. The paper raises significant analytic, interpretive, and representational issues. We highlight that report writing is an important, but infrequently discussed, part of research design. We discuss the challenges of supporting the learning and teaching that arises from adopting a socio-cultural lens and argue that such a perspective importantly captures the complex range of issues that work-based practitioners have to grapple with. This offers a challenge to how we research and seek to influence work-based learning and teaching in health care settings. PMID:26384813

  17. Determination of rate constants for trifluoromethyl radical addition to various alkenes via a practical method.

    PubMed

    Hartmann, M; Li, Y; Studer, A

    2016-01-01

    A simple and practical method for the determination of rate constants for trifluoromethyl radical addition to various alkenes by applying competition kinetics is introduced. In the kinetic experiments the trifluoromethyl radicals are generated in situ from a commercially available hypervalent-iodine-CF3 reagent (Togni-reagent) by SET-reduction with TEMPONa in the presence of TEMPO and a π-acceptor. From the relative ratio of TEMPOCF3 and CF3-addition product formed, which is readily determined by (19)F-NMR spectroscopy, rate constants for trifluoromethyl radical addition to the π-acceptor can be calculated. The practical method is also applicable to measure rate constants for the addition of other perfluoroalkyl radicals to alkenes as documented for CF3CF2-radical addition reactions. PMID:26574882
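    The competition-kinetics arithmetic behind such a method can be sketched as follows. Every number below is a hypothetical placeholder, including the reference rate constant for CF3 radical trapping by TEMPO; the point is only the rearrangement of the product-ratio relation:

```python
# Competition kinetics: CF3* partitions between TEMPO (reference reaction,
# known rate constant) and the alkene (unknown rate constant).
# Product ratio from 19F NMR integration:
#   [CF3-alkene] / [CF3-TEMPO] = (k_add * [alkene]) / (k_ref * [TEMPO])

k_ref = 1.0e8          # M^-1 s^-1, assumed reference rate constant (placeholder)
conc_tempo = 0.05      # M, hypothetical TEMPO concentration
conc_alkene = 0.50     # M, hypothetical alkene concentration
product_ratio = 2.0    # [CF3-alkene]/[CF3-TEMPO] from 19F NMR, hypothetical

# Rearranged for the unknown addition rate constant
k_add = k_ref * product_ratio * conc_tempo / conc_alkene
```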

  18. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-11-23

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.
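    The generic factor-analysis step that such methods build on can be illustrated with plain PCA via the SVD. The data below are synthetic, and this sketch shows only the unconstrained decomposition, not the patented spatial-simplicity constraints:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic spectral image: 400 pixels x 50 spectral channels, built from
# two "pure" component spectra with per-pixel abundances plus small noise.
pure = rng.random((2, 50))                 # hypothetical component spectra
abund = rng.random((400, 2))               # per-pixel abundances
data = abund @ pure + rng.normal(scale=0.01, size=(400, 50))

# PCA via SVD of the mean-centered (pixels x channels) data matrix
centered = data - data.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
explained = S**2 / np.sum(S**2)            # variance fraction per component

# With two underlying species, the first two PCs capture nearly all variance
rank2_fraction = explained[:2].sum()
```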

  19. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-05-25

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.

  20. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    NASA Technical Reports Server (NTRS)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
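    As one illustration of acceptance sampling by variables, the common k-method decision rule for a single upper specification limit (accept the lot if (USL - mean)/s >= k) can be sketched as follows. The measurements, specification limit, and acceptability constant are hypothetical, not drawn from the NESC plans:

```python
import statistics

# Variables acceptance sampling, k-method, single upper specification limit.
measurements = [9.8, 10.1, 9.9, 10.0, 10.2, 9.7, 10.0, 9.9]  # hypothetical data
USL = 11.0   # upper specification limit (hypothetical)
k = 1.5      # acceptability constant from the chosen sampling plan (hypothetical)

mean = statistics.mean(measurements)
s = statistics.stdev(measurements)          # sample standard deviation

# Accept the lot when the mean sits at least k standard deviations below USL
accept = (USL - mean) / s >= k
```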

  2. Language Ideology or Language Practice? An Analysis of Language Policy Documents at Swedish Universities

    ERIC Educational Resources Information Center

    Björkman, Beyza

    2014-01-01

    This article presents an analysis and interpretation of language policy documents from eight Swedish universities with regard to intertextuality, authorship and content analysis of the notions of language practices and English as a lingua franca (ELF). The analysis is then linked to Spolsky's framework of language policy, namely language…

  3. Visual cluster analysis and pattern recognition methods

    DOEpatents

    Osbourn, Gordon Cecil; Martinez, Rubel Francisco

    2001-01-01

    A method of clustering using a novel template to define a region of influence. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable to, and improve, pattern recognition techniques.

  4. Reform-based science teaching: A mixed-methods approach to explaining variation in secondary science teacher practice

    NASA Astrophysics Data System (ADS)

    Jetty, Lauren E.

    The purpose of this two-phase, sequential explanatory mixed-methods study was to understand and explain the variation seen in secondary science teachers' enactment of reform-based instructional practices. Utilizing teacher socialization theory, this mixed-methods analysis was conducted to determine the relative influence of secondary science teachers' characteristics, backgrounds and experiences across their teacher development to explain the range of teaching practices exhibited by graduates from three reform-oriented teacher preparation programs. Data for this study were obtained from the Investigating the Meaningfulness of Preservice Programs Across the Continuum of Teaching (IMPPACT) Project, a multi-university, longitudinal study funded by NSF. In the first quantitative phase of the study, data for the sample (N=120) were collected from three surveys from the IMPPACT Project database. Hierarchical multiple regression analysis was used to examine the separate as well as the combined influence of factors such as teachers' personal and professional background characteristics, beliefs about reform-based science teaching, feelings of preparedness to teach science, school context, school culture and climate of professional learning, and influences of the policy environment on the teachers' use of reform-based instructional practices. Findings indicate that three blocks of variables (professional background, beliefs/efficacy, and local school context) contributed significantly to explaining nearly 38% of the variation in secondary science teachers' use of reform-based instructional practices. The five variables that significantly contributed to explaining variation in teachers' use of reform-based instructional practices in the full model were: university of teacher preparation, sense of preparation for teaching science, the quality of professional development, science content focused professional development, and the perceived level of professional autonomy.
Using the results from phase one, the second qualitative phase selected six case study teachers based on their levels of reform-based teaching practices to highlight teachers across the range of practices from low, average, to high levels of implementation. Using multiple interview sources, phase two helped to further explain the variation in levels of reform-based practices. Themes related to teachers' backgrounds, local contexts, and state policy environments were developed as they related to teachers' socialization experiences across these contexts. The results of the qualitative analysis identified the following factors differentiating teachers who enacted reform-based instructional practices from those who did not: 1) extensive science research experiences prior to their preservice teacher preparation; 2) the structure and quality of their field placements; 3) developing and valuing a research-based understanding of teaching and learning as a result of their preservice teacher preparation experiences; 4) the professional culture of their school context where there was support for a high degree of professional autonomy and receiving support from "educational companions" with a specific focus on teacher pedagogy to support student learning; and 5) a greater sense of agency to navigate their districts' interpretation and implementation of state polices. Implications for key stakeholders as well as directions for future research are discussed.
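    The hierarchical (blockwise) regression strategy described in this abstract enters predictor blocks sequentially and tracks the incremental R-squared each block adds. A minimal sketch with synthetic data; the predictors and coefficients below are illustrative stand-ins, not the IMPPACT variables:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120  # matches the study's sample size; the data themselves are synthetic

# Synthetic predictors standing in for the three significant blocks
background = rng.normal(size=(n, 2))   # block 1: professional background
beliefs = rng.normal(size=(n, 2))      # block 2: beliefs/efficacy
context = rng.normal(size=(n, 2))      # block 3: local school context
y = (background @ [0.4, 0.2] + beliefs @ [0.3, 0.1]
     + context @ [0.25, 0.15] + rng.normal(scale=1.0, size=n))

def r_squared(X, y):
    """R^2 of an OLS fit that includes an intercept column."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

# Enter blocks sequentially and record each block's incremental contribution
r2_1 = r_squared(background, y)
r2_2 = r_squared(np.hstack([background, beliefs]), y)
r2_3 = r_squared(np.hstack([background, beliefs, context]), y)
delta_r2 = (r2_1, r2_2 - r2_1, r2_3 - r2_2)   # R^2 change per block
```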

  5. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture, milkfat, and phosphatase levels in cheeses will be determined by the following methods of analysis...

  6. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture, milkfat, and phosphatase levels in cheeses will be determined by the following methods of analysis...

  7. Trends in vasectomy. Analysis of one teaching practice.

    PubMed Central

    Reynolds, J. L.

    1998-01-01

    PROBLEM BEING ADDRESSED: How can a teaching practice develop a referral service and incorporate educational opportunities for family medicine residents, clinical clerks, and community family physicians? OBJECTIVE OF PROGRAM: To develop a high-quality vasectomy service within a teaching practice to change the surgical procedure to the no-scalpel vasectomy (NSV) technique; to educate family medicine residents, clinical clerks, and community family physicians about vasectomy and the NSV technique; and to monitor outcomes and compare them with published results. MAIN COMPONENTS OF PROGRAM: The program took place in an urban family medicine residency program. Data on number of procedures, types of patients choosing vasectomy, and outcomes are presented, along with information on number of learners who viewed, assisted with, or became competent to perform NSV. CONCLUSIONS: A few family medicine residents and some interested community physicians could be trained to perform NSV competently. Involving learners in the procedure does not seem to change the rate of complications. PMID:9559195

  8. Echinacea purpurea: Pharmacology, phytochemistry and analysis methods.

    PubMed

    Manayi, Azadeh; Vazirian, Mahdi; Saeidnia, Soodabeh

    2015-01-01

    Echinacea purpurea (Asteraceae) is a perennial medicinal herb with important immunostimulatory and anti-inflammatory properties, especially the alleviation of cold symptoms. The plant also attracted scientists' attention to assess other aspects of its beneficial effects. For instance, antianxiety, antidepression, cytotoxicity, and antimutagenicity as induced by the plant have been revealed in various studies. The findings of the clinical trials are controversial in terms of side effects. While some studies revealed the beneficial effects of the plant on the patients and no severe adverse effects, some others have reported serious side effects including abdominal pain, angioedema, dyspnea, nausea, pruritus, rash, erythema, and urticaria. Other biological activities of the plant such as antioxidant, antibacterial, antiviral, and larvicidal activities have been reported in previous experimental studies. Different classes of secondary metabolites of the plant such as alkamides, caffeic acid derivatives, polysaccharides, and glycoproteins are believed to be biologically and pharmacologically active. Actually, concurrent determination and single analysis of cichoric acid and alkamides have been successfully developed mainly by using high-performance liquid chromatography (HPLC) coupled with different detectors including UV spectrophotometric, coulometric electrochemical, and electrospray ionization mass spectrometric detectors. The results of the studies which were controversial revealed that in spite of major experiments successfully accomplished using E. purpurea, many questions remain unanswered and future investigations may aim for complete recognition of the plant's mechanism of action using new, complementary methods. PMID:26009695

  9. Echinacea purpurea: Pharmacology, phytochemistry and analysis methods

    PubMed Central

    Manayi, Azadeh; Vazirian, Mahdi; Saeidnia, Soodabeh

    2015-01-01

    Echinacea purpurea (Asteraceae) is a perennial medicinal herb with important immunostimulatory and anti-inflammatory properties, especially the alleviation of cold symptoms. The plant also attracted scientists’ attention to assess other aspects of its beneficial effects. For instance, antianxiety, antidepression, cytotoxicity, and antimutagenicity as induced by the plant have been revealed in various studies. The findings of the clinical trials are controversial in terms of side effects. While some studies revealed the beneficial effects of the plant on the patients and no severe adverse effects, some others have reported serious side effects including abdominal pain, angioedema, dyspnea, nausea, pruritus, rash, erythema, and urticaria. Other biological activities of the plant such as antioxidant, antibacterial, antiviral, and larvicidal activities have been reported in previous experimental studies. Different classes of secondary metabolites of the plant such as alkamides, caffeic acid derivatives, polysaccharides, and glycoproteins are believed to be biologically and pharmacologically active. Actually, concurrent determination and single analysis of cichoric acid and alkamides have been successfully developed mainly by using high-performance liquid chromatography (HPLC) coupled with different detectors including UV spectrophotometric, coulometric electrochemical, and electrospray ionization mass spectrometric detectors. The results of the studies which were controversial revealed that in spite of major experiments successfully accomplished using E. purpurea, many questions remain unanswered and future investigations may aim for complete recognition of the plant's mechanism of action using new, complementary methods. PMID:26009695

  10. Critical Discourse Analysis: Discourse Acquisition and Discourse Practices.

    ERIC Educational Resources Information Center

    Price, Steve

    1999-01-01

    Explores arguments around critical-discourse analysis (CDA) and suggests that neither proponents nor critics of CDA have fully come to terms with the implications of what it means to acquire discourse. (Author/VWL)

  11. A sensitive transcriptome analysis method that can detect unknown transcripts

    PubMed Central

    Fukumura, Ryutaro; Takahashi, Hirokazu; Saito, Toshiyuki; Tsutsumi, Yoko; Fujimori, Akira; Sato, Shinji; Tatsumi, Kouichi; Araki, Ryoko; Abe, Masumi

    2003-01-01

    We have developed an AFLP-based gene expression profiling method called ‘high coverage expression profiling’ (HiCEP) analysis. By making improvements to the selective PCR technique we have reduced the rate of false positive peaks to ∼4% and consequently the number of peaks, including overlapping peaks, has been markedly decreased. As a result we can determine the relationship between peaks and original transcripts unequivocally. This will make it practical to prepare a database of all peaks, allowing gene assignment without having to isolate individual peaks. This precise selection also enables us to easily clone peaks of interest and predict the corresponding gene for each peak in some species. The procedure is highly reproducible and sensitive enough to detect even a 1.2-fold difference in gene expression. Most importantly, the low false positive rate enables us to analyze gene expression with wide coverage by means of four instead of six nucleotide recognition site restriction enzymes for fingerprinting mRNAs. Therefore, the method detects 70–80% of all transcripts, including non-coding transcripts, unknown and known genes. Moreover, the method requires no sequence information and so is applicable even to eukaryotes for which there is no genome information available. PMID:12907746

  12. Initial analysis of space target's stealth methods at laser wavelengths

    NASA Astrophysics Data System (ADS)

    Du, Haitao; Han, Yi; Sun, Huayan; Zhang, Tinghua

    2014-12-01

    Laser stealth for space targets is practically useful, important, and urgent. This paper introduces the defining expression for laser radar cross section (LRCS) and the general laws governing the factors that influence a space target's LRCS, including surface material types and the target's shape and size. It then discusses possible laser stealth methods for space targets in practical applications from two viewpoints: material stealth methods and shape stealth methods. These conclusions and suggestions can serve as references for future research directions and methods in target laser stealth.

  13. Practical use of three-dimensional inverse method for compressor blade design

    SciTech Connect

    Damle, S.; Dang, T.; Stringham, J.; Razinsky, E.

    1999-04-01

    The practical utility of a three-dimensional inverse viscous method is demonstrated by carrying out a design modification of a first-stage rotor in an industrial compressor. In this design modification study, the goal is to improve the efficiency of the original blade while retaining its overall aerodynamic, structural, and manufacturing characteristics. By employing a simple modification to the blade pressure loading distribution (which is the prescribed flow quantity in this inverse method), the modified blade geometry is predicted to perform better than the original design over a wide range of operating points, including an improvement in choke margin.

  14. Method Development for Analysis of Aspirin Tablets.

    ERIC Educational Resources Information Center

    Street, Kenneth W., Jr.

    1988-01-01

    Develops a lab experiment for introductory instrumental analysis that requires interference studies and optimization of conditions. Notes that the aspirin is analyzed by visible spectrophotometric assay. Gives experimental details and discussion. (MVL)

  15. Concurrent implementation of the Crank-Nicolson method for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Ransom, J. B.; Fulton, R. E.

    1985-01-01

    To exploit the significant gains in computing speed provided by Multiple Instruction Multiple Data (MIMD) computers, concurrent methods for practical problems need to be investigated and test problems implemented on actual hardware. One such problem class is heat transfer analysis, which is important in many aerospace applications. This paper compares the efficiency of two alternate implementations of heat transfer analysis on an experimental MIMD computer called the Finite Element Machine (FEM). The implicit Crank-Nicolson method is used to solve the heat transfer equations concurrently by both iterative and direct methods. Actual timing results for the two methods are compared, and their significance for more complex problems is discussed.
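The Crank-Nicolson scheme named above can be illustrated in a few lines. The sketch below is a minimal serial 1-D version for the heat equation u_t = alpha*u_xx with zero boundary temperatures, not the paper's finite-element or concurrent MIMD implementation; grid sizes and names are illustrative.

```python
import numpy as np

def crank_nicolson_heat(u0, alpha, dx, dt, steps):
    """Advance the 1-D heat equation u_t = alpha*u_xx with zero boundary
    values using the implicit Crank-Nicolson scheme (average of explicit
    and implicit second differences)."""
    n = len(u0)
    r = alpha * dt / (2.0 * dx ** 2)
    m = n - 2  # interior unknowns only; boundaries held at zero
    A = (np.diag((1 + 2 * r) * np.ones(m))
         + np.diag(-r * np.ones(m - 1), 1)
         + np.diag(-r * np.ones(m - 1), -1))
    B = (np.diag((1 - 2 * r) * np.ones(m))
         + np.diag(r * np.ones(m - 1), 1)
         + np.diag(r * np.ones(m - 1), -1))
    u = np.array(u0, dtype=float)
    for _ in range(steps):
        # One implicit step: solve A u_new = B u_old on the interior.
        u[1:-1] = np.linalg.solve(A, B @ u[1:-1])
    return u

# Usage: cooling of a sine profile, compared with the exact solution.
nx, alpha, t_end, steps = 41, 1.0, 0.1, 200
x = np.linspace(0.0, 1.0, nx)
u = crank_nicolson_heat(np.sin(np.pi * x), alpha, x[1] - x[0], t_end / steps, steps)
exact = np.sin(np.pi * x) * np.exp(-alpha * np.pi ** 2 * t_end)
```

Because the scheme is unconditionally stable and second-order in both time and space, even this coarse grid stays within a fraction of a percent of the exact decay.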

  16. Uncertainty Analysis by the "Worst Case" Method.

    ERIC Educational Resources Information Center

    Gordon, Roy; And Others

    1984-01-01

    Presents a new method of uncertainty propagation which concentrates on the calculation of upper and lower limits (the "worst cases"), bypassing absolute and relative uncertainties. Includes advantages of this method and its use in freshmen laboratories, advantages of the traditional method, and a numerical example done by both methods. (JN)
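The "worst case" idea can be sketched directly, assuming independent inputs with symmetric uncertainty intervals: evaluate the result at every extreme combination of the inputs and report the smallest and largest outcomes. Function and variable names here are illustrative, not from the article.

```python
from itertools import product

def worst_case_limits(f, values, uncertainties):
    """Upper and lower limits of f by evaluating every 'worst case'
    combination, i.e. each input at its lower or upper extreme."""
    corners = [(v - u, v + u) for v, u in zip(values, uncertainties)]
    results = [f(*args) for args in product(*corners)]
    return min(results), max(results)

# Usage: density rho = m / V with m = 10.0 +/- 0.1 g and V = 4.0 +/- 0.2 cm^3.
lo, hi = worst_case_limits(lambda m, v: m / v, [10.0, 4.0], [0.1, 0.2])
```

Note that scanning all corners sidesteps any calculus of relative uncertainties, which is exactly the pedagogical appeal described in the abstract; it assumes the extremes occur at the corners, which holds for monotonic functions like this one.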

  17. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Methods of analysis. 133.5 Section 133.5 Food and... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture, milkfat, and phosphatase levels in cheeses will be determined by the following methods of analysis...

  18. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA,...

  19. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural...

  20. Making health care safer II: an updated critical analysis of the evidence for patient safety practices.

    PubMed Central

    Shekelle, P G; Wachter, R M; Pronovost, P J; Schoelles, K; McDonald, K M; Dy, S M; Shojania, K; Reston, J; Berger, Z; Johnsen, B; Larkin, J W; Lucas, S; Martinez, K; Motala, A; Newberry, S J; Noble, M; Pfoh, E; Ranji, S R; Rennke, S; Schmidt, E; Shanman, R; Sullivan, N; Sun, F; Tipton, K; Treadwell, J R; Tsou, A; Vaiana, M E; Weaver, S J; Wilson, R; Winters, B D

    2013-01-01

    OBJECTIVES To review important patient safety practices for evidence of effectiveness, implementation, and adoption. DATA SOURCES Searches of multiple computerized databases, gray literature, and the judgments of a 20-member panel of patient safety stakeholders. REVIEW METHODS The judgments of the stakeholders were used to prioritize patient safety practices for review, and to select which practices received in-depth reviews and which received brief reviews. In-depth reviews consisted of a formal literature search, usually of multiple databases, and included gray literature, where applicable. In-depth reviews assessed practices on the following domains: • How important is the problem? • What is the patient safety practice? • Why should this practice work? • What are the beneficial effects of the practice? • What are the harms of the practice? • How has the practice been implemented, and in what contexts? • Are there any data about costs? • Are there data about the effect of context on effectiveness? We assessed individual studies for risk of bias using tools appropriate to specific study designs. We assessed the strength of evidence of effectiveness using a system developed for this project. Brief reviews had focused literature searches for focused questions. All practices were then summarized on the following domains: scope of the problem, strength of evidence for effectiveness, evidence on potential for harmful unintended consequences, estimate of costs, how much is known about implementation and how difficult the practice is to implement. Stakeholder judgment was then used to identify practices that were "strongly encouraged" for adoption, and those practices that were "encouraged" for adoption. RESULTS From an initial list of over 100 patient safety practices, the stakeholders identified 41 practices as a priority for this review: 18 in-depth reviews and 23 brief reviews. 
Of these, 20 practices had their strength of evidence of effectiveness rated as at least "moderate," and 25 practices had at least "moderate" evidence of how to implement them. Ten practices were classified by the stakeholders as having sufficient evidence of effectiveness and implementation and should be "strongly encouraged" for adoption, and an additional 12 practices were classified as those that should be "encouraged" for adoption. CONCLUSIONS The evidence supporting the effectiveness of many patient safety practices has improved substantially over the past decade. Evidence about implementation and context has also improved, but continues to lag behind evidence of effectiveness. Twenty-two patient safety practices are sufficiently well understood, and health care providers can consider adopting them now. PMID:24423049

  1. Dynamic analysis of practical blades with shear center effect

    NASA Astrophysics Data System (ADS)

    Karadaǧ, V.

    1984-02-01

    Shear center effects on the natural frequencies and mode shapes of rotating and non-rotating practical blades are considered. An 18-degree-of-freedom thick-beam finite element is developed. Bending and shear force displacements and slopes, and torsional displacements, are taken as degrees of freedom at both ends of the element. Total blade deflection slopes are treated as composed of bending and shear force deflection slopes in calculating the blade strain and kinetic energy. This element is compared with existing thin and thick beam finite elements and theoretical models. Results obtained for the vibration characteristics of rotating and non-rotating blades of non-uniform aerofoil cross-section are compared with available calculated and experimental values. In all cases considered the element exhibits good convergence characteristics and produces accurate results.

  2. Thermal Analysis Methods For Earth Entry Vehicle

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.; Dec, John A.; Lindell, Michael C.

    2000-01-01

    Thermal analysis of a vehicle designed to return samples from another planet, such as the Earth Entry Vehicle for the Mars Sample Return mission, presents several unique challenges. The Earth Entry Vehicle (EEV) must contain Martian material samples after they have been collected and protect them from the high heating rates of entry into the Earth's atmosphere. This requirement necessitates inclusion of detailed thermal analysis early in the design of the vehicle. This paper will describe the challenges and solutions for a preliminary thermal analysis of an Earth Entry Vehicle. The aeroheating on the vehicle during entry would be the main driver for the thermal behavior, and is a complex function of time, spatial position on the vehicle, vehicle temperature, and trajectory parameters. Thus, the thermal analysis must be closely tied to the aeroheating analysis in order to make accurate predictions. Also, the thermal analysis must account for the material response of the ablative thermal protection system (TPS). For the exo-atmospheric portion of the mission, the thermal analysis must include the orbital radiation fluxes on the surfaces. The thermal behavior must also be used to predict the structural response of the vehicle (the thermal stresses and strains) and whether they remain within the capability of the materials. Thus, the thermal analysis requires ties to the three-dimensional geometry, the aeroheating analysis, the material response analysis, the orbital analysis, and the structural analysis. The goal of this paper is to describe to what degree that has been achieved.

  3. A Comparison of Low and High Structure Practice for Learning Interactional Analysis Skills

    ERIC Educational Resources Information Center

    Davis, Matthew James

    2011-01-01

    Innovative training approaches in work domains such as professional athletics, aviation, and the military have shown that specific types of practice can reliably lead to higher levels of performance for the average professional. This study describes the development of an initial effort toward creating a similar practice method for psychotherapy…

  4. EMQN Best Practice Guidelines for molecular and haematology methods for carrier identification and prenatal diagnosis of the haemoglobinopathies.

    PubMed

    Traeger-Synodinos, Joanne; Harteveld, Cornelis L; Old, John M; Petrou, Mary; Galanello, Renzo; Giordano, Piero; Angastioniotis, Michael; De la Salle, Barbara; Henderson, Shirley; May, Alison

    2015-04-01

    Haemoglobinopathies constitute the commonest recessive monogenic disorders worldwide, and the treatment of affected individuals presents a substantial global disease burden. Carrier identification and prenatal diagnosis represent valuable procedures that identify couples at risk for having affected children, so that they can be offered options to have healthy offspring. Molecular diagnosis facilitates prenatal diagnosis and definitive diagnosis of carriers and patients (especially 'atypical' cases who often have complex genotype interactions). However, the haemoglobin disorders are unique among all genetic diseases in that identification of carriers is preferable by haematological (biochemical) tests rather than DNA analysis. These Best Practice guidelines offer an overview of recommended strategies and methods for carrier identification and prenatal diagnosis of haemoglobinopathies, and emphasize the importance of appropriately applying and interpreting haematological tests in supporting the optimum application and evaluation of globin gene DNA analysis. PMID:25052315

  5. EMQN Best Practice Guidelines for molecular and haematology methods for carrier identification and prenatal diagnosis of the haemoglobinopathies

    PubMed Central

    Traeger-Synodinos, Joanne; Harteveld, Cornelis L; Old, John M; Petrou, Mary; Galanello, Renzo; Giordano, Piero; Angastioniotis, Michael; De la Salle, Barbara; Henderson, Shirley; May, Alison

    2015-01-01

    Haemoglobinopathies constitute the commonest recessive monogenic disorders worldwide, and the treatment of affected individuals presents a substantial global disease burden. Carrier identification and prenatal diagnosis represent valuable procedures that identify couples at risk for having affected children, so that they can be offered options to have healthy offspring. Molecular diagnosis facilitates prenatal diagnosis and definitive diagnosis of carriers and patients (especially 'atypical' cases who often have complex genotype interactions). However, the haemoglobin disorders are unique among all genetic diseases in that identification of carriers is preferable by haematological (biochemical) tests rather than DNA analysis. These Best Practice guidelines offer an overview of recommended strategies and methods for carrier identification and prenatal diagnosis of haemoglobinopathies, and emphasize the importance of appropriately applying and interpreting haematological tests in supporting the optimum application and evaluation of globin gene DNA analysis. PMID:25052315

  6. Practical method using superposition of individual magnetic fields for initial arrangement of undulator magnets

    NASA Astrophysics Data System (ADS)

    Tsuchiya, K.; Shioya, T.

    2015-04-01

    We have developed a practical method for determining an excellent initial arrangement of magnetic arrays for a pure-magnet Halbach-type undulator. In this method, the longitudinal magnetic field distribution of each magnet is measured using a moving Hall probe system along the beam axis with a high positional resolution. The initial arrangement of magnetic arrays is optimized and selected by analyzing the superposition of all distribution data in order to achieve adequate spectral quality for the undulator. We applied this method to two elliptically polarizing undulators (EPUs), called U#16-2 and U#02-2, at the Photon Factory storage ring (PF ring) in the High Energy Accelerator Research Organization (KEK). The measured field distribution of the undulator was demonstrated to be excellent for the initial arrangement of the magnet array, and this method saved a great deal of effort in adjusting the magnetic fields of EPUs.
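The superposition step can be sketched as follows, under two simplifying assumptions: each magnet's measured field profile is sampled on a common longitudinal grid, and placing a magnet in a slot merely shifts its profile along the axis. The exhaustive search over orderings shown here is an illustration of selecting an arrangement by superposed-field quality, not the authors' actual optimization procedure.

```python
import numpy as np
from itertools import permutations

def shift(profile, n):
    """Shift a measured field profile n samples downstream (zero-padded)."""
    out = np.zeros_like(profile)
    if n == 0:
        out[:] = profile
    else:
        out[n:] = profile[:-n]
    return out

def best_arrangement(profiles, target, stride):
    """Try every slot ordering of the measured single-magnet profiles,
    superpose the shifted profiles, and keep the ordering whose summed
    field deviates least (in RMS) from the target field."""
    best_order, best_score = None, None
    for order in permutations(range(len(profiles))):
        field = sum(shift(profiles[m], slot * stride) for slot, m in enumerate(order))
        score = float(np.sqrt(np.mean((field - target) ** 2)))
        if best_score is None or score < best_score:
            best_order, best_score = order, score
    return best_order, best_score

# Usage with synthetic single-magnet profiles of slightly unequal strength.
z = np.arange(40)
bump = np.exp(-((z - 5.0) ** 2) / 8.0)           # one magnet's field shape
profiles = [1.0 * bump, 1.1 * bump, 0.9 * bump]  # three "measured" magnets
target = shift(profiles[2], 0) + shift(profiles[0], 10) + shift(profiles[1], 20)
order, score = best_arrangement(profiles, target, stride=10)
```

For realistic magnet counts the factorial search would be replaced by a heuristic (e.g. simulated annealing), but the scoring by superposition of individually measured distributions is the essence of the method described above.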

  7. Practical method using superposition of individual magnetic fields for initial arrangement of undulator magnets

    SciTech Connect

    Tsuchiya, K.; Shioya, T.

    2015-04-15

    We have developed a practical method for determining an excellent initial arrangement of magnetic arrays for a pure-magnet Halbach-type undulator. In this method, the longitudinal magnetic field distribution of each magnet is measured using a moving Hall probe system along the beam axis with a high positional resolution. The initial arrangement of magnetic arrays is optimized and selected by analyzing the superposition of all distribution data in order to achieve adequate spectral quality for the undulator. We applied this method to two elliptically polarizing undulators (EPUs), called U#16-2 and U#02-2, at the Photon Factory storage ring (PF ring) in the High Energy Accelerator Research Organization (KEK). The measured field distribution of the undulator was demonstrated to be excellent for the initial arrangement of the magnet array, and this method saved a great deal of effort in adjusting the magnetic fields of EPUs.

  8. Evaluating Physician Impact Analysis: Methods, Results, and Uses in Ontario Hospitals.

    ERIC Educational Resources Information Center

    Charles, Cathy; Roberts, Jacqueline

    1994-01-01

    Physician impact analysis (PIA) is a planning tool intended to provide a way to evaluate the impact of a new or replacement physician's practice profile on the clinical program priorities, staffing resources, and costs of a hospital. Key methods for PIA and issues related to its use are considered. (SLD)

  9. Exploring the Current Landscape of Intravenous Infusion Practices and Errors (ECLIPSE): protocol for a mixed-methods observational study

    PubMed Central

    Blandford, Ann; Furniss, Dominic; Chumbley, Gill; Iacovides, Ioanna; Wei, Li; Cox, Anna; Mayer, Astrid; Schnock, Kumiko; Bates, David Westfall; Dykes, Patricia C; Bell, Helen; Dean Franklin, Bryony

    2016-01-01

    Introduction Intravenous medication is essential for many hospital inpatients. However, providing intravenous therapy is complex and errors are common. ‘Smart pumps’ incorporating dose error reduction software have been widely advocated to reduce error. However, little is known about their effect on patient safety, how they are used or their likely impact. This study will explore the landscape of intravenous medication infusion practices and errors in English hospitals and how smart pumps may relate to the prevalence of medication administration errors. Methods and analysis This is a mixed-methods study involving an observational quantitative point prevalence study to determine the frequency and types of errors that occur in the infusion of intravenous medication, and qualitative interviews with hospital staff to better understand infusion practices and the contexts in which errors occur. The study will involve 5 clinical areas (critical care, general medicine, general surgery, paediatrics and oncology), across 14 purposively sampled acute hospitals and 2 paediatric hospitals to cover a range of intravenous infusion practices. Data collectors will compare each infusion running at the time of data collection against the patient's medication orders to identify any discrepancies. The potential clinical importance of errors will be assessed. Quantitative data will be analysed descriptively; interviews will be analysed using thematic analysis. Ethics and dissemination Ethical approval has been obtained from an NHS Research Ethics Committee (14/SC/0290); local approvals will be sought from each participating organisation. Findings will be published in peer-reviewed journals and presented at conferences for academic and health professional audiences. Results will also be fed back to participating organisations to inform local policy, training and procurement. 
Aggregated findings will inform the debate on costs and benefits of the NHS investing in smart pump technology, and what other changes may need to be made to ensure effectiveness of such an investment. PMID:26940104

  10. A practical method to avoid zero-point leak in molecular dynamics calculations: Application to the water dimer

    NASA Astrophysics Data System (ADS)

    Czakó, Gábor; Kaledin, Alexey L.; Bowman, Joel M.

    2010-04-01

    We report the implementation of a previously suggested method to constrain a molecular system to have mode-specific vibrational energy greater than or equal to the zero-point energy in quasiclassical trajectory calculations [J. M. Bowman et al., J. Chem. Phys. 91, 2859 (1989); W. H. Miller et al., J. Chem. Phys. 91, 2863 (1989)]. The implementation is made practical by using a technique described recently [G. Czakó and J. M. Bowman, J. Chem. Phys. 131, 244302 (2009)], where a normal-mode analysis is performed during the course of a trajectory and which gives only real-valued frequencies. The method is applied to the water dimer, where its effectiveness is shown by computing mode energies as a function of integration time. Radial distribution functions are also calculated using constrained quasiclassical and standard classical molecular dynamics at low temperature and at 300 K and compared to rigorous quantum path integral calculations.
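A schematic normal-mode version of the constraint, assuming the dynamics are already expressed in mass-weighted normal-mode coordinates with real frequencies: compute each mode's harmonic energy and, when a mode dips below its zero-point energy, reverse that momentum component so subsequent coupling drives energy back into the depleted mode. This is a sketch of the idea only, not the authors' full trajectory implementation; units and values are illustrative.

```python
import numpy as np

HBAR = 1.0  # illustrative (atomic-like) units

def mode_energies(q, p, omega):
    """Harmonic energy of each normal mode from its mass-weighted
    coordinate q and conjugate momentum p."""
    return 0.5 * p ** 2 + 0.5 * omega ** 2 * q ** 2

def apply_zpe_constraint(q, p, omega):
    """Flag modes whose energy has fallen below the zero-point energy
    hbar*omega/2 and reverse those momentum components (the sign flip
    leaves the instantaneous energy unchanged but redirects energy flow)."""
    low = mode_energies(q, p, omega) < 0.5 * HBAR * omega
    p_out = p.copy()
    p_out[low] = -p_out[low]
    return p_out, low

# Usage: mode 0 holds more than its ZPE; mode 1 has leaked below its ZPE.
omega = np.array([1.0, 2.0])
q = np.array([1.0, 0.1])
p = np.array([0.2, 0.3])
p_new, leaked = apply_zpe_constraint(q, p, omega)
```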

  11. Three-dimensional Stress Analysis Using the Boundary Element Method

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Banerjee, P. K.

    1984-01-01

    The boundary element method is to be extended (as part of the NASA Inelastic Analysis Methods program) to the three-dimensional stress analysis of gas turbine engine hot section components. The analytical basis of the method (as developed in elasticity) is outlined, its numerical implementation is summarized, and the approaches to be followed in extending the method to include inelastic material response are indicated.

  12. Strategic planning for public health practice using macroenvironmental analysis.

    PubMed Central

    Ginter, P M; Duncan, W J; Capper, S A

    1991-01-01

    Macroenvironmental analysis is the initial stage in comprehensive strategic planning. The authors examine the benefits of this type of analysis when applied to public health organizations and present a series of questions that should be answered prior to committing resources to scanning, monitoring, forecasting, and assessing components of the macroenvironment. Using illustrations from the public and private sectors, each question is examined with reference to specific challenges facing public health. Benefits are derived both from the process and the outcome of macroenvironmental analysis. Not only are data acquired that assist public health professionals to make decisions, but the analytical process required assures a better understanding of potential external threats and opportunities as well as an organization's strengths and weaknesses. Although differences exist among private and public as well as profit and not-for-profit organizations, macroenvironmental analysis is seen as more essential to the public and not-for-profit sectors than the private and profit sectors. This conclusion results from the extreme dependency of those areas on external environmental forces that cannot be significantly influenced or controlled by public health decision makers. PMID:1902305

  13. An Analysis of Ethical Considerations in Programme Design Practice

    ERIC Educational Resources Information Center

    Govers, Elly

    2014-01-01

    Ethical considerations are inherent to programme design decision-making, but not normally explicit. Nonetheless, they influence whose interests are served in a programme and who benefits from it. This paper presents an analysis of ethical considerations made by programme design practitioners in the context of a polytechnic in Aotearoa/New Zealand.…

  14. Suspension, Race, and Disability: Analysis of Statewide Practices and Reporting

    ERIC Educational Resources Information Center

    Krezmien, Michael P.; Leone, Peter E.; Achilles, Georgianna M.

    2006-01-01

    This analysis of statewide suspension data from 1995 to 2003 in Maryland investigated disproportionate suspensions of minority students and students with disabilities. We found substantial increases in over-all rates of suspensions from 1995 to 2003, as well as disproportionate rates of suspensions for African American students, American Indian…

  15. Analysis and Practices of Teaching: Description of a Course.

    ERIC Educational Resources Information Center

    Etheridge, Carol Plata; And Others

    An introductory teacher preparation course based on Adler's Paideia concepts was examined for documentation of course content, purposes, and student reactions. Data were collected through ethnographic observations of course classes, interviews with students and professors, and examination of readings for the course. The course, "Analysis and…

  16. Newborn Hearing Screening: An Analysis of Current Practices

    ERIC Educational Resources Information Center

    Houston, K. Todd; Bradham, Tamala S.; Munoz, Karen F.; Guignard, Gayla Hutsell

    2011-01-01

    State coordinators of early hearing detection and intervention (EHDI) programs completed a strengths, weaknesses, opportunities, and threats, or SWOT, analysis that consisted of 12 evaluative areas of EHDI programs. For the newborn hearing screening area, a total of 293 items were listed by 49 EHDI coordinators, and themes were identified within…

  17. The Analysis of Athletic Performance: Some Practical and Philosophical Considerations

    ERIC Educational Resources Information Center

    Nelson, Lee J.; Groom, Ryan

    2012-01-01

    This article presents a hypothetical dialogue between a notational analyst (NA) recently schooled in the positivistic assessment of athletic performance, an "old-school" traditional coach (TC) who favours subjective analysis, and a pragmatic educator (PE). The conversation opens with NA and TC debating the respective value of quantitative and…

  18. Visceral fat estimation method by bioelectrical impedance analysis and causal analysis

    NASA Astrophysics Data System (ADS)

    Nakajima, Hiroshi; Tasaki, Hiroshi; Tsuchiya, Naoki; Hamaguchi, Takehiro; Shiga, Toshikazu

    2011-06-01

    It has been clarified that abdominal visceral fat accumulation is closely associated with lifestyle diseases and metabolic syndrome. The gold standard in the medical field is the visceral fat area measured by an X-ray computed tomography (CT) scan or magnetic resonance imaging. However, these measurements are highly invasive and costly; a CT scan in particular involves X-ray exposure. For these reasons, the medical field needs an instrument for visceral fat measurement that is minimally invasive, easy to use, and low cost. This article proposes a simple and practical method of visceral fat estimation employing bioelectrical impedance analysis and causal analysis. In the method, the abdominal shape and the dual impedances of the abdominal surface and the total body are measured to estimate the visceral fat area based on a cause-effect structure. The structure is designed according to the nature of abdominal body composition and fine-tuned by statistical analysis. Experiments were conducted to investigate the proposed model: 180 subjects were recruited and measured both by a CT scan and by the proposed method. The acquired model explained the measurement principle well, and the correlation coefficient with the CT scan measurements is 0.88.
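As an illustration of fitting such an estimation model against a CT reference, the sketch below regresses visceral fat area on total-body impedance, abdominal-surface impedance, and an abdominal-shape measure by ordinary least squares. The linear form, variable names, and value ranges are assumptions for illustration; the published method uses a tuned cause-effect structure rather than this plain regression.

```python
import numpy as np

def fit_vfa_model(z_total, z_surface, waist, vfa_ct):
    """Least-squares fit of an illustrative linear model
    VFA ~ b0 + b1*z_total + b2*z_surface + b3*waist
    against CT-measured reference visceral fat areas."""
    X = np.column_stack([np.ones_like(z_total), z_total, z_surface, waist])
    coef, *_ = np.linalg.lstsq(X, vfa_ct, rcond=None)
    return coef

def predict_vfa(coef, z_total, z_surface, waist):
    """Estimate visceral fat area from the fitted coefficients."""
    return coef[0] + coef[1] * z_total + coef[2] * z_surface + coef[3] * waist

# Usage on synthetic, noise-free data generated from known coefficients.
rng = np.random.default_rng(0)
z_total = rng.uniform(300.0, 600.0, 50)   # total-body impedance (ohm, assumed)
z_surface = rng.uniform(20.0, 60.0, 50)   # abdominal-surface impedance (ohm, assumed)
waist = rng.uniform(70.0, 110.0, 50)      # waist circumference (cm, assumed)
true = np.array([5.0, -0.2, 1.5, 1.0])
vfa_ct = predict_vfa(true, z_total, z_surface, waist)
coef = fit_vfa_model(z_total, z_surface, waist, vfa_ct)
```

On real data one would report the correlation between predicted and CT-measured areas, as the abstract does, rather than expect exact coefficient recovery.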

  19. Adaptation of Cost Analysis Studies in Practice Guidelines.

    PubMed

    Zervou, Fainareti N; Zacharioudakis, Ioannis M; Pliakos, Elina Eleftheria; Grigoras, Christos A; Ziakas, Panayiotis D; Mylonakis, Eleftherios

    2015-12-01

    Clinical guidelines play a central role in day-to-day practice. We assessed the degree of incorporation of cost analyses to guidelines and identified modifiable characteristics that could affect the level of incorporation. We selected the 100 most cited guidelines listed on the National Guideline Clearinghouse (http://www.guideline.gov) and determined the number of guidelines that used cost analyses in their reasoning and the overall percentage of incorporation of relevant cost analyses available in PubMed. Differences between medical specialties were also studied. Then, we performed a case-control study using incorporated and not incorporated cost analyses after 1:1 matching by study subject and compared them by the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement requirements and other criteria. We found that 57% of guidelines do not use any cost justification. Guidelines incorporate a weighted average of 6.0% (95% confidence interval [CI] 4.3-7.9) among 3396 available cost analyses, with cardiology and infectious diseases guidelines incorporating 10.8% (95% CI 5.3-18.1) and 9.9% (95% CI 3.9-18.2), respectively, and hematology/oncology and urology guidelines incorporating 4.5% (95% CI 1.6-8.6) and 1.6% (95% CI 0.4-3.5), respectively. Based on the CHEERS requirements, the mean number of items reported by the 148 incorporated cost analyses was 18.6 (SD = 3.7), a small but significant difference over controls (17.8 items; P = 0.02). Included analyses were also more likely to directly relate cost reductions to healthcare outcomes (92.6% vs 81.1%, P = 0.004) and declare the funding source (72.3% vs 53.4%, P < 0.001), while similar number of cases and controls reported a noncommercial funding source (71% vs 72.7%; P = 0.8). Guidelines remain an underused mechanism for the cost-effective allocation of available resources and a minority of practice guidelines incorporates cost analyses utilizing only 6% of the available cost analyses. 
Fulfilling the CHEERS requirements, directly relating costs with healthcare outcomes and transparently declaring the funding source seem to be valued by guideline-writing committees. PMID:26717377

  20. Screening Workers: An Examination and Analysis of Practice and Public Policy.

    ERIC Educational Resources Information Center

    Greenfield, Patricia A.; And Others

    1989-01-01

    Discusses methods of screening job applicants and issues raised by screening procedures. Includes legal ramifications, current practices in Britain and the United States, future directions, and the employment interview. (JOW)

  1. Methods for analysis of fluoroquinolones in biological fluids

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Methods for analysis of 10 selected fluoroquinolone antibiotics in biological fluids are reviewed. Approaches for sample preparation, detection methods, limits of detection and quantitation and recovery information are provided for both single analyte and multi-analyte fluoroquinolone methods....

  2. Effect of practice management softwares among physicians of developing countries with special reference to Indian scenario by Mixed Method Technique

    PubMed Central

    Davey, Sanjeev; Davey, Anuradha

    2015-01-01

    Introduction: Currently, many cheaper “practice management software” (PMS) packages are available in developing countries, including India; despite their availability and benefits, their penetration and usage vary from low to moderate levels, justifying the importance of this study area. Materials and Methods: First, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (2009) guidelines were considered, followed by an extensive systematic review of the available studies in the literature related to developing countries, using the key search terms in the main abstracting databases (PubMed, EMBASE, EBSCO, BioMed Central, Cochrane Library, and the WorldCat library) up to 15 June 2014; articles of any kind, published or unpublished, in any form or language, indicating software usage were included. Studies from developed countries were excluded. Thereafter, a meta-analysis of the Indian studies, revealing the magnitude of usage in the Indian scenario, was performed with OpenMeta[Analyst] software using a binary random-effects (RE) model. Results: Of the 57 studies included in the systematic review from developing countries, only 4 Indian studies were found eligible for meta-analysis. The RE model revealed nonsignificant results (total participants = 243,526; range: 100–226,228; overall odds ratio = 2.85, 95% confidence interval = P < 0.05; tests for heterogeneity: Q [df = 3] = 0.8, Het. P = 0.85). The overall magnitude of usage of PMS in Indian physicians' practice was, however, found to be between 10% and 45%. Conclusion: Although a variable and nonsignificant effect of the usage of PM software on the practice of physicians in developing countries like India was found, there is a need to recognize the hidden potential of this system. Hence, more in-depth research needs to be done in future to find the real impact of this system. PMID:25949969
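The binary random-effects pooling used in such meta-analyses can be sketched with the DerSimonian-Laird estimator, a standard RE method; the effect sizes and variances below are illustrative, not the study's data.

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooling (DerSimonian-Laird): estimate the
    between-study variance tau^2 from Cochran's Q, then combine the
    study effects (e.g. log odds ratios) with weights 1/(v_i + tau^2)."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)  # method-of-moments between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

# Usage: pool three identical log odds ratios (no heterogeneity present).
pooled, se, tau2 = dersimonian_laird([0.5, 0.5, 0.5], [0.1, 0.1, 0.1])
```

With no heterogeneity, tau^2 collapses to zero and the result reduces to the fixed-effect (inverse-variance) estimate; exponentiating the pooled log odds ratio gives the combined odds ratio reported in such analyses.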

  3. Integration of Formal Job Hazard Analysis & ALARA Work Practice

    SciTech Connect

    NELSEN, D.P.

    2002-09-01

    ALARA work practices have traditionally centered on reducing radiological exposure and controlling contamination. As such, ALARA policies and procedures are not well suited to a wide range of chemical and human health issues. Assessing relative risk, identifying appropriate engineering/administrative controls and selecting proper Personal Protective Equipment (PPE) for non nuclear work activities extends beyond the limitations of traditional ALARA programs. Forging a comprehensive safety management program in today's (2002) work environment requires a disciplined dialog between health and safety professionals (e.g. safety, engineering, environmental, quality assurance, industrial hygiene, ALARA, etc.) and personnel working in the field. Integrating organizational priorities, maintaining effective pre-planning of work and supporting a team-based approach to safety management represents today's hallmark of safety excellence. Relying on the mandates of any single safety program does not provide industrial hygiene with the tools necessary to implement an integrated safety program. The establishment of tools and processes capable of sustaining a comprehensive safety program represents a key responsibility of industrial hygiene. Fluor Hanford has built integrated safety management around three programmatic attributes: (1) Integration of radiological, chemical and ergonomic issues under a single program. (2) Continuous improvement in routine communications among work planning/scheduling, job execution and management. (3) Rapid response to changing work conditions, formalized work planning and integrated worker involvement.

  4. Benthic macroinvertebrates in lake ecological assessment: A review of methods, intercalibration and practical recommendations.

    PubMed

    Poikane, Sandra; Johnson, Richard K; Sandin, Leonard; Schartau, Ann Kristin; Solimini, Angelo G; Urbanič, Gorazd; Arbačiauskas, Kęstutis; Aroviita, Jukka; Gabriels, Wim; Miler, Oliver; Pusch, Martin T; Timm, Henn; Böhmer, Jürgen

    2016-02-01

    Legislation in Europe has been adopted to determine and improve the ecological integrity of inland and coastal waters. Assessment is based on four biotic groups, including benthic macroinvertebrate communities. For lakes, benthic invertebrates have been recognized as one of the most difficult organism groups to use in ecological assessment, and hitherto their use in ecological assessment has been limited. In this study, we review and intercalibrate 13 benthic invertebrate-based tools across Europe. These assessment tools address different human impacts: acidification (3 methods), eutrophication (3 methods), morphological alterations (2 methods), and a combination of the last two (5 methods). For intercalibration, the methods were grouped into four intercalibration groups, according to the habitat sampled and putative pressure. Boundaries of the 'good ecological status' were compared and harmonized using direct or indirect comparison approaches. To enable indirect comparison of the methods, three common pressure indices and two common biological multimetric indices were developed for larger geographical areas. Additionally, we identified the best-performing methods based on their responsiveness to different human impacts. Based on these experiences, we provide practical recommendations for the development and harmonization of benthic invertebrate assessment methods in lakes and similar habitats. PMID:26580734

  5. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.
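    The "moment formulas for combinations of failure modes" mentioned above are not reproduced in this abstract; as a hedged illustration, independent failure modes combine as in the sketch below (function names and numbers are hypothetical, not from the report).

    ```python
    def combined_failure_prob(mode_probs):
        """P(at least one of several independent failure modes occurs)."""
        survive = 1.0
        for p in mode_probs:
            survive *= 1.0 - p
        return 1.0 - survive

    def combined_rate_moments(means, variances):
        """Mean and variance of a sum of independent failure-mode rates."""
        return sum(means), sum(variances)
    ```

    For example, two independent modes with failure probabilities 0.1 and 0.2 give a combined failure probability of 1 - 0.9 * 0.8 = 0.28.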

  6. [An analysis of professional practice as a means to enhance one's skills: theoretical basis, actual process and associated limitations].

    PubMed

    Lagadec, Anne Marie

    2009-06-01

    With the advent of skill-based approaches in the training of health professionals, trainers have been encouraged to make use of a methodology which analyses actual practice. This paper's main objective is to take a deeper look, from a theoretical perspective, at the process of analysing professional practice. This particular training method, which has been in use for a considerable amount of time in certain professional circles, has two "historical" sources: Balint Groups and Schön's Reflective Practice. I intend to concentrate on the latter, Reflective Practice approaches. These involve a group of peers working with a facilitator, whose objective is to consider the various work contexts associated with the participants. As a result, they are involved with a "hindsight approach", so that real work situations, the way in which practitioners have carried out a contextual analysis, how this has been interpreted and the strategies which have been put in place can be understood. So, what process do these approaches enable as regards developing professional skills? How does this evaluation of practice challenge knowledge, experience and representations? Finally, if this approach indeed facilitates a change in practice, what are the constraints? PMID:19642474

  7. Methods for Mediation Analysis with Missing Data

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Wang, Lijuan

    2013-01-01

    Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis including list wise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…
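    As a minimal sketch of the first of the four approaches (listwise deletion), the indirect effect a*b can be estimated with ordinary least squares, using residual regression for the b path; the helper names and the data in the usage line are hypothetical.

    ```python
    def ols_slope(x, y):
        """Slope of a simple least-squares regression of y on x."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        sxx = sum((xi - mx) ** 2 for xi in x)
        return sxy / sxx

    def residuals(x, y):
        """Residuals of y after regressing out x."""
        b = ols_slope(x, y)
        n = len(x)
        a0 = sum(y) / n - b * sum(x) / n
        return [yi - (a0 + b * xi) for xi, yi in zip(x, y)]

    def indirect_effect_listwise(x, m, y):
        """a*b mediation estimate, keeping only cases complete on X, M and Y."""
        rows = [r for r in zip(x, m, y) if None not in r]
        xs, ms, ys = (list(col) for col in zip(*rows))
        a = ols_slope(xs, ms)                                 # X -> M path
        b = ols_slope(residuals(xs, ms), residuals(xs, ys))   # M -> Y given X
        return a * b
    ```

    In the call `indirect_effect_listwise([0, 1, 2, 3, 4], [1, 1, 3, 7, None], [1, 1, 3, 7, 9])` the incomplete fifth case is dropped before estimation, which is exactly the information loss the abstract's comparison is concerned with.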

  8. Methods for Mediation Analysis with Missing Data

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Wang, Lijuan

    2013-01-01

    Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis including list wise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum

  9. The uniform asymptotic swallowtail approximation - Practical methods for oscillating integrals with four coalescing saddle points

    NASA Technical Reports Server (NTRS)

    Connor, J. N. L.; Curtis, P. R.; Farrelly, D.

    1984-01-01

    Methods that can be used in the numerical implementation of the uniform swallowtail approximation are described. An explicit expression for the approximation is presented to lowest order, showing that three problems must be overcome in practice before it can be applied to any given problem. It is shown that a recently developed quadrature method can be used for the accurate numerical evaluation of the swallowtail canonical integral and its partial derivatives; isometric plots of these are presented to illustrate some of their properties. The problem of obtaining the arguments of the swallowtail integral from an analytical function of its argument is then considered, and two methods of solving it are described. Finally, the asymptotic evaluation of the butterfly canonical integral is addressed.

  10. Degradation of learned skills. Effectiveness of practice methods on simulated space flight skill retention

    NASA Technical Reports Server (NTRS)

    Sitterley, T. E.; Berge, W. A.

    1972-01-01

    Manual flight control and emergency procedure task skill degradation was evaluated after time intervals of from 1 to 6 months. The tasks were associated with a simulated launch through the orbit insertion flight phase of a space vehicle. The results showed that acceptable flight control performance was retained for 2 months, rapidly deteriorating thereafter by a factor of 1.7 to 3.1 depending on the performance measure used. Procedural task performance showed unacceptable degradation after only 1 month, and exceeded an order of magnitude after 4 months. The effectiveness of static rehearsal (checklists and briefings) and dynamic warmup (simulator practice) retraining methods was compared for the two tasks. Static rehearsal effectively countered procedural skill degradation, while some combination with dynamic warmup appeared necessary for flight control skill retention. It was apparent that these differences between methods were not solely a function of task type or retraining method, but were a function of the performance measures used for each task.

  11. New method of analysis of crystallizer temperature profile using optical fiber DTS

    NASA Astrophysics Data System (ADS)

    Koudelka, Petr; Pápeš, Martin; Líner, Andrej; Látal, Jan; Šiška, Petr; Vašinek, Vladimír

    2012-01-01

    Continuous casting is a modern, advanced steel-production technology whose product, the blank, is an intermediate for further processing. One of the most important components of the whole process is the crystallizer. Most of the methods published and experimentally verified to date for analyzing the temperature profile of a crystallizer in operation rely on thermocouples or fiber Bragg gratings. A new, sophisticated approach is to use optical fiber DTS based on stimulated Raman scattering. This paper presents the first experimental measurements and the verification of the method that are necessary for its deployment in industrial practice.
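    In Raman-based DTS, temperature along the fiber is commonly recovered from the anti-Stokes/Stokes power ratio. The sketch below uses the textbook form of that relation with illustrative wavelengths and Raman shift; none of the parameter values are taken from this paper.

    ```python
    import math

    H = 6.62607015e-34   # Planck constant, J s
    C = 2.99792458e8     # speed of light, m/s
    K = 1.380649e-23     # Boltzmann constant, J/K

    def stokes_ratio(T, shift_cm=440.0, lam_s=1064e-9, lam_as=985e-9):
        """Anti-Stokes/Stokes power ratio at temperature T (kelvin)."""
        dE = H * C * shift_cm * 100.0          # Raman shift energy, J
        return (lam_s / lam_as) ** 4 * math.exp(-dE / (K * T))

    def temperature(R, shift_cm=440.0, lam_s=1064e-9, lam_as=985e-9):
        """Invert a measured ratio back to temperature (kelvin)."""
        dE = H * C * shift_cm * 100.0
        return dE / (K * (4 * math.log(lam_s / lam_as) - math.log(R)))
    ```

    The ratio grows monotonically with temperature, which is what lets a DTS interrogator map measured Raman backscatter into a temperature profile along the fiber.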

  12. Practical notes on local data-worth analysis

    NASA Astrophysics Data System (ADS)

    Finsterle, Stefan

    2015-12-01

    These notes discuss the usefulness, limitations, and potential pitfalls of using sensitivity indices as a means to evaluate data worth and to guide the formulation and solution of inverse problems. A sensitivity analysis examines changes in model output variables with respect to changes in model input parameters. It appears straightforward to use this information to select influential parameters that should be subjected to estimation by inverse modeling and to identify the observations that contain information about these parameters and thus may be useful as calibration points. However, the results of such a sensitivity analysis do not account for parameter correlations and redundancies in observations and may not properly distinguish between calibration and prediction targets if used as criteria that guide inverse modeling; they may thus yield misleading recommendations about parameter identifiability and data worth. These issues (and some remedies) are discussed using an illustrative example, in which we examine the value of data sets potentially used for the calibration of a geothermal reservoir model. These notes highlight the importance of carefully formulating the objectives of a simulation study, which controls the setup of the inverse problem and related data needs.
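    A minimal sketch of the kind of local sensitivity index discussed above, and of the correlation pitfall it can miss: finite-difference sensitivities are computed for a toy model in which two parameters enter only through their sum, so their individual indices look equally informative even though the parameters cannot be estimated separately. The model, function names, and values are hypothetical.

    ```python
    import math

    def jacobian(model, params, xs, eps=1e-6):
        """Finite-difference sensitivity matrix J[i][j] = d y_i / d p_j."""
        base = [model(params, x) for x in xs]
        J = [[0.0] * len(params) for _ in xs]
        for j in range(len(params)):
            p = list(params)
            p[j] += eps
            pert = [model(p, x) for x in xs]
            for i in range(len(xs)):
                J[i][j] = (pert[i] - base[i]) / eps
        return J

    def composite_sensitivity(J):
        """Root-mean-square sensitivity per parameter (a common data-worth index)."""
        n = len(J)
        return [math.sqrt(sum(J[i][j] ** 2 for i in range(n)) / n)
                for j in range(len(J[0]))]
    ```

    For `model = lambda p, x: (p[0] + p[1]) * x` the two columns of J are identical: both composite sensitivities are large, yet J^T J is singular, so the individual indices alone would mislead about identifiability, which is exactly the pitfall the notes describe.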

  13. Practical Application of Parallel Coordinates for Climate Model Analysis

    SciTech Connect

    Steed, Chad A; Shipman, Galen M; Thornton, Peter E; Ricciuto, Daniel M; Erickson III, David J; Branstetter, Marcia L

    2012-01-01

    The determination of relationships between climate variables and the identification of the most significant associations between them in various geographic regions is an important aspect of climate model evaluation. The EDEN visual analytics toolkit has been developed to aid such analysis by facilitating the assessment of multiple variables with respect to the amount of variability that can be attributed to specific other variables. EDEN harnesses the parallel coordinates visualization technique, augmented with graphical indicators of key descriptive statistics. A case study is presented that focuses on the Harvard Forest site (42.5378N Lat, 72.1715W Lon) and evaluates the Community Land Model Version 4 (CLM4). It is shown that model variables such as land water runoff are more sensitive to a particular set of environmental variables than to a suite of other inputs in the 88-variable analysis conducted. The approach presented here allows climate-domain scientists to focus on the most important variables in model evaluations.
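    Parallel-coordinates tools typically rescale each variable onto a common vertical axis before drawing. A minimal sketch of that preprocessing step is below; this is a generic illustration, not EDEN's actual code.

    ```python
    def normalize_columns(rows):
        """Min-max scale each column to [0, 1] for parallel-coordinates axes."""
        cols = list(zip(*rows))
        lo = [min(c) for c in cols]
        hi = [max(c) for c in cols]
        return [[(v - l) / (h - l) if h > l else 0.5
                 for v, l, h in zip(row, lo, hi)]
                for row in rows]
    ```

    Each row then becomes a polyline across the axes; constant columns are pinned to the axis midpoint so they do not divide by zero.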

  14. Image, measure, figure: a critical discourse analysis of nursing practices that develop children.

    PubMed

    Einboden, Rochelle; Rudge, Trudy; Varcoe, Colleen

    2013-07-01

    Motivated by discourses that link early child development and health, nurses engage in seemingly benign surveillance of children. These practices are based on knowledge claims and technologies of developmental science, which remain anchored in assumptions of the child body as an incomplete form with a universal developmental trajectory and inherent potentiality. This paper engages in a critical discursive analysis, drawing on Donna Haraway's conceptualizations of technoscience and figuration. Using a contemporary developmental screening tool from nursing practice, this analysis traces the effects of this tool through production, transformation, distribution, and consumption. It reveals how the techniques of imaging, abstraction, and measurement collide to fix the open, transformative child body in a figuration of the developing child. This analysis also demonstrates how technobiopower infuses nurses' understandings of children and structures developmentally appropriate expectations for children, parents, and nurses. Furthermore, it describes how practices that claim to facilitate healthy child development may inversely deprive children of agency and foster the production of normal or ideal children. An alternative ontological perspective is offered as a challenge to the individualism of developmental models and other dominant ideologies of development, as well as practices associated with these ideologies. In summary, this analysis argues that nurses must pay closer attention to how technobiopower infuses practices that monitor and promote child development. Fostering a critical understanding of the harmful implications of these practices is warranted and offers the space to conceive of human development in alternate and exciting ways. PMID:23745662

  15. METHODS FOR SAMPLING AND ANALYSIS OF BREATH

    EPA Science Inventory

    The research program surveyed and evaluated the methods and procedures used to identify and quantitate chemical constituents in human breath. Methods have been evaluated to determine their ease and rapidity, as well as cost, accuracy, and precision. During the evaluation, a secon...

  16. PIC (PRODUCTS OF INCOMPLETE COMBUSTION) ANALYSIS METHODS

    EPA Science Inventory

    The report gives results of method evaluations for products of incomplete combustion (PICs): 36 proposed PICs were evaluated by previously developed gas chromatography/flame ionization detection (GC/FID) and gas chromatography/mass spectroscopy (GC/MS) methods. It also gives resu...

  17. A catalog of automated analysis methods for enterprise models.

    PubMed

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of Business and Information Technologies elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models, this process can be complicated and omissions or miscalculations are very likely. This situation has fostered research on automated analysis methods for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels, so some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool. PMID:27047732

  18. Meta-research: Evaluation and Improvement of Research Methods and Practices.

    PubMed

    Ioannidis, John P A; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N

    2015-10-01

    As the scientific enterprise has grown in size and diversity, we need empirical evidence on the research process to test and apply interventions that make it more efficient and its results more reliable. Meta-research is an evolving scientific discipline that aims to evaluate and improve research practices. It includes thematic areas of methods, reporting, reproducibility, evaluation, and incentives (how to do, report, verify, correct, and reward science). Much work is already done in this growing field, but efforts to-date are fragmented. We provide a map of ongoing efforts and discuss plans for connecting the multiple meta-research efforts across science worldwide. PMID:26431313

  19. A practical adaptive-grid method for complex fluid-flow problems

    NASA Technical Reports Server (NTRS)

    Nakahashi, K.; Deiwert, G. S.

    1984-01-01

    A practical solution-adaptive grid method utilizing a tension- and torsion-spring analogy is proposed for multidimensional fluid-flow problems. The tension spring, which connects adjacent grid points, controls grid spacing. The torsion spring, attached to each grid node, controls the inclination of coordinate lines and grid skewness. A marching procedure is used that results in a simple tridiagonal system of equations along each coordinate line to determine the grid-point distribution. Multidirectional adaptation is achieved by successive applications of one-dimensional adaptation. Examples of applications to axisymmetric afterbody flow fields and two-dimensional transonic airfoil flow fields are shown.
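    The tension-spring part of the analogy can be sketched in one dimension: spring equilibrium between nodes yields a tridiagonal system, solved here with the Thomas algorithm, and stiffer springs (e.g., where the solution gradient is large) pull grid points closer together. This is an illustrative reconstruction under those assumptions, not the authors' method.

    ```python
    def adapt_grid(stiffness):
        """Equilibrium node positions of a chain of tension springs on [0, 1].

        stiffness[i] is the spring constant between nodes i and i+1; there are
        len(stiffness) + 1 nodes, with x_0 = 0 and x_n = 1 held fixed.
        """
        n = len(stiffness)
        # Tridiagonal system for interior nodes x_1 .. x_{n-1}:
        #   k_{i-1} x_{i-1} - (k_{i-1} + k_i) x_i + k_i x_{i+1} = 0
        a = [0.0] * (n - 1); b = [0.0] * (n - 1)
        c = [0.0] * (n - 1); d = [0.0] * (n - 1)
        for i in range(1, n):
            a[i - 1] = stiffness[i - 1]
            b[i - 1] = -(stiffness[i - 1] + stiffness[i])
            c[i - 1] = stiffness[i]
        d[-1] -= stiffness[n - 1] * 1.0   # right boundary x_n = 1
        # Thomas algorithm: forward elimination, then back substitution
        for i in range(1, n - 1):
            m = a[i] / b[i - 1]
            b[i] -= m * c[i - 1]
            d[i] -= m * d[i - 1]
        x = [0.0] * (n - 1)
        x[-1] = d[-1] / b[-1]
        for i in range(n - 3, -1, -1):
            x[i] = (d[i] - c[i] * x[i + 1]) / b[i]
        return [0.0] + x + [1.0]
    ```

    Uniform stiffness recovers a uniform grid, while `adapt_grid([2.0, 1.0])` places the interior node at 1/3, clustering points toward the stiffer (left) spring.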

  20. Meta-research: Evaluation and Improvement of Research Methods and Practices

    PubMed Central

    Ioannidis, John P. A.; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N.

    2015-01-01

    As the scientific enterprise has grown in size and diversity, we need empirical evidence on the research process to test and apply interventions that make it more efficient and its results more reliable. Meta-research is an evolving scientific discipline that aims to evaluate and improve research practices. It includes thematic areas of methods, reporting, reproducibility, evaluation, and incentives (how to do, report, verify, correct, and reward science). Much work is already done in this growing field, but efforts to-date are fragmented. We provide a map of ongoing efforts and discuss plans for connecting the multiple meta-research efforts across science worldwide. PMID:26431313