Science.gov

Sample records for practical analysis method

  1. A practical exergy method for system analysis

    SciTech Connect

    Feng, X.; Zhu, X.X.; Zheng, J.P.

    1996-12-31

Conventional exergy analysis can only provide information about the potential for improving the performance of processes; it cannot state whether the possible improvement is practicable and economic. In this paper, a new method is proposed based on a better understanding of process performance and improvement. This is achieved by dividing exergy losses into avoidable and inevitable losses. The inevitable exergy losses are defined as the minimum exergy losses that cannot be avoided, in both technical and economic terms, for a process to take place. Once the inevitable exergy losses are determined, attention can be focused on the avoidable exergy losses, which can be converted into useful work. Based on this improved understanding, a new practical exergy efficiency is defined and a new graphic representation is proposed. An example of a steam power plant is given using the new method, which shows its advantages over the conventional method.
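The avoidable/inevitable split described in this abstract lends itself to a short numerical sketch. The exact form of the paper's practical exergy efficiency is not given in the abstract, so the definition below (useful output divided by input exergy minus inevitable losses) is an illustrative assumption:

```python
def practical_exergy_efficiency(exergy_in, useful_out, inevitable_loss):
    """Hypothetical 'practical' efficiency: useful output over the
    exergy that could in principle be converted (input minus the
    inevitable losses). Also returns the avoidable loss."""
    total_loss = exergy_in - useful_out
    avoidable_loss = total_loss - inevitable_loss
    eta = useful_out / (exergy_in - inevitable_loss)
    return eta, avoidable_loss

# e.g. 100 MW of exergy in, 40 MW of useful work out,
# 35 MW of the 60 MW total loss judged inevitable
eta, avoidable = practical_exergy_efficiency(100.0, 40.0, 35.0)
```

On these illustrative numbers the conventional efficiency would be 40%, while the practical efficiency (measured against only the convertible exergy) is about 62%, with 25 MW of loss flagged as avoidable.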

  2. A practical method for the analysis of meteor spectra

    E-print Network

    Dubs, Martin

    2015-01-01

    The analysis of meteor spectra (photographic, CCD or video recording) is complicated by the fact that spectra obtained with objective gratings are curved and have a nonlinear dispersion. In this paper it is shown that with a simple image transformation the spectra can be linearized in such a way that individual spectra over the whole image plane are parallel and have a constant, linear dispersion. This simplifies the identification and measurement of meteor spectral lines. A practical method is given to determine the required image transformation.
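The linearization step can be illustrated for a single one-dimensional spectrum; the paper's transformation acts on the whole image plane, so this resampling onto a constant-dispersion grid is only a simplified sketch with hypothetical names:

```python
import numpy as np

def linearize_spectrum(wavelengths, intensity, dl):
    """Resample one spectrum, measured on a nonlinear wavelength scale
    given by the calibration lambda(pixel), onto a grid of constant
    dispersion dl (an illustrative 1-D analogue of the image transform)."""
    grid = np.arange(wavelengths.min(), wavelengths.max(), dl)
    return grid, np.interp(grid, wavelengths, intensity)

# quadratic (nonlinear) dispersion: lambda = 400 + 2*px + 0.01*px^2 nm
px = np.arange(100.0)
grid, lin = linearize_spectrum(400.0 + 2.0 * px + 0.01 * px**2,
                               np.sin(px / 10.0), 1.0)
```

After resampling, every sample is separated by exactly `dl` in wavelength, which is what makes line identification straightforward.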

  3. A topography analysis incorporated optimization method for the selection and placement of best management practices.

    PubMed

    Shen, Zhenyao; Chen, Lei; Xu, Liang

    2013-01-01

Best Management Practices (BMPs) are one of the most effective methods to control nonpoint source (NPS) pollution at a watershed scale. In this paper, the use of a topography analysis incorporated optimization method (TAIOM) was proposed, which integrates topography analysis with cost-effective optimization. The surface status, slope and the type of land use were evaluated as inputs for the optimization engine. A genetic algorithm program was coded to obtain the final optimization. The TAIOM was validated in conjunction with the Soil and Water Assessment Tool (SWAT) in the Yulin watershed in Southwestern China. The results showed that the TAIOM was more cost-effective than traditional optimization methods. The distribution of selected BMPs throughout landscapes comprising relatively flat plains and gentle slopes suggests the need for a more operationally effective scheme, such as the TAIOM, to determine the practicability of BMPs before widespread adoption. The TAIOM developed in this study can easily be extended to other watersheds to help decision makers control NPS pollution. PMID:23349917

  4. A Topography Analysis Incorporated Optimization Method for the Selection and Placement of Best Management Practices

    PubMed Central

    Shen, Zhenyao; Chen, Lei; Xu, Liang

    2013-01-01

Best Management Practices (BMPs) are one of the most effective methods to control nonpoint source (NPS) pollution at a watershed scale. In this paper, the use of a topography analysis incorporated optimization method (TAIOM) was proposed, which integrates topography analysis with cost-effective optimization. The surface status, slope and the type of land use were evaluated as inputs for the optimization engine. A genetic algorithm program was coded to obtain the final optimization. The TAIOM was validated in conjunction with the Soil and Water Assessment Tool (SWAT) in the Yulin watershed in Southwestern China. The results showed that the TAIOM was more cost-effective than traditional optimization methods. The distribution of selected BMPs throughout landscapes comprising relatively flat plains and gentle slopes suggests the need for a more operationally effective scheme, such as the TAIOM, to determine the practicability of BMPs before widespread adoption. The TAIOM developed in this study can easily be extended to other watersheds to help decision makers control NPS pollution. PMID:23349917
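The abstract's genetic-algorithm optimization can be sketched as a toy subset-selection problem: one bit per candidate BMP site, with fitness penalizing plans that miss a pollutant-reduction target. All names and parameters here are illustrative, not the TAIOM implementation:

```python
import random

def ga_select_bmps(costs, reductions, target, pop=40, gens=60, seed=1):
    """Toy genetic algorithm: one bit per candidate BMP site; minimize
    total cost with a heavy penalty for missing the reduction target."""
    rng = random.Random(seed)
    n = len(costs)

    def fitness(bits):  # lower is better
        cost = sum(c for c, b in zip(costs, bits) if b)
        red = sum(r for r, b in zip(reductions, bits) if b)
        return cost + max(0.0, target - red) * 1e3

    popn = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=fitness)
        elite = popn[: pop // 2]            # keep the better half
        children = []
        while len(children) < pop - len(elite):
            a, b = rng.sample(elite, 2)     # one-point crossover
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:          # occasional bit-flip mutation
                child[rng.randrange(n)] ^= 1
            children.append(child)
        popn = elite + children
    best = min(popn, key=fitness)
    return best, fitness(best)

costs, reductions = [4, 3, 6, 2], [5, 4, 7, 1]
best, score = ga_select_bmps(costs, reductions, target=8.0)
```

The real TAIOM additionally feeds topographic inputs (surface status, slope, land use) into the fitness evaluation via SWAT simulations; the penalty formulation above only mimics the cost-versus-effectiveness trade-off.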

  5. Practical implementation of an accurate method for multilevel design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.

    1987-01-01

Solution techniques for handling large scale engineering optimization problems are reviewed. Potentials for practical applications as well as their limited capabilities are discussed. A new solution algorithm for design sensitivity is proposed. The algorithm is based upon the multilevel substructuring concept coupled with the adjoint method of sensitivity analysis. There are no approximations involved in the present algorithm except the usual approximations introduced by the discretization of the finite element model. Results from the six- and thirty-bar planar truss problems show that the proposed multilevel scheme for sensitivity analysis is more effective (in terms of in-core computer memory and total CPU time) than a conventional (one-level) scheme even on small problems. The new algorithm is expected to perform better on larger problems, and its application on the new generation of computer hardware with parallel-processing capability is very promising.

  6. Methods and practices used in incident analysis in the Finnish nuclear power industry.

    PubMed

    Suksi, Seija

    2004-07-26

According to the Finnish Nuclear Energy Act, it is the licensee's responsibility to ensure the safe use of nuclear energy. The Radiation and Nuclear Safety Authority (STUK) is the regulatory body responsible for state supervision of the safe use of nuclear power in Finland. One essential prerequisite for the safe and reliable operation of nuclear power plants is that lessons are learned from operational experience. It is the utility's prime responsibility to assess operational events and implement appropriate corrective actions. STUK controls the licensees' operational experience feedback arrangements and their implementation as part of its inspection activities. In addition, the Finnish regulatory body performs its own assessment of operational experience. Review and investigation of operational events is part of the regulatory oversight of operational safety. STUK reviews operational events at essentially three levels. The first step is a general review of all operational event, transient and reactor scram reports, which the licensees submit to STUK for information. The second level involves clarification of events at the site and entry of event-specific data into STUK's event register database. This is done for events that meet the set criteria obliging the operator to submit a special report to STUK for approval. The safety significance of operational events is determined using probabilistic safety assessment (PSA) techniques. The risk significance of events and the number of safety-significant events are tracked by STUK indicators. The final step in STUK's operational event assessment is to assign its own investigation team to events deemed to have special importance, especially when the licensee's organisation has not operated as planned. STUK launches its own detailed investigation about once a year on average. 
An analysis and evaluation of the event investigation methods applied at STUK and at the two Finnish nuclear power plant operators, Teollisuuden Voima Oy (TVO) and Fortum Power and Heat Oy (Fortum), was carried out by the Technical Research Centre of Finland (VTT) at the request of STUK at the end of the 1990s. The study aimed to provide a broad overview of, and suggestions for improving, the whole organisational framework supporting event investigation practices at the regulatory body and at the utilities. The main objective of the research was to evaluate the adequacy and reliability of event investigation analysis methods and practices in the Finnish nuclear power industry and, based on the results, to develop them further. The results and suggestions of the research are reviewed in the paper, and the corrective actions implemented in event investigation and operating experience procedures, both at STUK and at the utilities, are discussed as well. STUK has developed its own procedure for the risk-informed analysis of nuclear power plant events. The PSA-based event analysis method is used to assess the safety significance and importance measures associated with the unavailability of components and systems subject to Technical Specifications. Insights from recently performed PSA-based analyses are also briefly discussed in the paper. PMID:15231350

  7. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory--Determination of Trihalomethane Formation Potential, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Fram, Miranda S.; Bush, Noel

    2004-01-01

    An analytical method for the determination of the trihalomethane formation potential of water samples has been developed. The trihalomethane formation potential is measured by dosing samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine, and then analyzing the resulting trihalomethanes by purge and trap/gas chromatography equipped with an electron capture detector. Detailed explanations of the method and quality-control practices are provided. Method validation experiments showed that the trihalomethane formation potential varies as a function of time between sample collection and analysis, residual-free chlorine concentration, method of sample dilution, and the concentration of bromide in the sample.

  8. A Practical Method Of Machine Tool Condition Monitoring By Analysis Of Component Surface Finish Data

    NASA Astrophysics Data System (ADS)

    Hingle, H. T.

    1987-01-01

Random process analysis of component surface finish data is used to establish the 'fingerprint' of the machine tool condition when applied to a particular machining operation. Vibrations occurring during the machining process can be detected and their nature isolated. It is shown that for a turning operation it is possible to distinguish among types of error such as vibrations and errors causing radial movement of the cutting tool, variation in feed rate, vertical vibration of the cutter or component, and worn bearings. Existing condition monitoring methods rely on expensive vibration analysers and skilled personnel to assess the results and judge the machine tool's capability. This means that a machine must be taken off line to be checked and hence cannot be continually assessed. Random process analysis of the surface texture produced on the component permits condition monitoring and assessment of machine capability during production runs. The control parameters are not based on an arbitrary judgement but on maintaining an acceptable quality of component according to its specification. This method effectively closes the control loop around the component: it modifies the control parameters to meet the required precision for the component and assesses whether the machine capability is acceptable.
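The 'random process analysis' of a surface profile can be illustrated with a simple power spectrum: a peak at a spatial frequency tied to the feed rate or to a vibration mode is the kind of fingerprint the abstract describes. This sketch is illustrative, not the author's procedure:

```python
import numpy as np

def surface_spectrum(profile, dx):
    """One-sided power spectrum of a roughness profile sampled every
    dx (e.g. mm). Peaks reveal periodic components of the surface,
    such as feed marks or vibration-induced waviness."""
    z = profile - np.mean(profile)          # remove the mean level
    power = np.abs(np.fft.rfft(z)) ** 2 / len(z)
    freqs = np.fft.rfftfreq(len(z), d=dx)   # cycles per unit length
    return freqs, power

# synthetic profile with a waviness of 5 cycles per unit length
x = np.arange(256) / 256.0
freqs, power = surface_spectrum(np.sin(2 * np.pi * 5 * x), 1.0 / 256.0)
```

Here the dominant spectral peak sits exactly at the 5 cycles-per-unit component that was put into the synthetic profile.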

  9. Animal Disease Import Risk Analysis--a Review of Current Methods and Practice.

    PubMed

    Peeler, E J; Reese, R A; Thrush, M A

    2015-10-01

    The application of risk analysis to the spread of disease with international trade in animals and their products, that is, import risk analysis (IRA), has been largely driven by the Sanitary and Phytosanitary (SPS) agreement of the World Trade Organization (WTO). The degree to which the IRA standard established by the World Organization for Animal Health (OIE), and associated guidance, meets the needs of the SPS agreement is discussed. The use of scenario trees is the core modelling approach used to represent the steps necessary for the hazard to occur. There is scope to elaborate scenario trees for commodity IRA so that the quantity of hazard at each step is assessed, which is crucial to the likelihood of establishment. The dependence between exposure and establishment suggests that they should fall within the same subcomponent. IRA undertaken for trade reasons must include an assessment of consequences to meet SPS criteria, but guidance is sparse. The integration of epidemiological and economic modelling may open a path for better methods. Matrices have been used in qualitative IRA to combine estimates of entry and exposure, and consequences with likelihood, but this approach has flaws and better methods are needed. OIE IRA standards and guidance indicate that the volume of trade should be taken into account, but offer no detail. Some published qualitative IRAs have assumed current levels and patterns of trade without specifying the volume of trade, which constrains the use of IRA to determine mitigation measures (to reduce risk to an acceptable level) and whether the principle of equivalence, fundamental to the SPS agreement, has been observed. It is questionable whether qualitative IRA can meet all the criteria set out in the SPS agreement. Nevertheless, scope exists to elaborate the current standards and guidance, so they better serve the principle of science-based decision-making. PMID:24237667
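Along a single branch, a scenario tree reduces to multiplying conditional step probabilities (entry, exposure, establishment) and accounting for trade volume. The sketch below is a deliberate simplification (independent units), not the OIE formalism:

```python
def scenario_tree_risk(step_probs, n_units):
    """Probability that at least one of n_units imported units realizes
    the hazard, given the conditional probabilities along one
    scenario-tree branch. Assumes steps multiply and units are
    independent, which is a common simplification."""
    p_unit = 1.0
    for p in step_probs:
        p_unit *= p
    return 1.0 - (1.0 - p_unit) ** n_units
```

This also shows why the abstract stresses trade volume: with a per-unit probability of 0.05, the overall risk roughly doubles when the number of imported units doubles at low probabilities.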

  10. Parameterizing land use planning : deploying quantitative analysis methods in the practice of city planning

    E-print Network

    Kaufmann, Talia

    2014-01-01

    Planning a city is a complex task. In particular, the practice of land use planning, which determines the quantities and locations of land uses we find in a city, is a highly complex process. Planners, developers and ...

  11. A Practical Escape and Effect Analysis for Building Lightweight Method Summaries

    E-print Network

    Rugina, Radu

    of method summaries, thus avoiding the costly inter-procedural computations or imprecise assumptions, at the expense of requiring method summaries to be provided from an external source. Summaries can be used either to statically enforce a desired side-effect discipline, or for program understanding purposes. This paper makes

  12. Practical Thermal Evaluation Methods For HAC Fire Analysis In Type B Radioactive Material (RAM) Packages

    SciTech Connect

    Abramczyk, Glenn; Hensel, Stephen J; Gupta, Narendra K.

    2013-03-28

Title 10 of the United States Code of Federal Regulations, Part 71 (10 CFR 71.73), of the Nuclear Regulatory Commission requires that Type B radioactive material (RAM) packages satisfy certain Hypothetical Accident Conditions (HAC) thermal design requirements to ensure package safety during accidental fire conditions. Compliance with the thermal design requirements can be demonstrated by prototype tests, by analyses only, or by a combination of tests and analyses. Normally, it is impractical to meet all the HAC requirements using tests alone, and purely analytical methods are too complex due to the multi-physics, non-linear nature of the fire event. Therefore, a combination of tests and thermal analysis methods using commercial heat-transfer software is used to meet the necessary design requirements. The authors, along with their colleagues at Savannah River National Laboratory in Aiken, SC, USA, have successfully used this 'tests and analyses' approach in the design and certification of several United States DOE/NNSA certified packages, e.g. 9975, 9977, 9978, 9979, H1700, and the Bulk Tritium Shipping Package (BTSP). This paper describes these methods in the hope that Type B RAM package designers and analysts can use them in their applications.

  13. A practical method for assessing cadmium levels in soil using the DTPA extraction technique with graphite furnace analysis

    SciTech Connect

    Bailey, V.L.; Grant, C.A.; Bailey, L.D.

    1995-09-01

    Using the DTPA extraction procedure and a graphite furnace atomic absorption spectrophotometer, a practical method for determining soil cadmium levels was developed. Furnace parameters, instrument parameters, solvent dilution factor, and solvent characteristics were determined using experimental field samples and standardized control samples. The DTPA extraction method gave reproducible results and removed approximately 20 to 60% of total soil cadmium. 14 refs., 5 tabs.

  14. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: practical guidelines for the crystalline sponge method

    SciTech Connect

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng

    2015-01-01

    This report describes complete practical guidelines and insights for the crystalline sponge method, which have been derived through the first use of synchrotron radiation on these systems, and includes a procedure for faster synthesis of the sponges. These guidelines will be applicable to crystal sponge data collected at synchrotrons or in-house facilities, and will allow researchers to obtain reliable high-quality data and construct chemically and physically sensible models for guest structural determination. A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. 
In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high-quality data, and will allow construction of chemically and physically sensible models for guest structural determination.

  15. Insight into Evaluation Practice: A Content Analysis of Designs and Methods Used in Evaluation Studies Published in North American Evaluation-Focused Journals

    ERIC Educational Resources Information Center

    Christie, Christina A.; Fleischer, Dreolin Nesbitt

    2010-01-01

    To describe the recent practice of evaluation, specifically method and design choices, the authors performed a content analysis on 117 evaluation studies published in eight North American evaluation-focused journals for a 3-year period (2004-2006). The authors chose this time span because it follows the scientifically based research (SBR)…

  16. [Practical risk analysis].

    PubMed

    Lisbona, A; Valero, M

    2015-10-01

Risk analysis is typically considered from two complementary points of view: predictive analysis, performed beforehand, and retrospective analysis, which follows the internal reporting of adverse situations or malfunctions, covering organizational as well as material and human aspects. The purpose of these complementary analyses is to ensure that planned or implemented measures keep risks at a level deemed tolerable or acceptable at a given time and in a given situation. Where a risk is deemed unacceptable, risk-reduction measures should be considered (prevention, limitation of consequences, and protection). PMID:26362221

  17. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory-- Determination of Dissolved Organic Carbon in Water by High Temperature Catalytic Oxidation, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Bird, Susan M.; Fram, Miranda S.; Crepeau, Kathryn L.

    2003-01-01

    An analytical method has been developed for the determination of dissolved organic carbon concentration in water samples. This method includes the results of the tests used to validate the method and the quality-control practices used for dissolved organic carbon analysis. Prior to analysis, water samples are filtered to remove suspended particulate matter. A Shimadzu TOC-5000A Total Organic Carbon Analyzer in the nonpurgeable organic carbon mode is used to analyze the samples by high temperature catalytic oxidation. The analysis usually is completed within 48 hours of sample collection. The laboratory reporting level is 0.22 milligrams per liter.

  18. Systematic Review and Meta-Analysis of Studies Evaluating Diagnostic Test Accuracy: A Practical Review for Clinical Researchers-Part II. Statistical Methods of Meta-Analysis

    PubMed Central

    Lee, Juneyoung; Kim, Kyung Won; Choi, Sang Hyun; Huh, Jimi

    2015-01-01

Meta-analysis of diagnostic test accuracy studies differs from the usual meta-analysis of therapeutic/interventional studies in that it requires simultaneously analyzing a pair of outcome measures, such as sensitivity and specificity, instead of a single outcome. Since sensitivity and specificity are generally inversely correlated and can be affected by a threshold effect, more sophisticated statistical methods are required for the meta-analysis of diagnostic test accuracy. Hierarchical models, including the bivariate model and the hierarchical summary receiver operating characteristic model, are increasingly accepted as standard methods for meta-analysis of diagnostic test accuracy studies. We provide a conceptual review of statistical methods currently used and recommended for meta-analysis of diagnostic test accuracy studies. This article can serve as a methodological reference for those who perform systematic reviews and meta-analyses of diagnostic test accuracy studies. PMID:26576107
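The paired outcomes a bivariate model consumes are the per-study sensitivity and specificity, usually on the logit scale. Fitting the hierarchical model itself needs specialized software; a minimal sketch of the per-study preprocessing, with a conventional continuity correction, might look like:

```python
import math

def study_accuracy(tp, fp, fn, tn, cc=0.5):
    """Per-study sensitivity/specificity and their logits: the paired
    outcomes a bivariate meta-analysis model operates on. cc is a
    conventional continuity correction applied when any 2x2 cell is
    zero, so the logits stay finite."""
    if 0 in (tp, fp, fn, tn):
        tp, fp, fn, tn = (x + cc for x in (tp, fp, fn, tn))
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    logit = lambda p: math.log(p / (1.0 - p))
    return sens, spec, logit(sens), logit(spec)

sens, spec, lsens, lspec = study_accuracy(90, 10, 10, 90)
```

Plotting the (logit sensitivity, logit specificity) pairs across studies is also what exposes the inverse correlation and threshold effect the abstract mentions.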

  19. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: practical guidelines for the crystalline sponge method.

    PubMed

    Ramadhar, Timothy R; Zheng, Shao Liang; Chen, Yu Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal-organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. 
The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high-quality data, and will allow construction of chemically and physically sensible models for guest structural determination. PMID:25537388

  20. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: practical guidelines for the crystalline sponge method

    PubMed Central

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. 
The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high-quality data, and will allow construction of chemically and physically sensible models for guest structural determination. PMID:25537388

  1. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: Practical guidelines for the crystalline sponge method

    DOE PAGES

    Ramadhar, Timothy R.; Zheng, Shao -Liang; Chen, Yu -Sheng; Clardy, Jon

    2015-01-01

A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. 
The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high-quality data, and will allow construction of chemically and physically sensible models for guest structural determination.

  2. Doing Conversation Analysis: A Practical Guide.

    ERIC Educational Resources Information Center

    ten Have, Paul

    Noting that conversation analysis (CA) has developed into one of the major methods of analyzing speech in the disciplines of communications, linguistics, anthropology and sociology, this book demonstrates in a practical way how to become a conversation analyst. As well as providing an overall introduction to the approach, it focuses on the…

  3. Practical state of health estimation of power batteries based on Delphi method and grey relational grade analysis

    NASA Astrophysics Data System (ADS)

    Sun, Bingxiang; Jiang, Jiuchun; Zheng, Fangdan; Zhao, Wei; Liaw, Bor Yann; Ruan, Haijun; Han, Zhiqiang; Zhang, Weige

    2015-05-01

State of health (SOH) estimation is critical to the battery management system to ensure the safety and reliability of EV battery operation. Here, we used a unique hybrid approach to enable complex SOH estimations. The approach hybridizes the Delphi method, known for its simplicity and effectiveness in applying weighting factors for complicated decision-making, and grey relational grade analysis (GRGA) for multi-factor optimization. Six critical factors were considered for SOH estimation: peak power at 30% state-of-charge (SOC); capacity; the voltage drop at 30% SOC with a C/3 pulse; the temperature rises at the end of discharge and of charge at 1C, respectively; and the open-circuit voltage at the end of charge after a 1-h rest. The weighting of these factors for SOH estimation was scored by the 'experts' in the Delphi method, indicating the influence of each factor on SOH. The parameters for these factors expressing the battery state variations are optimized by GRGA. Eight battery cells were used to illustrate the principle and methodology of estimating SOH by this hybrid approach, and the results were compared with those based on capacity and power capability. The contrast among the different SOH estimates is discussed.
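Grey relational grade analysis follows a standard recipe: normalize each factor, compute grey relational coefficients against a reference series, then combine them with the (here, Delphi-derived) weights. The sketch below assumes min-max normalization and the usual distinguishing coefficient rho = 0.5; the paper's exact choices may differ:

```python
import numpy as np

def grey_relational_grade(X, ref, weights, rho=0.5):
    """Grey relational grade of each candidate (row of X) against a
    reference series. Steps: min-max normalize, absolute deviations,
    grey relational coefficients (dmin + rho*dmax)/(delta + rho*dmax),
    then a weighted average per candidate."""
    X = np.asarray(X, float)
    ref = np.asarray(ref, float)
    allv = np.vstack([X, ref])
    lo, hi = allv.min(axis=0), allv.max(axis=0)
    rng = np.where(hi > lo, hi - lo, 1.0)   # avoid /0 for constant factors
    Xn, rn = (X - lo) / rng, (ref - lo) / rng
    delta = np.abs(Xn - rn)
    dmin, dmax = delta.min(), delta.max()
    coef = (dmin + rho * dmax) / (delta + rho * dmax)
    w = np.asarray(weights, float)
    return coef @ (w / w.sum())

# two candidate cells scored against a reference state on three factors
grades = grey_relational_grade([[1.0, 2.0, 3.0], [3.0, 4.0, 1.0]],
                               [1.0, 2.0, 3.0], [1, 1, 1])
```

A candidate identical to the reference gets a grade of 1; the more a cell's factors deviate from the reference state, the lower its grade, which is what makes the grade usable as an SOH-style ranking score.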

  4. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: Practical guidelines for the crystalline sponge method

    SciTech Connect

    Ramadhar, Timothy R.; Zheng, Shao -Liang; Chen, Yu -Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. 
The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high-quality data, and will allow construction of chemically and physically sensible models for guest structural determination.

  5. Project evaluation : a practical asset pricing method

    E-print Network

    Jacoby, Henry D.

    1992-01-01

    This paper presents a practical approach to project evaluation using techniques of modern financial economics, with a sample application to oil development under a complex tax system. The method overcomes shortcomings of ...

  6. Exergy analysis: Principles and practice

    SciTech Connect

    Moran, M.J. (Dept. of Mechanical Engineering); Sciubba, E. (Dipt. di Meccanica e Aeronautica)

    1994-04-01

    The importance of the goal of developing systems that effectively use nonrenewable energy resources such as oil, natural gas, and coal is apparent. The method of exergy analysis is well suited for furthering this goal, for it enables the location, type and true magnitude of waste and loss to be determined. Such information can be used to design new systems and to reduce the inefficiency of existing systems. This paper provides a brief survey of both exergy principles and the current literature of exergy analysis with emphasis on areas of application.

  7. A Method for Optimizing Waste Management and Disposal Practices Using a Group-Based Uncertainty Model for the Analysis of Characterization Data - 13191

    SciTech Connect

    Simpson, A.; Clapham, M.; Lucero, R.; West, J.

    2013-07-01

    It is a universal requirement for characterization of radioactive waste, that the consignor shall calculate and report a Total Measurement Uncertainty (TMU) value associated with each of the measured quantities such as nuclide activity. For Non-destructive Assay systems, the TMU analysis is typically performed on an individual container basis. However, in many cases, the waste consignor treats, transports, stores and disposes of containers in groups for example by over-packing smaller containers into a larger container or emplacing containers into groups for final disposal. The current standard practice for container-group data analysis is usually to treat each container as independent and uncorrelated and use a simple summation / averaging method (or in some cases summation of TMU in quadrature) to define the overall characteristics and associated uncertainty in the container group. In reality, many groups of containers are assayed on the same system, so there will be a large degree of co-dependence in the individual uncertainty elements. Many uncertainty terms may be significantly reduced when addressing issues such as the source position and variability in matrix contents over large populations. The systematic terms encompass both inherently 'two-directional' random effects (e.g. variation of source position) and other terms that are 'one-directional' i.e. designed to account for potential sources of bias. An analysis has been performed with population groups of a variety of non-destructive assay platforms in order to define a quantitative mechanism for waste consignors to determine overall TMU for batches of containers that have been assayed on the same system. (authors)
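
    As a toy illustration of why the grouping matters, with invented numbers and a deliberately simplified error model rather than the paper's actual analysis, compare treating all uncertainty terms as independent (quadrature everywhere) with treating the systematic term as fully correlated across containers assayed on the same system:

```python
import math

def group_tmu(random_sigmas, systematic_sigmas):
    """Group TMU when random components are independent but the
    systematic components are fully correlated (shared system bias)."""
    random_part = math.sqrt(sum(s ** 2 for s in random_sigmas))
    systematic_part = sum(systematic_sigmas)  # correlated terms add linearly
    return math.sqrt(random_part ** 2 + systematic_part ** 2)

random_sigmas = [2.0, 3.0, 1.5]      # per-container random TMU (invented)
systematic_sigmas = [1.0, 1.0, 1.0]  # shared calibration bias per container

# Naive treatment: every term independent, everything in quadrature.
naive = math.sqrt(sum(r ** 2 + s ** 2
                      for r, s in zip(random_sigmas, systematic_sigmas)))
grouped = group_tmu(random_sigmas, systematic_sigmas)
```

    With a shared bias the correlated terms add linearly, so the group TMU exceeds the naive quadrature estimate; conversely, two-directional random terms can partially cancel over large populations.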

  8. Practical reconstruction method for bioluminescence tomography

    E-print Network

    Wang, Ge

    Practical reconstruction method for bioluminescence tomography. Wenxiang Cong, Ge Wang, et al., Bioluminescence Tomography Laboratory, University of Iowa, Iowa City, Iowa 52242, USA (ge-wang@ieee.org; cong@ct.radiology.uiowa.edu). Abstract: Bioluminescence tomography

  9. Evaluation of agricultural best-management practices in the Conestoga River headwaters, Pennsylvania; methods of data collection and analysis and description of study areas

    USGS Publications Warehouse

    Chichester, Douglas C.

    1988-01-01

    The U.S. Geological Survey is conducting a water-quality study as part of the nationally implemented Rural Clean Water Program in the headwaters of the Conestoga River, Pennsylvania. The study, which began in 1982, was designed to determine the effect of agricultural best management practices on surface- and groundwater quality. The study was concentrated in four areas within the intensively farmed, carbonate rock terrane located predominantly in Lancaster County, Pennsylvania. These areas were divided into three monitoring components: (1) a Regional study area (188 sq mi); (2) a Small Watershed study area (5.82 sq mi); and (3) two field-site study areas, Field-Site 1 (22.1 acres) and Field-Site 2 (47.5 acres). The types of water-quality data and the methods of data collection and analysis are presented. The monitoring strategy and descriptions of the study areas are discussed. The locations and descriptions of all data-collection sites at the four study areas are provided. (USGS)

  10. [Practice marketing. Data analysis of a urological group practice].

    PubMed

    Schneider, T; Schneider, B; Eisenhardt, A; Sperling, H

    2009-07-01

    The urological practice setting in Germany has changed tremendously over the last years. Group practices with two or more urologists working together are becoming more and more popular. At the same time, marketing has become essential even for urologists. To evaluate the patient flow to our group practice, we asked all new patients to fill out a questionnaire (n=2112). We also evaluated the efficacy of our recall system. The analysis showed that patients were 18-93 years old (mean 57 years), 68% being male and 32% female. The largest age group consisted of 41-50-year-olds. The most important reasons for choosing our practice were recommendations by general practitioners in 38%, recommendations by specialists in 11%, and recommendations by friends and relatives in 27%. Five percent of the patients chose the practice because of the Internet home page and 10% because of entries in various phone books. Three percent of the patients came because of newspaper articles about the practice owners, and <1% had attended patient presentations. The Internet was used mainly by 31-40-year-old patients. Our recall system showed an efficacy of 59%. In summary, a good reputation in the medical society as well as in the neighbourhood is still the best advertising for a urological practice. Phone books are increasingly becoming less important, and the Internet is increasingly attractive to the younger population. Recall systems can also be useful for urological practices. PMID:19387608

  11. Practical Considerations for Using Exploratory Factor Analysis in Educational Research

    ERIC Educational Resources Information Center

    Beavers, Amy S.; Lounsbury, John W.; Richards, Jennifer K.; Huck, Schuyler W.; Skolits, Gary J.; Esquivel, Shelley L.

    2013-01-01

    The uses and methodology of factor analysis are widely debated and discussed, especially the issues of rotational use, methods of confirmatory factor analysis, and adequate sample size. The variety of perspectives and often conflicting opinions can lead to confusion among researchers about best practices for using factor analysis. The focus of the…

  12. Practical method for balancing airplane moments

    NASA Technical Reports Server (NTRS)

    Hamburger, H

    1924-01-01

    The present contribution is the sequel to a paper written by Messrs. R. Fuchs, L. Hopf, and H. Hamburger, and proposes to show that the methods therein contained can be practically utilized in computations. Furthermore, the calculations leading up to the diagram of moments for three airplanes, whose performance in war service gave reason for complaint, are analyzed. Finally, it is shown what conclusions can be drawn from the diagram of moments with regard to the defects in these planes and what steps may be taken to remedy them.

  13. Methods for Temporal Analysis 

    E-print Network

    Hannan, Michael T; Tuma, Nancy Brandon

    2015-08-15

    Methods for Temporal Analysis. Technical Report #68. Michael T. Hannan and Nancy Brandon Tuma, October 1978. Laboratory for Social Research, Stanford University. Work on this paper was supported by grants from the National Institute of Education (NIE-G-76...

  14. An Online Forum As a Qualitative Research Method: Practical Issues

    PubMed Central

    Im, Eun-Ok; Chee, Wonshik

    2008-01-01

    Background: Despite the positive aspects of online forums as a qualitative research method, very little is known about the practical issues involved in using online forums for data collection, especially for a qualitative research project. Objectives: The purpose of this paper is to describe the practical issues that the researchers encountered in implementing an online forum as a qualitative component of a larger study on cancer pain experience. Method: Throughout the study process, the research staff recorded issues, ranging from minor technical problems to serious ethical dilemmas, as they arose and wrote memos about them. The memos and written records of discussions were reviewed and analyzed using the content analysis suggested by Weber. Results: Two practical issues related to credibility were identified: a high response and retention rate and automatic transcripts. An issue related to dependability was the participants' easy forgetfulness. The issues related to confirmability were difficulties in theoretical saturation and unstandardized computer and Internet jargon. A security issue related to hacking attempts was noted as well. Discussion: The analysis of these issues suggests several implications for future researchers who want to use online forums as a qualitative data collection method. PMID:16849979

  15. Achieving integration in mixed methods designs-principles and practices.

    PubMed

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-12-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs-exploratory sequential, explanatory sequential, and convergent-and through four advanced frameworks-multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

  16. Achieving Integration in Mixed Methods Designs—Principles and Practices

    PubMed Central

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-01-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

  17. A collection of research reporting, theoretical analysis, and practical applications in science education: Examining qualitative research methods, action research, educator-researcher partnerships, and constructivist learning theory

    NASA Astrophysics Data System (ADS)

    Hartle, R. Todd

    2007-12-01

    Educator-researcher partnerships are increasingly being used to improve the teaching of science. Chapter 1 provides a summary of the literature concerning partnerships, and examines the justification of qualitative methods in studying these relationships. It also justifies the use of Participatory Action Research (PAR). Empirically-based studies of educator-researcher partnership relationships are rare despite investments in their implementation by the National Science Foundation (NSF) and others. Chapter 2 describes a qualitative research project in which participants in an NSF GK-12 fellowship program were studied using informal observations, focus groups, personal interviews, and journals to identify and characterize the cultural factors that influenced the relationships between the educators and researchers. These factors were organized into ten critical axes encompassing a range of attitudes, behaviors, or values defined by two stereotypical extremes. These axes were: (1) Task Dictates Context vs. Context Dictates Task; (2) Introspection vs. Extroversion; (3) Internal vs. External Source of Success; (4) Prior Planning vs. Implementation Flexibility; (5) Flexible vs. Rigid Time Sense; (6) Focused Time vs. Multi-tasking; (7) Specific Details vs. General Ideas; (8) Critical Feedback vs. Encouragement; (9) Short Procedural vs. Long Content Repetition; and (10) Methods vs. Outcomes are Well Defined. Another ten important stereotypical characteristics, which did not fit the structure of an axis, were identified and characterized. The educator stereotypes were: (1) Rapport/Empathy; (2) Like Kids; (3) People Management; (4) Communication Skills; and (5) Entertaining. The researcher stereotypes were: (1) Community Collaboration; (2) Focus Intensity; (3) Persistent; (4) Pattern Seekers; and (5) Curiosity/Skeptical. Chapter 3 summarizes the research presented in chapter 2 into a practical guide for participants and administrators of educator-researcher partnerships. 
Understanding how to identify and evaluate constructivist lessons is the first step in promoting and improving constructivism in teaching. Chapter 4 summarizes a theoretically-generated series of practical criteria that define constructivism: (1) Eliciting Prior Knowledge, (2) Creating Cognitive Dissonance, (3) Application of New Knowledge with Feedback, and (4) Reflection on Learning, or Metacognition. These criteria can be used by any practitioner to evaluate the level of constructivism used in a given lesson or activity.

  18. Method of analysis and quality-assurance practices by the U. S. Geological Survey Organic Geochemistry Research Group; determination of four selected mosquito insecticides and a synergist in water using liquid-liquid extraction and gas chrom

    USGS Publications Warehouse

    Zimmerman, L.R.; Strahan, A.P.; Thurman, E.M.

    2001-01-01

    A method of analysis and quality-assurance practices were developed for the determination of four mosquito insecticides (malathion, methoprene, phenothrin, and resmethrin) and one synergist (piperonyl butoxide) in water. The analytical method uses liquid-liquid extraction (LLE) and gas chromatography/mass spectrometry (GC/MS). Good precision and accuracy were demonstrated in reagent water, urban surface water, and ground water. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 50 nanograms per liter ranged from 68 to 171 percent, with standard deviations in concentrations of 27 nanograms per liter or less. The method detection limit for all compounds was 5.9 nanograms per liter or less for 247-milliliter samples. This method is valuable for acquiring information about the fate and transport of these mosquito insecticides and one synergist in water.
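
    A method detection limit of the kind quoted above is conventionally computed as MDL = t(n-1, 99%) * s, the one-sided 99% Student t for n replicates times the standard deviation of n low-level spike measurements. The sketch below uses this standard formula with invented replicate values; it is not the laboratory's actual data.

```python
import statistics

# One-sided 99% Student t values keyed by replicate count n (df = n - 1),
# as used in the conventional MDL procedure.
T_99_ONE_SIDED = {7: 3.143, 8: 2.998}

def method_detection_limit(replicates):
    """MDL = t(n-1, 99%) * sample standard deviation of the replicates."""
    t = T_99_ONE_SIDED[len(replicates)]
    return t * statistics.stdev(replicates)

# Seven hypothetical low-level spike results, in nanograms per liter.
spikes_ng_per_l = [9.1, 10.4, 8.7, 9.9, 10.8, 9.3, 10.1]
mdl = method_detection_limit(spikes_ng_per_l)
```

    Seven replicates is the common minimum; more replicates lower the t value and usually tighten the limit.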

  19. 2008 practice analysis study of hand therapy.

    PubMed

    Dimick, Mary P; Caro, Carla M; Kasch, Mary C; Muenzen, Patricia M; Fullenwider, Lynnlee; Taylor, Patricia A; Landrieu, Keri; Walsh, J Martin

    2009-01-01

    In 2008, the Hand Therapy Certification Commission (HTCC), in consultation with Professional Examination Service, performed a practice analysis study of hand therapy, the fourth in a series of similar studies performed by HTCC over a 23-year period. An updated profile of the domains, tasks, knowledge, and techniques and tools used in hand therapy practice was developed by subject-matter experts representing a broad range of experiences and perspectives. A large-scale online survey of hand therapists from the United States, Canada, Australia, and New Zealand overwhelmingly validated this profile. Additionally, trends in hand therapy practice and education were explored and compared with the previous studies. The results led to the revision of the test specifications for the Hand Therapy Certification Examination; permitted refinement of the definition and scope of hand therapy; identified professional development and continuing education opportunities; and guided HTCC policy decisions regarding exam and recertification eligibility requirements. PMID:19726158

  20. Council on Certification Professional Practice Analysis.

    PubMed

    Zaglaniczny, K L

    1993-06-01

    The CCNA has completed a PPA and will begin implementing its recommendations with the December 1993 certification examination. The results of the PPA provide content validation for the CCNA certification examination. The certification examination is reflective of the knowledge and skill required for entry-level practice. Assessment of this knowledge is accomplished through the use of questions that are based on the areas represented in the content outline. Analysis of the PPA has resulted in changes in the examination content outline and percentages of questions in each area to reflect current entry-level nurse anesthesia practice. The new outline is based on the major domains of knowledge required for nurse anesthesia practice. These changes are justified by the consistency in the responses of the practitioners surveyed. There was overall agreement as to the knowledge and skills related to patient conditions, procedures, agents, techniques, and equipment that an entry-level CRNA must have to practice. Members of the CCNA and Examination Committee will use the revised outline to develop questions for the certification examination. The questions will be focused on the areas identified as requiring high levels of expertise and those that appeared higher in frequency. The PPA survey will be used as a basis for subsequent content validation studies. It will be revised to reflect new knowledge, technology, and techniques related to nurse anesthesia practice. The CCNA has demonstrated its commitment to the certification process through completion of the PPA and implementation of changes in the structure of the examination. PMID:8291387

  1. Method of analysis at the U.S. Geological Survey California Water Science Center, Sacramento Laboratory - determination of haloacetic acid formation potential, method validation, and quality-control practices

    USGS Publications Warehouse

    Zazzi, Barbara C.; Crepeau, Kathryn L.; Fram, Miranda S.; Bergamaschi, Brian A.

    2005-01-01

    An analytical method for the determination of haloacetic acid formation potential of water samples has been developed by the U.S. Geological Survey California Water Science Center Sacramento Laboratory. The haloacetic acid formation potential is measured by dosing water samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine. The haloacetic acids formed are bromochloroacetic acid, bromodichloroacetic acid, dibromochloroacetic acid, dibromoacetic acid, dichloroacetic acid, monobromoacetic acid, monochloroacetic acid, tribromoacetic acid, and trichloroacetic acid. They are extracted, methylated, and then analyzed using a gas chromatograph equipped with an electron capture detector. Method validation experiments were performed to determine the method accuracy, precision, and detection limit for each of the compounds. Method detection limits for these nine haloacetic acids ranged from 0.11 to 0.45 microgram per liter. Quality-control practices include the use of blanks, quality-control samples, calibration verification standards, surrogate recovery, internal standard, matrix spikes, and duplicates.

  2. Airphoto analysis of erosion control practices

    NASA Technical Reports Server (NTRS)

    Morgan, K. M.; Morris-Jones, D. R.; Lee, G. B.; Kiefer, R. W.

    1980-01-01

    The Universal Soil Loss Equation (USLE) is a widely accepted tool for erosion prediction and conservation planning. In this study, airphoto analysis of color and color infrared 70 mm photography at a scale of 1:60,000 was used to determine the erosion control practice factor in the USLE. Information about contour tillage, contour strip cropping, and grass waterways was obtained from aerial photography for Pheasant Branch Creek watershed in Dane County, Wisconsin.
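
    The USLE itself is a simple product of factors, A = R * K * LS * C * P; airphoto interpretation in this study supplied the erosion control practice factor P. The factor values below are invented for illustration, not taken from the Pheasant Branch Creek study.

```python
def usle_soil_loss(R, K, LS, C, P):
    """Predicted average annual soil loss A (tons/acre/year) from the
    rainfall (R), soil erodibility (K), slope length-steepness (LS),
    cover-management (C), and support practice (P) factors."""
    return R * K * LS * C * P

# Hypothetical field: contour strip cropping (P = 0.5, as might be read
# from airphotos) versus up-and-down-slope tillage (P = 1.0).
loss_with_practice = usle_soil_loss(R=150, K=0.3, LS=1.2, C=0.25, P=0.5)
loss_without = usle_soil_loss(R=150, K=0.3, LS=1.2, C=0.25, P=1.0)
```

    Because the equation is multiplicative, halving P halves the predicted loss, which is why the practice factor is worth mapping from imagery.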

  3. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee, consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  4. PRACTICAL STEREOLOGICAL METHODS FOR MORPHOMETRIC CYTOLOGY

    PubMed Central

    Weibel, Ewald R.; Kistler, Gonzague S.; Scherle, Walter F.

    1966-01-01

    Stereological principles provide efficient and reliable tools for the determination of quantitative parameters of tissue structure on sections. Some principles which allow the estimation of volumetric ratios, surface areas, surface-to-volume ratios, thicknesses of tissue or cell sheets, and the number of structures are reviewed and presented in general form; means for their practical application in electron microscopy are outlined. The systematic and statistical errors involved in such measurements are discussed. PMID:5338131
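
    One of the estimators reviewed, volume fraction by point counting (the Delesse principle, V_V = P_P), reduces to a one-line computation: the fraction of test-grid points hitting the phase of interest estimates its volume fraction. The counts below are invented for illustration.

```python
def volume_fraction(hits, total_points):
    """Estimate volume fraction V_V as the point fraction P_P:
    the share of test points landing on the component of interest."""
    if total_points == 0:
        raise ValueError("need at least one test point")
    return hits / total_points

# e.g. 137 of 400 grid points fall on the profiles of interest
vv = volume_fraction(137, 400)
```

    In practice the precision of such an estimate is governed by the total number of points counted, which is why the paper's discussion of statistical error matters.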

  5. Standard practice for digital imaging and communication nondestructive evaluation (DICONDE) for computed radiography (CR) test methods

    E-print Network

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice facilitates the interoperability of computed radiography (CR) imaging and data acquisition equipment by specifying image data transfer and archival storage methods in commonly accepted terms. This practice is intended to be used in conjunction with Practice E2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see http://medical.nema.org), an international standard for image data acquisition, review, storage and archival storage. The goal of Practice E2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE results on any system conforming to the DICONDE standard. Toward that end, Practice E2339 provides a data dictionary and a set of information modules that are applicable to all NDE modalities. This practice supplements Practice E2339 by providing information objec...

  6. A practical method for sensor absolute calibration.

    PubMed

    Meisenholder, G W

    1966-04-01

    This paper describes a method of performing sensor calibrations using an NBS standard of spectral irradiance. The method shown, among others, was used for calibration of the Mariner IV Canopus sensor. Agreement of inflight response to preflight calibrations performed by this technique has been found to be well within 10%. PMID:20048890

  7. Scenistic Methods for Training: Applications and Practice

    ERIC Educational Resources Information Center

    Lyons, Paul R.

    2011-01-01

    Purpose: This paper aims to complement an earlier article (2010) in "Journal of European Industrial Training" in which the description and theory bases of scenistic methods were presented. This paper also offers a description of scenistic methods and information on theory bases. However, the main thrust of this paper is to describe, give suggested…

  8. A Practical Guide to Immunoassay Method Validation.

    PubMed

    Andreasson, Ulf; Perret-Liaudet, Armand; van Waalwijk van Doorn, Linda J C; Blennow, Kaj; Chiasserini, Davide; Engelborghs, Sebastiaan; Fladby, Tormod; Genc, Sermin; Kruse, Niels; Kuiperij, H Bea; Kulic, Luka; Lewczuk, Piotr; Mollenhauer, Brit; Mroczko, Barbara; Parnetti, Lucilla; Vanmechelen, Eugeen; Verbeek, Marcel M; Winblad, Bengt; Zetterberg, Henrik; Koel-Simmelink, Marleen; Teunissen, Charlotte E

    2015-01-01

    Biochemical markers have a central position in the diagnosis and management of patients in clinical medicine, and also in clinical research and drug development, also for brain disorders, such as Alzheimer's disease. The enzyme-linked immunosorbent assay (ELISA) is frequently used for measurement of low-abundance biomarkers. However, the quality of ELISA methods varies, which may introduce both systematic and random errors. This urges the need for more rigorous control of assay performance, regardless of its use in a research setting, in clinical routine, or drug development. The aim of a method validation is to present objective evidence that a method fulfills the requirements for its intended use. Although much has been published on which parameters to investigate in a method validation, less is available on a detailed level on how to perform the corresponding experiments. To remedy this, standard operating procedures (SOPs) with step-by-step instructions for a number of different validation parameters is included in the present work together with a validation report template, which allow for a well-ordered presentation of the results. Even though the SOPs were developed with the intended use for immunochemical methods and to be used for multicenter evaluations, most of them are generic and can be used for other technologies as well. PMID:26347708

  9. A Practical Guide to Immunoassay Method Validation

    PubMed Central

    Andreasson, Ulf; Perret-Liaudet, Armand; van Waalwijk van Doorn, Linda J. C.; Blennow, Kaj; Chiasserini, Davide; Engelborghs, Sebastiaan; Fladby, Tormod; Genc, Sermin; Kruse, Niels; Kuiperij, H. Bea; Kulic, Luka; Lewczuk, Piotr; Mollenhauer, Brit; Mroczko, Barbara; Parnetti, Lucilla; Vanmechelen, Eugeen; Verbeek, Marcel M.; Winblad, Bengt; Zetterberg, Henrik; Koel-Simmelink, Marleen; Teunissen, Charlotte E.

    2015-01-01

    Biochemical markers have a central position in the diagnosis and management of patients in clinical medicine, and also in clinical research and drug development, also for brain disorders, such as Alzheimer’s disease. The enzyme-linked immunosorbent assay (ELISA) is frequently used for measurement of low-abundance biomarkers. However, the quality of ELISA methods varies, which may introduce both systematic and random errors. This urges the need for more rigorous control of assay performance, regardless of its use in a research setting, in clinical routine, or drug development. The aim of a method validation is to present objective evidence that a method fulfills the requirements for its intended use. Although much has been published on which parameters to investigate in a method validation, less is available on a detailed level on how to perform the corresponding experiments. To remedy this, standard operating procedures (SOPs) with step-by-step instructions for a number of different validation parameters is included in the present work together with a validation report template, which allow for a well-ordered presentation of the results. Even though the SOPs were developed with the intended use for immunochemical methods and to be used for multicenter evaluations, most of them are generic and can be used for other technologies as well. PMID:26347708

  10. Practice-Near and Practice-Distant Methods in Human Services Research

    ERIC Educational Resources Information Center

    Froggett, Lynn; Briggs, Stephen

    2012-01-01

    This article discusses practice-near research in human services, a cluster of methodologies that may include thick description, intensive reflexivity, and the study of emotional and relational processes. Such methods aim to get as near as possible to experiences at the relational interface between institutions and the practice field.…

  11. Multi-criteria decision analysis: Limitations, pitfalls, and practical difficulties

    SciTech Connect

    Kujawski, Edouard

    2003-02-01

    The 2002 Winter Olympics women's figure skating competition is used as a case study to illustrate some of the limitations, pitfalls, and practical difficulties of Multi-Criteria Decision Analysis (MCDA). The paper compares several widely used models for synthesizing the multiple attributes into a single aggregate value. The various MCDA models can provide conflicting rankings of the alternatives for a common set of information even under states of certainty. Analysts involved in MCDA need to deal with the following challenging tasks: (1) selecting an appropriate analysis method, and (2) properly interpreting the results. An additional trap is the availability of software tools that implement specific MCDA models that can beguile the user with quantitative scores. These conclusions are independent of the decision domain and they should help foster better MCDA practices in many fields including systems engineering trade studies.
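    The ranking conflicts described in the abstract are easy to reproduce. The sketch below uses hypothetical scores and weights (not data from the paper) and applies two common aggregation models, additive and multiplicative, to the same information, producing opposite winners.

    ```python
    # Two MCDA aggregation models applied to the same (hypothetical) scores.

    def weighted_sum(scores, weights):
        """Additive value model."""
        return sum(w * s for w, s in zip(weights, scores))

    def weighted_product(scores, weights):
        """Multiplicative value model."""
        result = 1.0
        for w, s in zip(weights, scores):
            result *= s ** w
        return result

    weights = [0.5, 0.5]
    alt_a = [0.90, 0.10]  # excellent on one criterion, poor on the other
    alt_b = [0.45, 0.45]  # mediocre on both

    # Additive model: A (0.500) beats B (0.450).
    # Multiplicative model: B (0.450) beats A (0.300).
    print(weighted_sum(alt_a, weights) > weighted_sum(alt_b, weights))        # True
    print(weighted_product(alt_a, weights) > weighted_product(alt_b, weights))  # False
    ```

    The multiplicative model penalizes the unbalanced alternative far more heavily, which is exactly the kind of model-dependent reversal the paper warns analysts to interpret carefully.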

  12. Methods of analysis and quality-assurance practices of the U.S. Geological Survey organic laboratory, Sacramento, California; determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Domagalski, Joseph L.; Kuivila, Kathryn M.

    1994-01-01

    Analytical methods and quality-assurance practices were developed for a study of the fate and transport of pesticides in the Sacramento-San Joaquin Delta and the Sacramento and San Joaquin Rivers. Water samples were filtered to remove suspended particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide, and the pesticides were eluted with three 2-milliliter aliquots of hexane:diethyl ether (1:1). The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for analytes in 1,500-milliliter samples ranged from 0.006 to 0.047 microgram per liter. Recoveries ranged from 47 to 89 percent for 12 pesticides in organic-free, Sacramento River, and San Joaquin River water samples fortified at 0.05 and 0.26 microgram per liter. The method was modified to improve pesticide recovery by reducing the sample volume to 1,000 milliliters. Internal standards were added to improve quantitative precision and accuracy. The analysis also was expanded to include a total of 21 pesticides. The method detection limits for 1,000-milliliter samples ranged from 0.022 to 0.129 microgram per liter. Recoveries ranged from 38 to 128 percent for 21 pesticides in organic-free, Sacramento River, and San Joaquin River water samples fortified at 0.10 and 0.75 microgram per liter.

  13. Practice and Evaluation of Blended Learning with Cross-Cultural Distance Learning in a Foreign Language Class: Using Mix Methods Data Analysis

    ERIC Educational Resources Information Center

    Sugie, Satoko; Mitsugi, Makoto

    2014-01-01

    The Information and Communication Technology (ICT) utilization in Chinese as a "second" foreign language has mainly been focused on Learning Management System (LMS), digital material development, and quantitative analysis of learners' grammatical knowledge. There has been little research that has analyzed the effectiveness of…

  14. Standard practice for digital imaging and communication in nondestructive evaluation (DICONDE) for ultrasonic test methods

    E-print Network

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice facilitates the interoperability of ultrasonic imaging equipment by specifying image data transfer and archival storage methods in commonly accepted terms. This document is intended to be used in conjunction with Practice E 2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E 2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see http://medical.nema.org), an international standard for image data acquisition, review, transfer and archival storage. The goal of Practice E 2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE test results on any system conforming to the DICONDE standard. Toward that end, Practice E 2339 provides a data dictionary and set of information modules that are applicable to all NDE modalities. This practice supplements Practice E 2339 by providing information object definitions, information ...

  15. Best Practices for Business and Systems Analysis in Projects

    E-print Network

    Utrecht, Universiteit

    Best Practices for Business and Systems Analysis in Projects Conforming to Enterprise Architecture. In: Enterprise Modelling. Ralph Foorthuis and Sjaak

  16. Practical guidelines for B-cell receptor repertoire sequencing analysis.

    PubMed

    Yaari, Gur; Kleinstein, Steven H

    2015-01-01

    High-throughput sequencing of B-cell immunoglobulin repertoires is increasingly being applied to gain insights into the adaptive immune response in healthy individuals and in those with a wide range of diseases. Recent applications include the study of autoimmunity, infection, allergy, cancer and aging. As sequencing technologies continue to improve, these repertoire sequencing experiments are producing ever larger datasets, with tens- to hundreds-of-millions of sequences. These data require specialized bioinformatics pipelines to be analyzed effectively. Numerous methods and tools have been developed to handle different steps of the analysis, and integrated software suites have recently been made available. However, the field has yet to converge on a standard pipeline for data processing and analysis. Common file formats for data sharing are also lacking. Here we provide a set of practical guidelines for B-cell receptor repertoire sequencing analysis, starting from raw sequencing reads and proceeding through pre-processing, determination of population structure, and analysis of repertoire properties. These include methods for unique molecular identifiers and sequencing error correction, V(D)J assignment and detection of novel alleles, clonal assignment, lineage tree construction, somatic hypermutation modeling, selection analysis, and analysis of stereotyped or convergent responses. The guidelines presented here highlight the major steps involved in the analysis of B-cell repertoire sequencing data, along with recommendations on how to avoid common pitfalls. PMID:26589402

  17. Applying community-oriented primary care methods in British general practice: a case study.

    PubMed Central

    Iliffe, Steve; Lenihan, Penny; Wallace, Paul; Drennan, Vari; Blanchard, Martin; Harris, Andrew

    2002-01-01

    BACKGROUND: The '75 and over' assessments built into the 1990 contract for general practice have failed to enthuse primary care teams or make a significant impact on the health of older people. Alternative methods for improving the health of older people living at home are being sought. AIM: To test the feasibility of applying community-oriented primary care methodology to a relatively deprived sub-population of older people in a relatively deprived area. DESIGN OF STUDY: A combination of developmental and triangulation approaches to data analysis. SETTING: Four general practices in an inner London borough. METHOD: A community-oriented primary care approach was used to initiate innovative care for older people, supported financially by the health authority and practically by primary care academics. RESULTS: All four practices identified problems needing attention in the older population, developed different projects focused on particular needs among older people, and tested them in practice. Patient and public involvement were central to the design and implementation processes in only one practice. Innovations were sustained in only one practice, but some were adopted by a primary care group and others extended to a wider group of practices by the health authority. CONCLUSION: A modified community-oriented primary care approach can be used in British general practice, and changes can be promoted that are perceived as valuable by planning bodies. However, this methodology may have more impact at primary care trust level than at practice level. PMID:12171223

  18. Complementing Gender Analysis Methods.

    PubMed

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start with a premise that men and women are equal and should be treated equally. These frameworks give emphasis on equal distribution of resources between men and women and believe that this will bring equality which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives such as the power to control resources within social relationships, and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks believe that patriarchy as an institution plays an important role in women's oppression, exploitation, and it is a barrier in their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on proposed equality principle which puts men and women in competing roles. Thus, the real equality will never be achieved. Contrary to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach toward gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but suggests to incorporate the concept and role of social capital, equity, and doing gender in gender analysis which is based on perceived equity principle, putting men and women in complementing roles that may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued based on existing literature and an anecdote of observations made by the author. While criticizing the equality theory, the author offers equity theory in resolving the gender conflict by using the concept of social and psychological capital. PMID:25941756

  19. Toward a practical approach for ergodicity analysis

    NASA Astrophysics Data System (ADS)

    Wang, H.; Wang, C.; Zhao, Y.; Lin, X.; Yu, C.

    2015-09-01

    It is of importance to perform hydrological forecasts using a finite hydrological time series. Most time series analysis approaches presume a data series to be ergodic without justifying this assumption. This paper presents a practical approach to analyzing the mean ergodic property of hydrological processes by means of autocorrelation function evaluation and the Augmented Dickey-Fuller test, a radial basis function neural network, and the definition of mean ergodicity. The mean ergodicity of precipitation processes at the Lanzhou Rain Gauge Station in the Yellow River basin and the Ankang Rain Gauge Station on the Han River, both in China, and at Newberry, MI, USA, is analyzed using the proposed approach. The results indicate that the precipitation of March, July, and August in Lanzhou, and of May, June, and August in Ankang, has mean ergodicity, whereas the precipitation of the other calendar months at these two rain gauge stations does not. The precipitation of February, May, July, and December in Newberry shows the ergodic property, although the precipitation of each month shows a clear increasing or decreasing trend.
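    One ingredient of such an analysis can be sketched simply: a commonly used sufficient condition for mean ergodicity is that the series' autocorrelation decays to zero, so that time averages converge to the ensemble mean. The series below are synthetic, not the paper's precipitation data, and this is only one of the checks the authors combine.

    ```python
    # Sample autocorrelation as a rough mean-ergodicity check (synthetic data).
    import random

    def autocorr(x, lag):
        n = len(x)
        mean = sum(x) / n
        var = sum((v - mean) ** 2 for v in x) / n
        cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag)) / n
        return cov / var

    random.seed(0)
    white = [random.gauss(0, 1) for _ in range(2000)]             # stationary noise
    trend = [0.01 * i + random.gauss(0, 1) for i in range(2000)]  # trending series

    # White noise: autocorrelation near zero -> consistent with mean ergodicity.
    # Trending series: autocorrelation stays high -> mean ergodicity doubtful.
    print(abs(autocorr(white, 50)) < 0.15, autocorr(trend, 50) > 0.5)  # -> True True
    ```

    In practice one would evaluate the autocorrelation over a range of lags and combine it with a stationarity test such as the Augmented Dickey-Fuller test mentioned in the abstract.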

  20. Traditional Methods for Mineral Analysis

    NASA Astrophysics Data System (ADS)

    Ward, Robert E.; Carpenter, Charles E.

    This chapter describes traditional methods for analysis of minerals involving titrimetric and colorimetric procedures, and the use of ion selective electrodes. Other traditional methods of mineral analysis include gravimetric titration (i.e., insoluble forms of minerals are precipitated, rinsed, dried, and weighed) and redox reactions (i.e., the mineral is part of an oxidation-reduction reaction, and the product is quantitated). However, these latter two methods will not be covered because they currently see little use in the food industry. The traditional methods that will be described have maintained widespread usage in the food industry despite the development of more modern instrumentation such as atomic absorption spectroscopy and inductively coupled plasma-atomic emission spectroscopy (Chap. 24). Traditional methods generally require chemicals and equipment that are routinely available in an analytical laboratory and are within the experience of most laboratory technicians. Additionally, traditional methods often form the basis for rapid analysis kits (e.g., Quantab® for salt determination) that are increasingly in demand. Procedures for analysis of minerals of major nutritional or food processing concern are used for illustrative purposes. For additional examples of traditional methods refer to references (1-6). Slight modifications of these traditional methods are often needed for specific foodstuffs to minimize interferences or to fall within the range of analytical performance. For analytical requirements for specific foods, see the Official Methods of Analysis of AOAC International (5) and related official methods (6).

  1. Genre Analysis, ESP and Professional Practice

    ERIC Educational Resources Information Center

    Bhatia, Vijay K.

    2008-01-01

    Studies of professional genres and professional practices are invariably seen as complementing each other, in that they not only influence each other but are often co-constructed in specific professional contexts. However, professional genres have often been analyzed in isolation, leaving the study of professional practice almost completely out,…

  2. A PRACTICAL METHOD OF SPONGE CULTURE By H. F. Moore

    E-print Network

    A Practical Method of Sponge Culture. By H. F. Moore, Scientific Assistant, United States Bureau of Fisheries. Growing sponges from eggs or cuttings. B. B. F. 1908-35, p. 545. Contents include: conditions and needs of the sponge fisheries (p. 547); previous experiments in sponge culture.

  3. A Practical Method for Estimation of Point Light-Sources

    E-print Network

    Martin, Ralph R.

    A Practical Method for Estimation of Point Light-Sources. Martin Weber and Roberto Cipolla introduce a general model for point light-sources and show how the parameters of a source at finite distance can be estimated. A light-source can be seen as a map that supplies each point in space with a radiance value.

  4. COURSE DESCRIPTION Spring 2014 COURSE NAME Data Analytics: Introduction, Methods and Practical Approaches

    E-print Network

    Toronto, University of

    …has given birth to some new areas of data analysis. The terms "predictive analytics" and "big data"… The course introduces the notions of "Data Analytics" and provides an overview and hands-on experience of tools that perform…

  5. A new practice analysis of hand therapy.

    PubMed

    Muenzen, Patricia M; Kasch, Mary C; Greenberg, Sandra; Fullenwider, Lynnlee; Taylor, Patricia A; Dimick, Mary P

    2002-01-01

    The Hand Therapy Certification Commission, Inc. (HTCC) conducted a role delineation in 2001 to characterize current practice in the profession of hand therapy. Building upon previous HTCC studies of practice (i.e., Chai, Dimick & Kasch, 1987; Roth, Dimick, Kasch, Fullenwider & Taylor, 1996), subject matter experts identified the clinical behaviors, knowledge, and technical skills needed by hand therapists. A large scale survey was conducted with therapists across the United States and Canada who rated the clinical behaviors, knowledge, and technical skills in terms of their relevance to practice, and provided information about their own patient populations. A high survey return rate (72%) was indicative of the professional commitment of CHTs to their profession. Results of the survey are discussed and practice trends are identified. A new test outline for the Hand Therapy Certification Examination was created based on the results of the survey, and the 1987 Definition and Scope of Hand Therapy was revised. PMID:12206324

  6. [Good Practice of Secondary Data Analysis (GPS): guidelines and recommendations].

    PubMed

    Swart, E; Gothe, H; Geyer, S; Jaunzeme, J; Maier, B; Grobe, T G; Ihle, P

    2015-02-01

    In 2005, the Working Group for the Survey and Utilisation of Secondary Data (AGENS) of the German Society for Social Medicine and Prevention (DGSMP) and the German Society for Epidemiology (DGEpi) first published "Good Practice in Secondary Data Analysis (GPS)" formulating a standard for conducting secondary data analyses. GPS is intended as a guide for planning and conducting analyses and can provide a basis for contracts between data owners. The domain of these guidelines does not only include data routinely gathered by statutory health insurance funds and further statutory social insurance funds, but all forms of secondary data. The 11 guidelines range from ethical principles and study planning through quality assurance measures and data preparation to data privacy, contractual conditions and responsible communication of analytical results. They are complemented by explanations and practical assistance in the form of recommendations. GPS targets all persons directing their attention to secondary data, their analysis and interpretation from a scientific point of view and by employing scientific methods. This includes data owners. Furthermore, GPS is suitable to assess scientific publications regarding their quality by authors, referees and readers. In 2008, the first version of GPS was evaluated and revised by members of AGENS and the Epidemiological Methods Working Group of DGEpi, DGSMP and GMDS including other epidemiological experts and had then been accredited as implementation regulations of Good Epidemiological Practice (GEP). Since 2012, this third version of GPS is on hand and available for downloading from the DGEpi website at no charge. Especially linguistic specifications have been integrated into the current revision; its internal consistency was increased. With regards to contents, further recommendations concerning the guideline on data privacy have been added. 
On the basis of future developments in science and data privacy, further revisions will follow. PMID:25622207

  7. Methods for genetic linkage analysis using trisomies

    SciTech Connect

    Feingold, E.; Lamb, N.E.; Sherman, S.L.

    1995-02-01

    Certain genetic disorders are rare in the general population, but more common in individuals with specific trisomies. Examples of this include leukemia and duodenal atresia in trisomy 21. This paper presents a linkage analysis method for using trisomic individuals to map genes for such traits. It is based on a very general gene-specific dosage model that posits that the trait is caused by specific effects of different alleles at one or a few loci and that duplicate copies of "susceptibility" alleles inherited from the nondisjoining parent give increased likelihood of having the trait. Our mapping method is similar to identity-by-descent-based mapping methods using affected relative pairs and also to methods for mapping recessive traits using inbred individuals by looking for markers with greater than expected homozygosity by descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected homozygosity in the chromosomes inherited from the nondisjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the trait gene, a confidence interval for that distance, and methods for computing power and sample sizes. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers and how to test candidate genes. 20 refs., 5 figs., 1 tab.

  8. A practical method of estimating energy expenditure during tennis play.

    PubMed

    Novas, A M P; Rowbottom, D G; Jenkins, D G

    2003-03-01

    This study aimed to develop a practical method of estimating energy expenditure (EE) during tennis. Twenty-four elite female tennis players first completed a tennis-specific graded test in which five different intensity levels were applied randomly. Each intensity level was intended to simulate a "game" of singles tennis and comprised six 14 s periods of activity alternated with 20 s of active rest. Oxygen consumption (VO2) and heart rate (HR) were measured continuously and each player's rating of perceived exertion (RPE) was recorded at the end of each intensity level. The rate of energy expenditure (EE(VO2)) during the test was calculated using the sum of VO2 during play and the 'O2 debt' during recovery, divided by the duration of the activity. There were significant individual linear relationships between EE(VO2) and RPE, and between EE(VO2) and HR (r >= 0.89 and r >= 0.93; p < 0.05). On a second occasion, six players completed a 60-min singles tennis match during which VO2, HR and RPE were recorded; EE(VO2) was compared with EE predicted from the previously derived RPE and HR regression equations. Analysis found that EE(VO2) was overestimated by EE(RPE) (92 +/- 76 kJ x h(-1)) and EE(HR) (435 +/- 678 kJ x h(-1)), but the error of estimation for EE(RPE) (t = -3.01; p = 0.03) was less than 5%, whereas for EE(HR) the error was 20.7%. The results of the study show that RPE can be used to estimate the energetic cost of playing tennis. PMID:12801209
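    The regression step can be illustrated with an ordinary least-squares fit. The RPE and EE values below are invented for demonstration; the study derived a separate equation per player, so this is a sketch of the idea, not the authors' actual coefficients.

    ```python
    # Fit a simple linear regression EE = slope * RPE + intercept (hypothetical data).
    def linear_fit(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        return slope, my - slope * mx

    rpe = [8, 10, 12, 14, 16]            # hypothetical RPE at each intensity level
    ee = [900, 1150, 1400, 1650, 1900]   # hypothetical EE in kJ/h

    slope, intercept = linear_fit(rpe, ee)
    predicted = slope * 13 + intercept   # estimate EE for a match played at RPE 13
    print(round(predicted))              # -> 1525
    ```

    During a match, only the RPE would need to be recorded; the individual's fitted equation then converts it to an energy-expenditure estimate.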

  9. [Complaint analysis derived from surgical practice].

    PubMed

    Fajardo-Dolci, Germán; Rodríguez-Suárez, Francisco Javier; Campos-Castolo, Esther Mahuina; Carrillo-Jaimes, Arturo; Zavala-Suárez, Etelvina; Aguirre-Gas, Héctor Gerardo

    2009-01-01

    This study reports on the analysis of medical complaints presented to the National Commission on Medical Arbitration (Comisión Nacional de Arbitraje Médico, CONAMED) between June 1996 and December 2007 to determine its magnitude and to identify the causes of safety problems in medical care. Out of 182,407 complaints presented to CONAMED, 87% were resolved by the Office of Orientation and Management. The remaining 18,443 complaints were presented to the Council Directorate. Of those cases, 48% were resolved by an agreement between the complainants and the physicians, 31% were not resolved by this method, and 3% were irresolute complaints. The highest frequency of complaints was registered in the Federal District (Distrito Federal) and the State of México (Estado de México), mainly corresponding to social security institutions and private hospitals. Among the nine most frequently involved specialties, six were surgical specialties. Malpractice was identified in 25% of all cases. The principal demands of those making complaints were the refunding of expenses in patient medical care (51%) and indemnification (40%) and, in those, the average amount of payments was 4.6 times greater. Due to the incidence of medical complaints, it was reasonable to investigate the causes and to take preventive and corrective actions required for its decrease. It was proposed to the Mexican Academy of Surgery that this organization should use their educational leadership and assume the vanguard in the dissemination and promotion of the WHO plan "Safe Surgery Saves Lives" and the implementation in Mexico of the "Surgical Safety Checklist." PMID:19671273

  10. Practical Nursing. Ohio's Competency Analysis Profile.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Vocational Instructional Materials Lab.

    Developed through a modified DACUM (Developing a Curriculum) process involving business, industry, labor, and community agency representatives in Ohio, this document is a comprehensive and verified employer competency profile for practical nursing. The list contains units (with and without subunits), competencies, and competency builders that…

  11. An Analysis of Optometric Practices in Rural Alabama.

    ERIC Educational Resources Information Center

    Wild, Bradford W.; Maisiak, Richard

    1981-01-01

    Twenty-nine Alabama optometric practices were studied using an optometrist survey, one-week patient flow analysis, and audit of patient records. Results indicate some special facets of the rural practices that may require a different kind of educational preparation. (MSE)

  12. Risk analysis in the valuation of medical practices.

    PubMed

    Nanda, S; Miller, A

    1996-01-01

    Risk analysis techniques used in valuing medical practices are discussed. Employing these techniques provides superior information to the analyst than using a single estimate for uncertain variables. Examples of scenario analysis, sensitivity analysis, and simulation are presented. These tools can be easily implemented with spreadsheet software. PMID:8922962

  13. Intelligent Best Practices Analysis Shahab D. Mohaghegh, Ph.D.

    E-print Network

    Mohaghegh, Shahab

    …of intelligent systems that includes artificial neural networks, genetic algorithms… can be used in business and engineering decision making. Many companies in the oil and gas industry have been…

  14. Landscape analysis: Theoretical considerations and practical needs

    USGS Publications Warehouse

    Godfrey, A.E.; Cleaves, E.T.

    1991-01-01

    Numerous systems of land classification have been proposed. Most have led directly to or have been driven by an author's philosophy of earth-forming processes. However, the practical need of classifying land for planning and management purposes requires that a system lead to predictions of the results of management activities. We propose a landscape classification system composed of 11 units, from realm (a continental mass) to feature (a splash impression). The classification concerns physical aspects rather than economic or social factors, and aims to merge land inventory with dynamic processes. Landscape units are organized using a hierarchical system so that information may be assembled and communicated at different levels of scale and abstraction. Our classification uses a geomorphic systems approach that emphasizes the geologic-geomorphic attributes of the units. Realm, major division, province, and section are formulated by subdividing large units into smaller ones. For the larger units we have followed Fenneman's delineations, which are well established in the North American literature. Areas and districts are aggregated into regions and regions into sections. Units smaller than areas have, in practice, been subdivided into zones and smaller units if required. We developed the theoretical framework embodied in this classification from practical applications aimed at land use planning and land management in Maryland (eastern Piedmont Province near Baltimore) and Utah (eastern Uinta Mountains). © 1991 Springer-Verlag New York Inc.

  15. Practical methods to improve the development of computational software

    SciTech Connect

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-07-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)

  16. A Practical Introduction to Analysis and Synthesis

    ERIC Educational Resources Information Center

    Williams, R. D.; Cosart, W. P.

    1976-01-01

    Discusses an introductory chemical engineering course in which mathematical models are used to analyze experimental data. Concepts illustrated include dimensional analysis, scaleup, heat transfer, and energy conservation. (MLH)

  17. Seismic Assessment of Structures by a Practice-Oriented Method

    NASA Astrophysics Data System (ADS)

    Fajfar, P.

    A relatively simple seismic analysis technique based on the pushover analysis of a multi-degree-of-freedom model and the response spectrum analysis of an equivalent single-degree-of-freedom system, called the N2 method, has been developed at the University of Ljubljana and implemented in the European standard Eurocode 8. The method is formulated in the acceleration-displacement format, which enables a visual interpretation of the procedure and of the relations between the basic quantities controlling the seismic response. Its basic variant was restricted to planar structures. Recently the applicability of the method has been extended to plan-asymmetric buildings, which require a 3D structural model. In the paper, the N2 method is summarized and applied to two test examples.

  18. Method of multivariate spectral analysis

    DOEpatents

    Keenan, Michael R.; Kotula, Paul G.

    2004-01-06

    A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = CS^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
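    The constrained alternating least-squares idea can be sketched minimally. The code below handles only the rank-1 case (one component) with nonnegativity enforced by clipping, on synthetic data; the patented method is considerably more general, and the variable names are the author's own, not the patent's.

    ```python
    # Rank-1 nonnegative alternating least squares: factor D ~= c * s^T.
    def rank1_als(D, iters=200):
        m, n = len(D), len(D[0])
        c = [1.0] * m
        s = [1.0] * n
        for _ in range(iters):
            # Fix s, solve for c in least squares, then clip to nonnegative.
            ss = sum(v * v for v in s)
            c = [max(0.0, sum(D[i][j] * s[j] for j in range(n)) / ss)
                 for i in range(m)]
            # Fix c, solve for s, then clip to nonnegative.
            cc = sum(v * v for v in c)
            s = [max(0.0, sum(D[i][j] * c[i] for i in range(m)) / cc)
                 for j in range(n)]
        return c, s

    # Synthetic "spectral" data: one concentration profile times one spectrum.
    true_c = [1.0, 2.0, 3.0]
    true_s = [0.5, 1.0, 0.25, 0.75]
    D = [[ci * sj for sj in true_s] for ci in true_c]

    c, s = rank1_als(D)
    recon = [[c[i] * s[j] for j in range(len(s))] for i in range(len(c))]
    err = max(abs(D[i][j] - recon[i][j]) for i in range(3) for j in range(4))
    print(err < 1e-6)  # -> True
    ```

    With more components, each alternating step becomes a constrained least-squares solve for a whole factor matrix, but the fix-one-factor-solve-the-other structure is the same.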

  19. Methods for genetic linkage analysis using trisomies

    SciTech Connect

    Feingold, E.; Lamb, N.E.; Sherman, S.L.

    1994-09-01

Certain genetic disorders (e.g. congenital cataracts, duodenal atresia) are rare in the general population but more common in people with Down's syndrome. We present a method for using individuals with trisomy 21 to map genes for such traits. Our methods are analogous to methods for mapping autosomal dominant traits using affected relative pairs by looking for markers with greater than expected identity-by-descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected reduction to homozygosity in the chromosomes inherited from the non-disjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the gene, a confidence interval for that distance, and methods for computing power and sample sizes. The methods are described in the context of a gene-dosage model for the etiology of the disorder, but can be extended to other models. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers, how to test candidate genes, and how to handle the effect of reduced recombination associated with maternal meiosis I non-disjunction.
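The test for excess reduction to homozygosity can be sketched as a one-sided exact binomial test. The null proportion p0 (the chance of reduction to homozygosity at an unlinked marker) is an input the analyst must supply, and this simple form ignores the partially informative markers and recombination effects that the authors address:

```python
from math import comb

def excess_homozygosity_pvalue(n_reduced, n_total, p0):
    """One-sided exact binomial p-value: probability of observing at least
    n_reduced reductions to homozygosity among n_total trisomic individuals
    when the null (no-linkage) proportion is p0."""
    return sum(comb(n_total, k) * p0**k * (1 - p0)**(n_total - k)
               for k in range(n_reduced, n_total + 1))
```

For example, 8 reductions out of 10 individuals under a hypothetical p0 = 0.5 gives a p-value of about 0.055, just above a 0.05 threshold.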

  20. Tetrad Analysis: A Practical Demonstration Using Simple Models.

    ERIC Educational Resources Information Center

    Gow, Mary M.; Nicholl, Desmond S. T.

    1988-01-01

    Uses simple models to illustrate the principles of this genetic method of mapping gene loci. Stresses that this system enables a practical approach to be used with students who experience difficulty in understanding the concepts involved. (CW)

  1. Method of photon spectral analysis

    DOEpatents

    Gehrke, Robert J. (Idaho Falls, ID); Putnam, Marie H. (Idaho Falls, ID); Killian, E. Wayne (Idaho Falls, ID); Helmer, Richard G. (Idaho Falls, ID); Kynaston, Ronnie L. (Blackfoot, ID); Goodwin, Scott G. (Idaho Falls, ID); Johnson, Larry O. (Pocatello, ID)

    1993-01-01

A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray-emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV) as well as high-energy γ rays (>1 MeV). An 8192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.

  2. Method of photon spectral analysis

    DOEpatents

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray-emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) as well as high-energy γ rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.

  3. A practical method to evaluate radiofrequency exposure of mast workers.

    PubMed

    Alanko, Tommi; Hietanen, Maila

    2008-01-01

    Assessment of occupational exposure to radiofrequency (RF) fields in telecommunication transmitter masts is a challenging task. For conventional field strength measurements using manually operated instruments, it is difficult to document the locations of measurements while climbing up a mast. Logging RF dosemeters worn by the workers, on the other hand, do not give any information about the location of the exposure. In this study, a practical method was developed and applied to assess mast workers' exposure to RF fields and the corresponding location. This method uses a logging dosemeter for personal RF exposure evaluation and two logging barometers to determine the corresponding height of the worker's position on the mast. The procedure is not intended to be used for compliance assessments, but to indicate locations where stricter assessments are needed. The applicability of the method is demonstrated by making measurements in a TV and radio transmitting mast. PMID:19054796

  4. Comparative Lifecycle Energy Analysis: Theory and Practice.

    ERIC Educational Resources Information Center

    Morris, Jeffrey; Canzoneri, Diana

    1992-01-01

    Explores the position that more energy is conserved through recycling secondary materials than is generated from municipal solid waste incineration. Discusses one component of a lifecycle analysis--a comparison of energy requirements for manufacturing competing products. Includes methodological issues, energy cost estimates, and difficulties…

  5. Exhaled breath analysis: physical methods, instruments, and medical diagnostics

    NASA Astrophysics Data System (ADS)

    Vaks, V. L.; Domracheva, E. G.; Sobakinskaya, E. A.; Chernyaeva, M. B.

    2014-07-01

    This paper reviews the analysis of exhaled breath, a rapidly growing field in noninvasive medical diagnostics that lies at the intersection of physics, chemistry, and medicine. Current data are presented on gas markers in human breath and their relation to human diseases. Various physical methods for breath analysis are described. It is shown how measurement precision and data volume requirements have stimulated technological developments and identified the problems that have to be solved to put this method into clinical practice.

  6. Standard practice for digital imaging and communication in nondestructive evaluation (DICONDE) for digital radiographic (DR) test methods

    E-print Network

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice facilitates the interoperability of digital X-ray imaging equipment by specifying image data transfer and archival methods in commonly accepted terms. This document is intended to be used in conjunction with Practice E2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see http://medical.nema.org), an international standard for image data acquisition, review, storage and archival storage. The goal of Practice E2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE results on any system conforming to the DICONDE standard. Toward that end, Practice E2339 provides a data dictionary and a set of information modules that are applicable to all NDE modalities. This practice supplements Practice E2339 by providing information object definitions, information modules and a ...

  7. Standard practice for digital imaging and communication in nondestructive evaluation (DICONDE) for X-ray computed tomography (CT) test methods

    E-print Network

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice facilitates the interoperability of X-ray computed tomography (CT) imaging equipment by specifying image data transfer and archival storage methods in commonly accepted terms. This document is intended to be used in conjunction with Practice E2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see http://medical.nema.org), an international standard for image data acquisition, review, storage and archival storage. The goal of Practice E2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE test results on any system conforming to the DICONDE standard. Toward that end, Practice E2339 provides a data dictionary and a set of information modules that are applicable to all NDE modalities. This practice supplements Practice E2339 by providing information object definitio...

  8. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...articles involving unfair methods of competition or practices. 12.39 Section...SPECIAL CLASSES OF MERCHANDISE Unfair Competition § 12.39 Imported articles involving unfair methods of competition or practices. (a)...

  9. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...articles involving unfair methods of competition or practices. 12.39 Section...SPECIAL CLASSES OF MERCHANDISE Unfair Competition § 12.39 Imported articles involving unfair methods of competition or practices. (a)...

  10. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...articles involving unfair methods of competition or practices. 12.39 Section...SPECIAL CLASSES OF MERCHANDISE Unfair Competition § 12.39 Imported articles involving unfair methods of competition or practices. (a)...

  11. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...articles involving unfair methods of competition or practices. 12.39 Section...SPECIAL CLASSES OF MERCHANDISE Unfair Competition § 12.39 Imported articles involving unfair methods of competition or practices. (a)...

  12. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...articles involving unfair methods of competition or practices. 12.39 Section...SPECIAL CLASSES OF MERCHANDISE Unfair Competition § 12.39 Imported articles involving unfair methods of competition or practices. (a)...

  13. Voltametric analysis apparatus and method

    DOEpatents

    Almon, Amy C. (410 Waverly Dr., Augusta, GA 30909)

    1993-01-01

    An apparatus and method for electrochemical analysis of elements in solution. An auxiliary electrode 14, a reference electrode 18, and five working electrodes 20, 22, 26, 28, and 30 are positioned in a container 12 containing a sample solution 34. The working electrodes are spaced apart evenly from each other and auxiliary electrode 14 to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between auxiliary electrode 14 and each of the working electrodes 20, 22, 26, 28, and 30. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in sample solution 34 and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  14. Exergy method of energy systems analysis

    SciTech Connect

    Ahern, J.E.

    1980-01-01

This book presents a practical approach for applying the second law of thermodynamics to energy-related systems. The approach has been used extensively in the Soviet Union and in Europe. (The appendix includes translations of Russian papers that discuss applications of the approach.) Why this concern for the second law? Energy-related systems commonly have been designed and evaluated using first-law heat balances. But such calculations neglect the variation in the quality of the energy throughout a system. As a result, losses and inefficiencies are not evaluated realistically. Therefore, effective energy-conservation measures and improvements in system performance and efficiency cannot be accomplished by the heat-balance method alone. Something more is required. In the method for evaluating an energy-related system described in this book, the quality of the energy at any point in the system is determined. This value, called exergy, is the actual amount of work that can be performed by a fluid or mass relative to a zero-exergy reference state. The exergy method shows the real changes in the work of the system, process by process. Losses then can be assessed after the useful work has been accounted for. Data already available from a conventional first-law heat balance provide input to exergy calculations. The method of analysis is described by a system analysis approach - used for preliminary evaluation and for single analyses using calculators or computers without iterative cases - and by a process block approach that is advantageous for computer analysis of system designs when parametric evaluations are made.
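As a compact statement of the quantity the book works with, the flow (physical) exergy of a stream relative to the dead state $(h_0, s_0, T_0)$, neglecting kinetic and potential terms, is

$$ e = (h - h_0) - T_0\,(s - s_0), $$

and the exergy destroyed in any process follows from the Gouy-Stodola relation $\dot{E}_{dest} = T_0\,\dot{S}_{gen}$, which is what makes the process-by-process loss accounting described above possible from ordinary heat-balance data.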

  15. Analysis of the media literacy theories and practices of the Pauline Centre for Media Studies and the Presbyterian Media Mission 

    E-print Network

    Tinker, Andrew

    2010-01-01

    The purpose of this dissertation is to reflect theologically upon and analyze the development of media literacy practices for the Christian church, through an analysis of the theories and the primary methods of the Pauline ...

  16. SAR/QSAR methods in public health practice

    SciTech Connect

Demchuk, Eugene; Ruiz, Patricia; Chou, Selene; Fowler, Bruce A.

    2011-07-15

    Methods of (Quantitative) Structure-Activity Relationship ((Q)SAR) modeling play an important and active role in ATSDR programs in support of the Agency mission to protect human populations from exposure to environmental contaminants. They are used for cross-chemical extrapolation to complement the traditional toxicological approach when chemical-specific information is unavailable. SAR and QSAR methods are used to investigate adverse health effects and exposure levels, bioavailability, and pharmacokinetic properties of hazardous chemical compounds. They are applied as a part of an integrated systematic approach in the development of Health Guidance Values (HGVs), such as ATSDR Minimal Risk Levels, which are used to protect populations exposed to toxic chemicals at hazardous waste sites. (Q)SAR analyses are incorporated into ATSDR documents (such as the toxicological profiles and chemical-specific health consultations) to support environmental health assessments, prioritization of environmental chemical hazards, and to improve study design, when filling the priority data needs (PDNs) as mandated by Congress, in instances when experimental information is insufficient. These cases are illustrated by several examples, which explain how ATSDR applies (Q)SAR methods in public health practice.

  17. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Method of sample analysis. 58.245 Section 58.245 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF...

  18. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    ERIC Educational Resources Information Center

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we…

  19. [Practice analysis: culture shock and adaptation at work].

    PubMed

    Philippe, Séverine; Didry, Pascale

    2015-12-01

    Constructed as a practice analysis, this personal account presents the reflection undertaken by a student on placement in Ireland thanks to the Erasmus programme. She describes in detail the stages of her adaptation in a hospital setting which is considerably different to her usual environment. PMID:26654501

  20. Flow analysis system and method

    NASA Technical Reports Server (NTRS)

    Hill, Wayne S. (Inventor); Barck, Bruce N. (Inventor)

    1998-01-01

    A non-invasive flow analysis system and method wherein a sensor, such as an acoustic sensor, is coupled to a conduit for transmitting a signal which varies depending on the characteristics of the flow in the conduit. The signal is amplified and there is a filter, responsive to the sensor signal, and tuned to pass a narrow band of frequencies proximate the resonant frequency of the sensor. A demodulator generates an amplitude envelope of the filtered signal and a number of flow indicator quantities are calculated based on variations in amplitude of the amplitude envelope. A neural network, or its equivalent, is then used to determine the flow rate of the flow in the conduit based on the flow indicator quantities.
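A minimal sketch of the envelope-demodulation step described above, using an FFT-based Hilbert transform to obtain the amplitude envelope of the band-passed sensor signal (the sensor coupling, tuned filter, flow-indicator quantities, and neural-network stages are omitted):

```python
import numpy as np

def amplitude_envelope(x):
    """Amplitude envelope of a real signal via the analytic signal,
    computed with an FFT-based Hilbert transform."""
    n = len(x)
    X = np.fft.fft(x)
    # Build the analytic-signal weights: keep DC, double positive frequencies,
    # zero out negative frequencies.
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(X * h))
```

Flow-indicator quantities such as the mean, variance, or spectral content of this envelope could then be fed to the downstream regression stage.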

  1. Practical evaluation of Mung bean seed pasteurization method in Japan.

    PubMed

    Bari, M L; Enomoto, K; Nei, D; Kawamoto, S

    2010-04-01

    The majority of the seed sprout-related outbreaks have been associated with Escherichia coli O157:H7 and Salmonella. Therefore, an effective method for inactivating these organisms on the seeds before sprouting is needed. The current pasteurization method for mung beans in Japan (hot water treatment at 85 degrees C for 10 s) was more effective for disinfecting inoculated E. coli O157:H7, Salmonella, and nonpathogenic E. coli on mung bean seeds than was the calcium hypochlorite treatment (20,000 ppm for 20 min) recommended by the U.S. Food and Drug Administration. Hot water treatment at 85 degrees C for 40 s followed by dipping in cold water for 30 s and soaking in chlorine water (2,000 ppm) for 2 h reduced the pathogens to undetectable levels, and no viable pathogens were found in a 25-g enrichment culture and during the sprouting process. Practical tests using a working pasteurization machine with nonpathogenic E. coli as a surrogate produced similar results. The harvest yield of the treated seed was within the acceptable range. These treatments could be a viable alternative to the presently recommended 20,000-ppm chlorine treatment for mung bean seeds. PMID:20377967

  2. Hybrid methods for rotordynamic analysis

    NASA Technical Reports Server (NTRS)

    Noah, Sherif T.

    1986-01-01

Effective procedures are presented for the response analysis of the Space Shuttle Main Engine turbopumps under transient loading conditions. Of particular concern is the determination of the nonlinear response of the systems to rotor imbalance in the presence of bearing clearances. The proposed procedures take advantage of the nonlinearities involved being localized at only a few rotor/housing coupling joints. The methods include those based on integral formulations for the incremental solutions involving the transition matrices of the rotor and housing. Alternatively, a convolutional representation of the housing displacements at the coupling points is proposed which would allow performing the transient analysis on a reduced model of the housing. The integral approach is applied to small dynamical models to demonstrate the efficiency of the approach. For purposes of assessing the numerical integration results for the nonlinear rotor/housing systems, a numerical harmonic balance procedure is developed to enable determining all possible harmonic, subharmonic, and nonperiodic solutions of the systems. A brief account of the Fourier approach is presented as applied to a two-degree-of-freedom rotor-support system.

  3. Practicing oncology in provincial Mexico: a narrative analysis.

    PubMed

    Hunt, L M

    1994-03-01

This paper examines the discourse of oncologists treating cancer in a provincial capital of southern Mexico. Based on an analysis of both formal interviews and observations of everyday clinical practice, it examines a set of narrative themes they used to maintain a sense of professionalism and possibility as they endeavored to apply a highly technologically dependent biomedical model in a resource-poor context. They moved between coexisting narrative frameworks as they addressed their formidable problems of translating between theory and practice. In a biomedical narrative frame, they drew on biomedical theory to produce a model of cellular dysfunction and of clinical intervention. However, the limited availability of diagnostic and treatment techniques, and patients' inability or unwillingness to comply, presented serious constraints to the application of this model. They used a practical narrative frame to discuss the socio-economic issues they understood to be underlying these limitations to their clinical practice. They did not experience the incongruity between theory and practice as a continual challenge to their biomedical model, nor to their professional competency. Instead, through a reconciling narrative frame, they mediated this conflict. In this frame, they drew on culturally specific concepts of moral rightness and order to produce accounts that minimized the problem, exculpated themselves and cast blame for failed diagnosis and treatment. By invoking these multiple, coexisting narrative themes, the oncologists sustained an open vision of their work in which deficiencies and impotency were vindicated, and did not stand in the way of clinical practice. PMID:8184335

  4. Analysis Methods of Magnesium Chips

    NASA Astrophysics Data System (ADS)

    Ohmann, Sven; Ditze, André; Scharf, Christiane

    2015-09-01

The quality of recycled magnesium from chips depends strongly on their exposure to inorganic and organic impurities that are added during the production processes. Different kinds of magnesium chips from these processes were analyzed by several methods. In addition, the accuracy and effectiveness of the methods are discussed. The results show that the chips belong either to the AZ91, AZ31, AM50/60, or AJ62 alloy. Some kinds of chips show deviations from the above-mentioned standards. Different impurities result mainly from transition metals and lime. The water and oil content does not exceed 25%, and the chip size is no more than 4 mm in diameter. The sieve analysis shows good results for oily and wet chips. The determination of oil and water shows better results for the application of a Soxhlet compared with the addition of lime and vacuum distillation. The most accurate values for the determination of water and oil are obtained by drying at 110°C (for water) and washing with acetone (for oil) by hand.

  5. Analysis Methods of Magnesium Chips

    NASA Astrophysics Data System (ADS)

    Ohmann, Sven; Ditze, André; Scharf, Christiane

    2015-11-01

The quality of recycled magnesium from chips depends strongly on their exposure to inorganic and organic impurities that are added during the production processes. Different kinds of magnesium chips from these processes were analyzed by several methods. In addition, the accuracy and effectiveness of the methods are discussed. The results show that the chips belong either to the AZ91, AZ31, AM50/60, or AJ62 alloy. Some kinds of chips show deviations from the above-mentioned standards. Different impurities result mainly from transition metals and lime. The water and oil content does not exceed 25%, and the chip size is no more than 4 mm in diameter. The sieve analysis shows good results for oily and wet chips. The determination of oil and water shows better results for the application of a Soxhlet compared with the addition of lime and vacuum distillation. The most accurate values for the determination of water and oil are obtained by drying at 110°C (for water) and washing with acetone (for oil) by hand.

  6. Perceptions of Weight and Health Practices in Hispanic Children: A Mixed-Methods Study

    PubMed Central

    Foster, Byron Alexander; Hale, Daniel

    2015-01-01

    Background. Perception of weight by parents of obese children may be associated with willingness to engage in behavior change. The relationship between parents' perception of their child's weight and their health beliefs and practices is poorly understood, especially among the Hispanic population which experiences disparities in childhood obesity. This study sought to explore the relationship between perceptions of weight and health beliefs and practices in a Hispanic population. Methods. A cross-sectional, mixed-methods approach was used with semistructured interviews conducted with parent-child (2–5 years old) dyads in a primarily Hispanic, low-income population. Parents were queried on their perceptions of their child's health, health practices, activities, behaviors, and beliefs. A grounded theory approach was used to analyze participants' discussion of health practices and behaviors. Results. Forty parent-child dyads completed the interview. Most (58%) of the parents of overweight and obese children misclassified their child's weight status. The qualitative analysis showed that accurate perception of weight was associated with internal motivation and more concrete ideas of what healthy meant for their child. Conclusions. The qualitative data suggest there may be populations at different stages of readiness for change among parents of overweight and obese children, incorporating this understanding should be considered for interventions. PMID:26379715

  7. Adapting Job Analysis Methodology to Improve Evaluation Practice

    ERIC Educational Resources Information Center

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  8. Analysis of 1263 deaths in four general practices.

    PubMed Central

    Holden, J; O'Donnell, S; Brindley, J; Miles, L

    1998-01-01

    BACKGROUND: The death of a patient is a significant event that occurs often enough in general practice for it to have the potential to tell us much about the care we provide. There are few large series in the literature and we still know little about the collaborative use of this outcome measure. AIM: To determine the pattern of deaths and potentially preventable factors in our practices. METHOD: We completed a standard data collection form after each death in four general practices over a 40-month period. The results were discussed at quarterly meetings. RESULTS: A total of 1263 deaths occurred among our registered patients during the period of the audit. Preventable factors contributing to deaths were considered to be attributable to: patients (40%): mainly cigarette smoking, poor compliance, and alcohol problems; general practice teams (5%): mainly delayed referral, diagnosis and treatment, and failure to prescribe aspirin to patients with vascular disease; hospitals (6%): mainly delayed diagnosis and perceived treatment problems; the environment (3%): mainly falls, principally resulting in fractured neck of femur. CONCLUSION: A simple audit of deaths along the lines that we describe gives important information about the care provided by general practice teams and those in hospital practice. It has both educational value and is a source of ideas for service improvement and further study, particularly when carried out over several years. PMID:9800400

  9. TECHNIQUES FOR MOLECULAR ANALYSIS Practical approaches to plant volatile analysis

    E-print Network

    Tholl, Dorothea

Because many plant volatile organic compounds (VOCs) are used commercially as flavorings and fragrances, their analysis in the food and perfume industry has a long tradition. Advances in the analysis of VOCs have allowed the monitoring of fast changes in VOC emissions and facilitated in vivo studies.

  10. Shapelets - I. A method for image analysis

    NASA Astrophysics Data System (ADS)

    Refregier, Alexandre

    2003-01-01

We present a new method for the analysis of images, a fundamental task in observational astronomy. It is based on the linear decomposition of each object in the image into a series of localized basis functions of different shapes, which we call `shapelets'. A particularly useful set of complete and orthonormal shapelets is that consisting of weighted Hermite polynomials, which correspond to perturbations around a circular Gaussian. They are also the eigenstates of the two-dimensional quantum harmonic oscillator, and thus allow us to use the powerful formalism developed for this problem. One of their special properties is their invariance under Fourier transforms (up to a rescaling), leading to an analytic form for convolutions. The generator of linear transformations such as translations, rotations, shears and dilatations can be written as simple combinations of raising and lowering operators. We derive analytic expressions for practical quantities, such as the centroid (astrometry), flux (photometry) and radius of the object, in terms of its shapelet coefficients. We also construct polar basis functions which are eigenstates of the angular momentum operator, and thus have simple properties under rotations. As an example, we apply the method to Hubble Space Telescope images, and show that the small number of shapelet coefficients required to represent galaxy images leads to compression factors of about 40 to 90. We discuss applications of shapelets for the archival of large photometric surveys, for weak and strong gravitational lensing and for image deprojection.
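As an illustrative sketch of the decomposition (restricted to one dimension for brevity; the paper's basis is two-dimensional), the dimensional Cartesian shapelet basis and a simple Riemann-sum projection onto it can be written as:

```python
import numpy as np
from math import factorial, pi, sqrt

def shapelet_basis(x, n_max, beta=1.0):
    """1D Cartesian shapelets: phi_n(x) = H_n(x/beta) exp(-x^2/(2 beta^2))
    / sqrt(2^n n! sqrt(pi) beta), with H_n the physicists' Hermite polynomials."""
    u = x / beta
    gauss = np.exp(-u**2 / 2)
    basis = []
    for n in range(n_max + 1):
        coeffs = np.zeros(n + 1)
        coeffs[n] = 1.0  # select the single Hermite polynomial H_n
        Hn = np.polynomial.hermite.hermval(u, coeffs)
        norm = sqrt(2**n * factorial(n) * sqrt(pi) * beta)
        basis.append(Hn * gauss / norm)
    return np.array(basis)

def decompose(f, x, n_max, beta=1.0):
    """Shapelet coefficients of a sampled profile f(x) via overlap integrals,
    approximated by a Riemann sum on a uniform grid."""
    B = shapelet_basis(x, n_max, beta)
    dx = x[1] - x[0]
    return B @ f * dx
```

By orthonormality, a unit-normalized Gaussian of scale beta projects entirely onto the n = 0 coefficient, which makes a convenient sanity check.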

  11. Practical Aspects of the Equation-Error Method for Aircraft Parameter Estimation

    NASA Technical Reports Server (NTRS)

Morelli, Eugene A.

    2006-01-01

    Various practical aspects of the equation-error approach to aircraft parameter estimation were examined. The analysis was based on simulated flight data from an F-16 nonlinear simulation, with realistic noise sequences added to the computed aircraft responses. This approach exposes issues related to the parameter estimation techniques and results, because the true parameter values are known for simulation data. The issues studied include differentiating noisy time series, maximum likelihood parameter estimation, biases in equation-error parameter estimates, accurate computation of estimated parameter error bounds, comparisons of equation-error parameter estimates with output-error parameter estimates, analyzing data from multiple maneuvers, data collinearity, and frequency-domain methods.
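A minimal sketch of the equation-error idea: ordinary least squares applied to the state-derivative equation, with a simple smoother before numerical differentiation. The regressors, smoothing window, and differentiation scheme here are illustrative choices, not the paper's exact setup:

```python
import numpy as np

def equation_error_fit(regressors, xdot):
    """Least-squares estimate of theta in xdot ~= regressors @ theta,
    the core of the equation-error method."""
    theta, *_ = np.linalg.lstsq(regressors, xdot, rcond=None)
    return theta

def smoothed_derivative(x, dt, window=5):
    """Central differences after a moving-average smoother, a simple stand-in
    for the careful differentiation of noisy time series discussed above."""
    kernel = np.ones(window) / window
    xs = np.convolve(x, kernel, mode="same")  # edges are distorted by padding
    return np.gradient(xs, dt)
```

Because the fit is linear in the parameters, biases enter mainly through noise in the differentiated states, which is why the smoothing step matters in practice.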

  12. Methods used by Dental Practice-Based Research Network (DPBRN) dentists to diagnose dental caries

    PubMed Central

    Gordan, Valeria V.; Riley, Joseph L.; Carvalho, Ricardo M.; Snyder, John; Sanderson, James L.; Anderson, Mary; Gilbert, Gregg H.

    2010-01-01

    Objectives To (1) identify the methods that dentists in The Dental Practice-Based Research Network (DPBRN) use to diagnose dental caries; (2) quantify their frequency of use; and (3) test the hypothesis that certain dentist and dental practice characteristics are significantly associated with their use. Methods A questionnaire about methods used for caries diagnosis was sent to DPBRN dentists who reported doing at least some restorative dentistry; 522 dentists participated. Questions included use of dental radiographs, dental explorer, laser fluorescence, air-drying, fiber optic devices, and magnification, as used when diagnosing primary, secondary/recurrent, or non-specific caries lesions. Variation in the frequency of their use was tested using multivariate analysis and Bonferroni tests. Results Overall, the dental explorer was the instrument most commonly used to detect primary occlusal caries as well as to detect caries at the margins of existing restorations. In contrast, laser fluorescence was rarely used to help diagnose occlusal primary caries. For proximal caries, radiographs were used to help diagnose 75-100% of lesions by 96% of the DPBRN dentists. Dentists who use radiographs most often to assess proximal surfaces of posterior teeth were significantly more likely to also report providing a higher percentage of patients with individualized caries prevention (p = .040) and seeing a higher percentage of pediatric patients (p = .001). Conclusion Use of specific diagnostic methods varied substantially. The dental explorer and radiographs are still the most commonly used diagnostic methods. PMID:21488724

  13. Prognostic Analysis System and Methods of Operation

    NASA Technical Reports Server (NTRS)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical system health applicable to mechanical, electrical, chemical and optical systems and methods of operating the system are described herein.

  14. A practical and sensitive method of quantitating lymphangiogenesis in vivo.

    PubMed

    Majumder, Mousumi; Xin, Xiping; Lala, Peeyush K

    2013-07-01

    To address the inadequacy of current assays, we developed a directed in vivo lymphangiogenesis assay (DIVLA) by modifying an established directed in vivo angiogenesis assay. Silicon tubes (angioreactors) were implanted in the dorsal flanks of nude mice. Tubes contained either growth factor-reduced basement membrane extract (BME)-alone (negative control) or BME-containing vascular endothelial growth factor (VEGF)-D (positive control for lymphangiogenesis) or FGF-2/VEGF-A (positive control for angiogenesis) or a high VEGF-D-expressing breast cancer cell line MDA-MB-468LN (468-LN), or VEGF-D-silenced 468LN. Lymphangiogenesis was detected superficially with Evans Blue dye tracing and measured in the cellular contents of angioreactors by multiple approaches: lymphatic vessel endothelial hyaluronan receptor-1 (Lyve1) protein (immunofluorescence) and mRNA (qPCR) expression and a visual scoring of lymphatic vs blood capillaries with dual Lyve1 (or Prox1 or Podoplanin)/Cd31 immunostaining in cryosections. Lymphangiogenesis was absent with BME, high with VEGF-D or VEGF-D-producing 468LN cells and low with VEGF-D-silenced 468LN. Angiogenesis was absent with BME, high with FGF-2/VEGF-A, moderate with 468LN or VEGF-D and low with VEGF-D-silenced 468LN. The method was reproduced in a syngeneic murine C3L5 tumor model in C3H/HeJ mice with dual Lyve1/Cd31 immunostaining. Thus, DIVLA presents a practical and sensitive assay of lymphangiogenesis, validated with multiple approaches and markers. It is highly suited to identifying pro- and anti-lymphangiogenic agents, as well as shared or distinct mechanisms regulating lymphangiogenesis vs angiogenesis, and is widely applicable to research in vascular/tumor biology. PMID:23711825

  15. Error analysis in some recent versions of the Fry Method

    NASA Astrophysics Data System (ADS)

    Srivastava, D. C.; Kumar, R.

    2013-12-01

    The Fry method is a graphical technique that directly displays the strain ellipse in the form of a central vacancy on a point distribution, the Fry plot. For accurate strain estimation from the Fry plot, the central vacancy must appear as a sharply focused perfect ellipse. The diffuse nature of the central vacancy, common in practice, introduces subjectivity into direct strain estimation from the Fry plot. Several alternative methods, based on point density contrast, image analysis, Delaunay triangulation, or point distribution analysis, exist for objective strain estimation from Fry plots. The relative merits and limitations of these methods are, however, not yet well explored and understood. This study compares the accuracy and efficacy of six methods proposed for objective determination of strain from Fry plots. Our approach consists of: (i) graphical simulation of variously sorted object sets, (ii) distortion of different object sets by known strain in pure shear, simple shear and simultaneous pure-and-simple shear deformations and (iii) error analysis and comparison of the six methods. Results from more than 1000 tests reveal that the Delaunay triangulation method, the point density contrast methods and the image analysis method are relatively more accurate and versatile. The amount and nature of distortion, and the degree of sorting, have little effect on the accuracy of results in these methods. The point distribution analysis methods are successful provided the pre-deformed objects were well sorted and defined by specific types of point distribution. Both the Delaunay triangulation method and the image analysis method are more time-efficient than the point distribution analysis methods. The time-efficiency of the density contrast methods lies between these two extremes.
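    The Fry plot underlying all of these methods is simply the set of all pairwise separation vectors between object centres. Because differences transform linearly, a homogeneous strain maps the central vacancy to an ellipse, which is what the methods above measure. A minimal sketch (the point set and strain values are hypothetical):

    ```python
    def fry_plot(points):
        # the Fry plot: every pairwise separation vector between object centres
        return [(xa - xb, ya - yb)
                for i, (xa, ya) in enumerate(points)
                for j, (xb, yb) in enumerate(points) if i != j]

    def apply_strain(points, exx, exy, eyx, eyy):
        # apply a homogeneous 2-D strain matrix to each point
        return [(exx * x + exy * y, eyx * x + eyy * y) for x, y in points]

    pts = [(x, y) for x in range(5) for y in range(5)]   # toy object centres
    strained = apply_strain(pts, 1.5, 0.0, 0.0, 1.0 / 1.5)  # area-preserving pure shear
    ```

    The Fry plot of the strained points equals the strain applied to the Fry plot of the original points, which is why the central vacancy records the strain ellipse.
    
    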

  16. Root Cause Analysis: Methods and Mindsets.

    ERIC Educational Resources Information Center

    Kluch, Jacob H.

    This instructional unit is intended for use in training operations personnel and others involved in scram analysis at nuclear power plants in the techniques of root cause analysis. Four lessons are included. The first lesson provides an overview of the goals and benefits of the root cause analysis method. Root cause analysis techniques are covered…

  17. Canonical Correlation Analysis: An Explanation with Comments on Correct Practice.

    ERIC Educational Resources Information Center

    Thompson, Bruce

    This paper briefly explains the logic underlying the basic calculations employed in canonical correlation analysis. A small hypothetical data set is employed to illustrate that canonical correlation analysis subsumes both univariate and multivariate parametric methods. Several real data sets are employed to illustrate other themes. Three common…

  18. Estimating free-living human energy expenditure: Practical aspects of the doubly labeled water method and its applications

    PubMed Central

    Ishikawa-Takata, Kazuko; Kim, Eunkyung; Kim, Jeonghyun; Yoon, Jinsook

    2014-01-01

    The accuracy and noninvasive nature of the doubly labeled water (DLW) method makes it ideal for the study of human energy metabolism in free-living conditions. However, the DLW method is not always practical in many developing and Asian countries because of the high costs of isotopes and equipment for isotope analysis as well as the expertise required for analysis. This review provides information about the theoretical background and practical aspects of the DLW method, including optimal dose, basic protocols of two- and multiple-point approaches, experimental procedures, and isotopic analysis. We also introduce applications of DLW data, such as determining the equations of estimated energy requirement and validation studies of energy intake. PMID:24944767
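    The two-point approach mentioned above estimates each isotope's elimination rate from log-linear decay of enrichment between the two samples. A schematic sketch (the enrichment values are invented; dose, dilution-space and CO2-production equations are omitted):

    ```python
    import math

    def elimination_rate(e1, e2, t1, t2):
        # two-point estimate of an isotope's elimination rate (per day),
        # assuming log-linear decay of excess enrichment between the samples
        return (math.log(e1) - math.log(e2)) / (t2 - t1)

    # hypothetical post-dose excess enrichments on day 1 and day 14
    k_deuterium = elimination_rate(120.0, 45.0, 1.0, 14.0)
    k_oxygen18 = elimination_rate(150.0, 40.0, 1.0, 14.0)
    # 18O leaves the body as both water and CO2, so kO exceeds kD;
    # their difference drives the CO2-production (energy expenditure) estimate
    ```

    The multiple-point protocol fits the same log-linear model to several samples instead of two, trading isotope cost for robustness to day-to-day variation.
    
    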

  19. Methods of Building Cost Analysis.

    ERIC Educational Resources Information Center

    Building Research Inst., Inc., Washington, DC.

    Presentation of symposium papers includes--(1) a study describing techniques for economic analysis of building designs, (2) three case studies of analysis techniques, (3) procedures for measuring the area and volume of buildings, and (4) an open forum discussion. Case studies evaluate--(1) the thermal economics of building enclosures, (2) an…

  20. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B. (Idaho Falls, ID); Novascone, Stephen R. (Idaho Falls, ID); Wright, Jerry P. (Idaho Falls, ID)

    2012-05-29

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  1. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    SciTech Connect

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  2. Speech analysis by parametric methods: A survey

    NASA Astrophysics Data System (ADS)

    Gueguen, C.

    Speech analysis parametric modeling and parameter extraction methods are summarized. Linear prediction principles and associated fast algorithms are reviewed. Three classes of methods are examined: global analysis methods on short time windows, with variable frequency resolution and additive noise; time-evolving methods, in which a time-varying parametric model is adjusted to model the transitions between quasistationary periods; and time-adaptive sequential methods using fast algorithms along with synchronous detection of temporal events.
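    The linear prediction analysis at the core of this survey is classically solved with the Levinson-Durbin recursion on the signal's autocorrelation. A self-contained sketch on a synthetic AR(2) signal (the model coefficients and sample count are chosen for illustration):

    ```python
    import random

    def levinson_durbin(r, order):
        # solve the Toeplitz normal equations of linear prediction;
        # returns the prediction-error filter a (with a[0] = 1) and residual energy
        a = [0.0] * (order + 1)
        a[0] = 1.0
        e = r[0]
        for i in range(1, order + 1):
            k = -(r[i] + sum(a[j] * r[i - j] for j in range(1, i))) / e
            new_a = a[:]
            new_a[i] = k
            for j in range(1, i):
                new_a[j] = a[j] + k * a[i - j]
            a = new_a
            e *= (1.0 - k * k)
        return a, e

    random.seed(1)
    # synthesize an AR(2) signal x[n] = 0.5 x[n-1] - 0.25 x[n-2] + w[n]
    x = [0.0, 0.0]
    for _ in range(20000):
        x.append(0.5 * x[-1] - 0.25 * x[-2] + random.gauss(0.0, 1.0))
    n = len(x)
    r = [sum(x[t] * x[t - lag] for t in range(lag, n)) / n for lag in range(3)]
    a, err = levinson_durbin(r, 2)
    # the prediction-error filter should recover a ~ [1, -0.5, 0.25]
    ```

    For speech, the same recursion is run per short window on windowed autocorrelations, yielding the formant-shaping filter coefficients.
    
    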

  3. Coal Field Fire Fighting - Practiced methods, strategies and tactics

    NASA Astrophysics Data System (ADS)

    Wündrich, T.; Korten, A. A.; Barth, U. H.

    2009-04-01

    Subsurface coal fires destroy millions of tons of coal each year, have an immense impact on the surrounding ecology and threaten further coal reserves. Because of the enormous dimensions a coal seam fire can develop, fighting it entails high operational expenses. As part of the Sino-German coal fire research initiative "Innovative technologies for exploration, extinction and monitoring of coal fires in Northern China" the research team of the University of Wuppertal (BUW) focuses on fire extinction strategies and tactics as well as aspects of environmental and health safety. Besides the choice and the correct application of different extinction techniques, further factors are essential for successful extinction. Appropriate tactics, well-trained and protected personnel and the choice of the best-fitting extinguishing agents are necessary for the successful extinction of a coal seam fire. The chosen strategy for an extinction campaign is generally determined by urgency and importance. It may depend on national objectives and concepts of coal conservation, on environmental protection (e.g. commitments to greenhouse gas (GHG) reductions), on national funding and resources for fire fighting (e.g. personnel, infrastructure, vehicles, water pipelines), and on computer-aided models and simulations of coal fire development from self-ignition to extinction. In order to devise an optimal fire fighting strategy, "aims of protection" have to be defined in a first step. These may be: - directly affected coal seams; - neighboring seams and coalfields; - GHG emissions into the atmosphere; - returns on investments (costs of fire fighting compared to the value of saved coal). In a further step, it is imperative to decide whether the budget shall define the results, or the results define the budget; i.e.
whether there are fixed objectives for the mission that will dictate the overall budget, or whether the limited resources available shall set the scope within which the best possible results shall be achieved. For effective and efficient fire fighting, optimal tactics are required and can be divided into four fundamental tactics to control fire hazards: - Defense (digging away the coal so that it cannot begin to burn, or forming a barrier so that the fire cannot reach the unburnt coal), - Rescue the coal (mining a seam that is not yet burning), - Attack (active and direct cooling of the burning seam), - Retreat (monitoring only, until self-extinction of a burning seam). The last is used when a fire exceeds the organizational and/or technical scope of a mission. In other words, "to control a coal fire" does not automatically and in all situations mean "to extinguish a coal fire". Best-practice tactics, or a combination of them, can be selected for control of a particular coal fire. For the extinguishing works, different extinguishing agents are available. They can be applied by different application techniques and with varying operating expenses. One application method may be the drilling of boreholes from the surface, or covering the surface with low-permeability soils. The extinguishing agents mainly used for coal field fires are as follows: water (with or without additives), slurry, foaming mud/slurry, inert gases, dry chemicals and materials, and cryogenic agents. Because of its tremendous dimension and complexity, the worldwide challenge of coal fires is absolutely unique - it can only be solved with functional application methods, best-fitting strategies and tactics, organisation and research, as well as the dedication of the involved fire fighters, who work under extreme individual risks on the burning coal fields.

  4. A Practical Guide to Interpretation of Large Collections of Incident Narratives Using the QUORUM Method

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W.

    1997-01-01

    Analysis of incident reports plays an important role in aviation safety. Typically, a narrative description, written by a participant, is a central part of an incident report. Because there are so many reports, and the narratives contain so much detail, it can be difficult to efficiently and effectively recognize patterns among them. Recognizing and addressing recurring problems, however, is vital to continuing safety in commercial aviation operations. A practical way to interpret large collections of incident narratives is to apply the QUORUM method of text analysis, modeling, and relevance ranking. In this paper, QUORUM text analysis and modeling are surveyed, and QUORUM relevance ranking is described in detail with many examples. The examples are based on several large collections of reports from the Aviation Safety Reporting System (ASRS) database, and a collection of news stories describing the disaster of TWA Flight 800, the Boeing 747 which exploded in mid- air and crashed near Long Island, New York, on July 17, 1996. Reader familiarity with this disaster should make the relevance-ranking examples more understandable. The ASRS examples illustrate the practical application of QUORUM relevance ranking.
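    QUORUM's actual model is built from word co-occurrence contexts; as a generic stand-in, ranking narratives against a query can be illustrated with plain term-frequency cosine similarity (the mini-narratives below are invented, not ASRS reports):

    ```python
    import math
    from collections import Counter

    def tf(text):
        # bag-of-words term-frequency vector
        return Counter(text.lower().split())

    def cosine(a, b):
        num = sum(a[t] * b[t] for t in set(a) & set(b))
        den = (math.sqrt(sum(v * v for v in a.values()))
               * math.sqrt(sum(v * v for v in b.values())))
        return num / den if den else 0.0

    # invented mini-narratives standing in for incident reports
    reports = {
        "r1": "during descent the crew received a traffic alert and climbed to avoid conflict",
        "r2": "fuel imbalance noticed during cruise corrected by crossfeed procedure",
        "r3": "traffic conflict on final approach tower issued go around instruction",
    }
    query = tf("traffic conflict during approach")
    ranked = sorted(reports, key=lambda k: cosine(tf(reports[k]), query), reverse=True)
    ```

    Relevance ranking then reads off the sorted list; richer models (like QUORUM's co-occurrence contexts) change only how the vectors are built, not this ranking step.
    
    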

  5. Efficient methods and practical guidelines for simulating isotope effects

    NASA Astrophysics Data System (ADS)

    Ceriotti, Michele; Markland, Thomas E.

    2013-01-01

    The shift in chemical equilibria due to isotope substitution is frequently exploited to obtain insight into a wide variety of chemical and physical processes. It is a purely quantum mechanical effect, which can be computed exactly using simulations based on the path integral formalism. Here we discuss how these techniques can be made dramatically more efficient, and how they ultimately outperform quasi-harmonic approximations to treat quantum liquids not only in terms of accuracy, but also in terms of computational cost. To achieve this goal we introduce path integral quantum mechanics estimators based on free energy perturbation, which enable the evaluation of isotope effects using only a single path integral molecular dynamics trajectory of the naturally abundant isotope. We use as an example the calculation of the free energy change associated with H/D and 16O/18O substitutions in liquid water, and of the fractionation of those isotopes between the liquid and the vapor phase. In doing so, we demonstrate and discuss quantitatively the relative benefits of each approach, thereby providing a set of guidelines that should facilitate the choice of the most appropriate method in different, commonly encountered scenarios. The efficiency of the estimators we introduce and the analysis that we perform should in particular facilitate accurate ab initio calculation of isotope effects in condensed phase systems.
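    The free energy perturbation identity the authors build on, dF = -kT ln <exp(-dU/kT)> averaged in state 1, can be checked on a toy classical system with a known answer (two harmonic wells; the real method samples path-integral configurations, which this sketch does not do):

    ```python
    import math
    import random

    random.seed(2)
    kT = 1.0
    k1, k2 = 1.0, 1.2           # two harmonic wells U_i(x) = k_i x^2 / 2

    # sample x from the Boltzmann distribution of U1 (a Gaussian with variance kT/k1)
    n = 200000
    acc = 0.0
    for _ in range(n):
        x = random.gauss(0.0, math.sqrt(kT / k1))
        du = 0.5 * (k2 - k1) * x * x        # U2 - U1 at the sampled configuration
        acc += math.exp(-du / kT)
    dF = -kT * math.log(acc / n)            # free energy perturbation estimator

    exact = 0.5 * kT * math.log(k2 / k1)    # analytic configurational result
    ```

    The paper's contribution is to make this single-trajectory reweighting efficient for quantum (path-integral) isotope substitutions, where dU comes from remapping ring-polymer configurations rather than a spring-constant change.
    
    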

  6. Efficient methods and practical guidelines for simulating isotope effects.

    PubMed

    Ceriotti, Michele; Markland, Thomas E

    2013-01-01

    The shift in chemical equilibria due to isotope substitution is frequently exploited to obtain insight into a wide variety of chemical and physical processes. It is a purely quantum mechanical effect, which can be computed exactly using simulations based on the path integral formalism. Here we discuss how these techniques can be made dramatically more efficient, and how they ultimately outperform quasi-harmonic approximations to treat quantum liquids not only in terms of accuracy, but also in terms of computational cost. To achieve this goal we introduce path integral quantum mechanics estimators based on free energy perturbation, which enable the evaluation of isotope effects using only a single path integral molecular dynamics trajectory of the naturally abundant isotope. We use as an example the calculation of the free energy change associated with H/D and (16)O/(18)O substitutions in liquid water, and of the fractionation of those isotopes between the liquid and the vapor phase. In doing so, we demonstrate and discuss quantitatively the relative benefits of each approach, thereby providing a set of guidelines that should facilitate the choice of the most appropriate method in different, commonly encountered scenarios. The efficiency of the estimators we introduce and the analysis that we perform should in particular facilitate accurate ab initio calculation of isotope effects in condensed phase systems. PMID:23298033

  7. Articulating current service development practices: a qualitative analysis of eleven mental health projects

    PubMed Central

    2014-01-01

    Background The utilisation of good design practices in the development of complex health services is essential to improving quality. Healthcare organisations, however, are often seriously out of step with modern design thinking and practice. As a starting point to encourage the uptake of good design practices, it is important to understand the context of their intended use. This study aims to do that by articulating current health service development practices. Methods Eleven service development projects carried out in a large mental health service were investigated through in-depth interviews with six operation managers. The critical decision method in conjunction with diagrammatic elicitation was used to capture descriptions of these projects. Stage-gate design models were then formed to visually articulate, classify and characterise different service development practices. Results Projects were grouped into three categories according to design process patterns: new service introduction and service integration; service improvement; service closure. Three common design stages (problem exploration, idea generation and solution evaluation) were then compared across the design process patterns. Consistent across projects were a top-down, policy-driven approach to exploration, underexploited idea generation and implementation-based evaluation. Conclusions This study provides insight into where and how good design practices can contribute to the improvement of current service development practices. Specifically, the following suggestions for future service development practices are made: genuine user needs analysis for exploration; divergent thinking and innovative culture for idea generation; and fail-safe evaluation prior to implementation. Better training for managers through partnership working with design experts and researchers could be beneficial. PMID:24438471

  8. Parallel Processable Cryptographic Methods with Unbounded Practical Security.

    ERIC Educational Resources Information Center

    Rothstein, Jerome

    Addressing the problem of protecting confidential information and data stored in computer databases from access by unauthorized parties, this paper details coding schemes which present such astronomical work factors to potential code breakers that security breaches are hopeless in any practical sense. Two procedures which can be used to encode for…

  9. Analysis of flight equipment purchasing practices of representative air carriers

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The process through which representative air carriers decide whether or not to purchase flight equipment was investigated as well as their practices and policies in retiring surplus aircraft. An analysis of the flight equipment investment decision process in ten airlines shows that for the airline industry as a whole, the flight equipment investment decision is in a state of transition from a wholly informal process in earliest years to a much more organized and structured process in the future. Individual air carriers are in different stages with respect to the formality and sophistication associated with the flight equipment investment decision.

  10. Methods for the practical determination of the mechanical strength of tablets--from empiricism to science.

    PubMed

    Podczeck, Fridrun

    2012-10-15

    This review aims to awaken interest in the determination of the tensile strength of tablets of various shapes using a variety of direct and indirect test methods. The United States Pharmacopoeia monograph 1217 (USP35/NF30, 2011) has provided a very good approach to the experimental determination of and standards for the mechanical strength of tablets. Building on this monograph, it is hoped that the detailed account of the various methods provided in this review will encourage industrial and academic scientists involved in the development and manufacture of tablet formulations to take a step forward and determine the tensile strength of tablets, even if these are not simply flat disc-shaped or rectangular. To date there are a considerable number of valid test configurations and stress equations available, catering for many of the various shapes of tablets on the market. The determination of the tensile strength of tablets should hence replace the sole determination of a breaking force, because tensile strength values are more comparable and suggestions for minimum and/or maximum values are available. The review also identifies the gaps that require urgent filling. There is also a need for further analysis using, for example, the Finite Element Method, to provide correct stress solutions for tablets of differing shapes, but this also requires practical experiments to find the best loading conditions, and theoretical stress solutions should be verified experimentally. PMID:22776803
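    For the simplest case covered by such reviews, a flat disc loaded diametrally, tensile strength follows the standard Fell and Newton relation sigma_t = 2F/(pi*D*t). A small helper (the numbers are illustrative, not from the review):

    ```python
    import math

    def tensile_strength_disc(force_n, diameter_m, thickness_m):
        # Fell and Newton relation for diametral compression of a flat disc:
        # sigma_t = 2F / (pi * D * t), in pascals for SI inputs
        return 2.0 * force_n / (math.pi * diameter_m * thickness_m)

    # illustrative numbers: 100 N breaking force, 10 mm diameter, 4 mm thickness
    sigma = tensile_strength_disc(100.0, 0.010, 0.004)
    ```

    This is why tensile strength is more comparable than raw breaking force: the geometry terms D and t normalize out tablet size, so tablets of different dimensions can be compared on one scale.
    
    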

  11. Text analysis methods, text analysis apparatuses, and articles of manufacture

    DOEpatents

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  12. Nonparametric Bayesian Methods for Manifold Analysis

    E-print Network

    Maggioni, Mauro

    Agenda: Motivation. Nonparametric clustering of data: Dirichlet process. Nonparametric feature design and selection: Beta process. DP, BP and the mixture of factor analyzers.

  13. Assessing Student Perception of Practice Evaluation Knowledge in Introductory Research Methods

    ERIC Educational Resources Information Center

    Baker, Lisa R.; Pollio, David E.; Hudson, Ashley

    2011-01-01

    The authors explored the use of the Practice Evaluation Knowledge Scale (PEKS) to assess student perception of acquisition and retention of practice evaluation knowledge from an undergraduate research methods class. The authors sampled 2 semesters of undergraduate social work students enrolled in an introductory research methods course.…

  14. Investigating the Efficacy of Practical Skill Teaching: A Pilot-Study Comparing Three Educational Methods

    ERIC Educational Resources Information Center

    Maloney, Stephen; Storr, Michael; Paynter, Sophie; Morgan, Prue; Ilic, Dragan

    2013-01-01

    Effective education of practical skills can alter clinician behaviour, positively influence patient outcomes, and reduce the risk of patient harm. This study compares the efficacy of two innovative practical skill teaching methods, against a traditional teaching method. Year three pre-clinical physiotherapy students consented to participate in a…

  15. Methods of DNA methylation analysis.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The purpose of this review was to provide guidance for investigators who are new to the field of DNA methylation analysis. Epigenetics is the study of mitotically heritable alterations in gene expression potential that are not mediated by changes in DNA sequence. Recently, it has become clear that n...

  16. Return on Investment in Electronic Health Records in Primary Care Practices: A Mixed-Methods Study

    PubMed Central

    Sanche, Steven

    2014-01-01

    Background The use of electronic health records (EHR) in clinical settings is considered pivotal to a patient-centered health care delivery system. However, uncertainty in cost recovery from EHR investments remains a significant concern in primary care practices. Objective Guided by the question of “When implemented in primary care practices, what will be the return on investment (ROI) from an EHR implementation?”, the objectives of this study are two-fold: (1) to assess ROI from EHR in primary care practices and (2) to identify principal factors affecting the realization of positive ROI from EHR. We used a break-even point, that is, the time required to achieve cost recovery from an EHR investment, as an ROI indicator of an EHR investment. Methods Given the complexity exhibited by most EHR implementation projects, this study adopted a retrospective mixed-method research approach, particularly a multiphase study design approach. For this study, data were collected from community-based primary care clinics using EHR systems. Results We collected data from 17 primary care clinics using EHR systems. Our data show that the sampled primary care clinics recovered their EHR investments within an average period of 10 months (95% CI 6.2-17.4 months), seeing more patients with an average increase of 27% in the active-patients-to-clinician-FTE (full time equivalent) ratio and an average increase of 10% in the active-patients-to-clinical-support-staff-FTE ratio after an EHR implementation. Our analysis suggests, with a 95% confidence level, that the increase in the number of active patients (P=.006), the increase in the active-patients-to-clinician-FTE ratio (P<.001), and the increase in the clinic net revenue (P<.001) are positively associated with the EHR implementation, likely contributing substantially to an average break-even point of 10 months. Conclusions We found that primary care clinics can realize a positive ROI with EHR. 
Our analysis of the variances in the time required to achieve cost recovery from EHR investments suggests that a positive ROI does not appear automatically upon implementing an EHR and that a clinic’s ability to leverage EHR for process changes seems to play a role. Policies that provide support to help primary care practices successfully make EHR-enabled changes, such as support of clinic workflow optimization with an EHR system, could facilitate the realization of positive ROI from EHR in primary care practices. PMID:25600508
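    The break-even indicator used in the study is straightforward to compute: the months needed for the cumulative monthly net benefit to cover the up-front cost. A hypothetical sketch (figures invented, not the study's data):

    ```python
    def break_even_months(upfront_cost, monthly_net_benefit):
        # months of operation before cumulative net benefit covers the investment
        if monthly_net_benefit <= 0:
            return float("inf")    # cost is never recovered
        return upfront_cost / monthly_net_benefit

    # invented figures: $40,000 EHR investment, $4,000/month net gain
    # from higher patient throughput after implementation
    months = break_even_months(40000.0, 4000.0)
    ```

    The study's finding that break-even varied widely across clinics corresponds to variation in the denominator: clinics that leveraged the EHR for process changes realized a larger monthly net benefit.
    
    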

  17. A report on the CCNA 2007 professional practice analysis.

    PubMed

    Muckle, Timothy J; Apatov, Nathaniel M; Plaus, Karen

    2009-06-01

    The purpose of this column is to present the results of the 2007 Professional Practice Analysis (PPA) of the field of nurse anesthesia, conducted by the Council on Certification of Nurse Anesthetists. The PPA used survey and rating scale methodologies to collect data regarding the relative emphasis of various aspects of the nurse anesthesia knowledge domain and competencies. A total of 3,805 survey responses were analyzed using the Rasch rating scale model, which aggregates and transforms ordinal (rating scale) responses into linear measures of relative importance and frequency. Summaries of respondent demographics and educational and professional background are provided, as well as descriptions of how the survey results are used to develop test specifications. The results of this analysis provide evidence for the content outline and test specifications (content percentages) and thus serve as a basis of content validation for the National Certification Examination. PMID:19645167

  18. Bayesian Methods for Data Analysis

    E-print Network

    Carriquiry, Alicia

    Examples are not directly drawn from biological problems, but still serve to illustrate methodology. BUGS is used for most of the examples after computational methods are discussed. All of the programs used to construct the examples can be downloaded from www.public.iastate.edu/alicia (see Teaching).

  19. Analysis methods for photovoltaic applications

    SciTech Connect

    1980-01-01

    Because photovoltaic power systems are being considered for an ever-widening range of applications, it is appropriate for system designers to have knowledge of and access to photovoltaic power systems simulation models and design tools. This brochure gives brief descriptions of a variety of such aids and was compiled after surveying both manufacturers and researchers. Services available through photovoltaic module manufacturers are outlined, and computer codes for systems analysis are briefly described. (WHK)

  20. Grounded Theory in Practice: Is It Inherently a Mixed Method?

    ERIC Educational Resources Information Center

    Johnson, R. B.; McGowan, M. W.; Turner, L. A.

    2010-01-01

    We address 2 key points of contention in this article. First, we engage the debate concerning whether particular methods are necessarily linked to particular research paradigms. Second, we briefly describe a mixed methods version of grounded theory (MM-GT). Grounded theory can be tailored to work well in any of the 3 major forms of mixed methods…

  1. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    PubMed Central

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or “one-shot,” in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

  2. Microparticle analysis system and method

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    2007-01-01

    A device for analyzing microparticles is provided which includes a chamber with an inlet and an outlet for respectively introducing and dispensing a flowing fluid comprising microparticles, a light source for providing light through the chamber and a photometer for measuring the intensity of light transmitted through individual microparticles. The device further includes an imaging system for acquiring images of the fluid. In some cases, the device may be configured to identify and determine a quantity of the microparticles within the fluid. Consequently, a method for identifying and tracking microparticles in motion is contemplated herein. The method involves flowing a fluid comprising microparticles in laminar motion through a chamber, transmitting light through the fluid, measuring the intensities of the light transmitted through the microparticles, imaging the fluid a plurality of times and comparing at least some of the intensities of light between different images of the fluid.

  3. Aircraft accidents: method of analysis

    NASA Technical Reports Server (NTRS)

    1931-01-01

    The revised report includes the chart for the analysis of aircraft accidents, combining consideration of the immediate causes, underlying causes, and results of accidents, as prepared by the special committee, with a number of the definitions clarified. It also includes a brief statement of the organization and work of the special committee and of the Committee on Aircraft Accidents, and statistical tables giving a comparison of the types and causes of accidents in the military services on the one hand and in civil aviation on the other, together with explanations of some of the important differences noted in these tables.

  4. Testing the quasi-absolute method in photon activation analysis

    NASA Astrophysics Data System (ADS)

    Sun, Z. J.; Wells, D.; Starovoitova, V.; Segebade, C.

    2013-04-01

    In photon activation analysis (PAA), relative methods are widely used because of their accuracy and precision. Absolute methods, which are conducted without any assistance from calibration materials, are seldom applied because of the difficulty of determining the photon flux during measurements. This research attempts a new absolute approach in PAA - the quasi-absolute method - by retrieving the photon flux in the sample through Monte Carlo simulation. With the simulated photon flux and a database of experimental cross sections, it is possible to calculate the concentration of target elements in the sample directly. The QA/QC procedures that support the research are discussed in detail. Our results show that the accuracy of the method for certain elements is close to a useful level in practice. Furthermore, future results from the quasi-absolute method can also serve as a validation technique for experimental cross-section data. The quasi-absolute method looks promising.
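    The core of the direct calculation can be sketched numerically. The following is a minimal illustration, not the authors' implementation: it assumes a single dominant reaction and uses hypothetical values for the measured activity, the simulated photon flux, and the flux-weighted cross section, simply inverting the activation equation A = N·φ·σ·(1 − e^(−λ·t_irr)) for the number of target atoms N.

```python
import math

def target_atoms_from_activity(activity_bq, flux_per_cm2_s, cross_section_cm2,
                               decay_const_per_s, t_irr_s):
    """Invert A = N * phi * sigma * (1 - exp(-lambda * t_irr)) for N,
    the number of target atoms in the sample."""
    saturation = 1.0 - math.exp(-decay_const_per_s * t_irr_s)
    return activity_bq / (flux_per_cm2_s * cross_section_cm2 * saturation)

# Hypothetical numbers, for illustration only
N = target_atoms_from_activity(
    activity_bq=1.2e3,                       # end-of-irradiation activity
    flux_per_cm2_s=1.0e12,                   # photon flux from Monte Carlo
    cross_section_cm2=5.0e-27,               # flux-weighted cross section (5 mb)
    decay_const_per_s=math.log(2) / 3600.0,  # 1 h half-life
    t_irr_s=7200.0,                          # 2 h irradiation
)
```

From N, the elemental concentration follows once the sample mass and isotopic abundance are known.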

  5. Imaging laser analysis of building materials - practical examples

    SciTech Connect

    Wilsch, G.; Schaurich, D.; Wiggenhauser, H.

    2011-06-23

    Laser-induced breakdown spectroscopy (LIBS) is a supplement to and extension of standard chemical methods and SEM or micro-XRF applications for the evaluation of building materials. As a laboratory method, LIBS is used to produce color-coded images representing the composition and distribution of characteristic ions and/or the ingress of damaging substances. To create a depth profile of element concentration, a core has to be taken and split along the core axis. LIBS has proven able to detect all important elements in concrete, e.g. chlorine, sodium, or sulfur, which are responsible for certain degradation mechanisms, as well as light elements like lithium or hydrogen. Practical examples are given and a mobile system for on-site measurements is presented.

  6. Schwarz Preconditioners for Krylov Methods: Theory and Practice

    SciTech Connect

    Szyld, Daniel B.

    2013-05-10

    Several numerical methods were produced and analyzed. The main thrust of the work relates to inexact Krylov subspace methods for the solution of linear systems of equations arising from the discretization of partial differential equations. These are iterative methods, i.e., an approximation is obtained and improved at each step. Usually, a matrix-vector product is needed at each iteration. In the inexact methods, this product (or the application of a preconditioner) can be done inexactly. Schwarz methods, based on domain decompositions, are excellent preconditioners for these systems. We contributed towards their understanding from an algebraic point of view, developed new ones, and studied their performance in the inexact setting. We also worked on combinatorial problems to help define the algebraic partition of the domains, with the needed overlap, as well as PDE-constrained optimization using the above-mentioned inexact Krylov subspace methods.

  7. Professional Suitability for Social Work Practice: A Factor Analysis

    ERIC Educational Resources Information Center

    Tam, Dora M. Y.; Coleman, Heather; Boey, Kam-Wing

    2012-01-01

    Objective: The purpose of this study was to identify the underlying dimensions of professional suitability. Method: Data were collected from a province-wide mail-out questionnaire surveying 341 participants from a random sample of registered social workers. Results: The use of an exploratory factor analysis identified a 5-factor solution on…

  8. Dynamic mechanical analysis: A practical introduction to techniques and applications

    SciTech Connect

    Menard, K.

    1999-01-01

    This book introduces DMA, its history, and its current position as part of thermal analysis on polymers. It discusses major types of instrumentation, including oscillatory rotational, oscillatory axial, and torsional pendulum. It also describes analytical techniques in terms of utility, quality of data, methods of calibration, and suitability for different types of materials and assesses applications for thermoplastics, thermosetting systems, and thermosets.

  9. Mixed Methods in Nursing Research: An Overview and Practical Examples

    PubMed Central

    Doorenbos, Ardith Z.

    2014-01-01

    Mixed methods research methodologies are increasingly applied in nursing research to strengthen the depth and breadth of understanding of nursing phenomena. This article describes the background and benefits of using mixed methods research methodologies, and provides two examples of nursing research that used mixed methods. Mixed methods research produces several benefits. The examples provided demonstrate specific benefits in the creation of a culturally congruent picture of chronic pain management for American Indians, and the determination of a way to assess cost for providing chronic pain care. PMID:25580032

  10. Short communication: Practical issues in implementing volatile metabolite analysis for identifying mastitis pathogens.

    PubMed

    Hettinga, Kasper A; de Bok, Frank A M; Lam, Theo J G M

    2015-11-01

    Several parameters for improving headspace gas chromatography-mass spectrometry (GC-MS) analysis of volatile metabolites were evaluated in the framework of identifying mastitis-causing pathogens. Previous research showed that the results of such volatile metabolite analysis were comparable with those based on bacteriological culturing. The aim of this study was to evaluate the effect of several method changes on the applicability and potential implementation of this method in practice. The use of a relatively polar column is advantageous, resulting in a faster and less complex chromatographic setup with a higher resolving power yielding higher-quality data. Before volatile metabolite analysis is applied, a minimum incubation of 8 h is advised, as reducing incubation time leads to less reliable pathogen identification. Application of GC-MS remained favorable compared with regular gas chromatography. The complexity and cost of a GC-MS system are such that they limit the application of the method in practice for identification of mastitis-causing pathogens. PMID:26342985

  11. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    DOEpatents

    Sanfilippo, Antonio P. (Richland, WA); Cowell, Andrew J. (Kennewick, WA); Gregory, Michelle L. (Richland, WA); Baddeley, Robert L. (Richland, WA); Paulson, Patrick R. (Pasco, WA); Tratz, Stephen C. (Richland, WA); Hohimer, Ryan E. (West Richland, WA)

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.

  12. An Effective and Practical Method for Solving Hydro-Thermal Unit Commitment Problems Based on Lagrangian Relaxation Method

    NASA Astrophysics Data System (ADS)

    Sakurai, Takayoshi; Kusano, Takashi; Saito, Yutaka; Hirato, Kota; Kato, Masakazu; Murai, Masahiko; Nagata, Junichi

    This paper presents an effective and practical method based on the Lagrangian relaxation method (LRM) for solving the hydro-thermal unit commitment problem, in which operational constraints involve spinning reserve requirements for thermal units and prohibition of simultaneous unit start-up/shut-down at the same plant. These constraints are handled within each iteration step of the LRM, which enables a direct solution. To improve convergence, the method applies an augmented Lagrangian relaxation method. Its effectiveness is demonstrated for a real power system.

  13. Pedagogical Practices and Counselor Self-Efficacy: A Mixed Methods Investigation

    ERIC Educational Resources Information Center

    Brogan, Justin R.

    2009-01-01

    The current study investigated the Lecture Teaching Method and Socratic Teaching Method to determine if there was a relationship between pedagogical methods and Counselor Self-Efficacy (CSE). A course in Advanced Professional Development was utilized to determine if teaching methods could affect student perceptions of competence to practice…

  14. Propel: Tools and Methods for Practical Source Code Model Checking

    NASA Technical Reports Server (NTRS)

    Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem

    2003-01-01

    The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset is called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.

  15. Practical methods for detecting mendacity: a case study.

    PubMed

    Hirsch, A R; Wolf, C J

    2001-01-01

    This study demonstrates the concurrence between the use of objective verbal and nonverbal signs and lying. President Clinton's Grand Jury testimony of August 17, 1998, was examined for the presence of 23 clinically practical signs of dissimulation selected from 64 peer-reviewed articles and 20 books on mendacity. A segment of his testimony that was subsequently found to be false was compared with a control period during the same testimony (internal control). A fund-raising speech to a sympathetic crowd served as a second control (external control). The frequencies of the 23 signs in the mendacious speech were compared with their frequencies during the control periods, and the differences were analyzed for statistical significance. No clinical examination was performed nor diagnosis assigned. During the mendacious speech, the subject markedly increased the frequency of 20 out of 23 signs compared with their frequency during the fund-raising control speech (p < .0005). He increased the frequency of 19 signs compared with their frequency during the control period of the same testimony (p < .003). The 23 signs may be useful as indicators of the veracity of videotaped and scripted testimony. If these findings are confirmed through further testing, they could, with practice, be used by psychiatrists conducting interviews. PMID:11785615

  16. Melanin standard method: titrimetric analysis.

    PubMed

    Zeise, L; Chedekel, M R

    1992-11-01

    Melanin isolated from the ink sac of Sepia officinalis (Sepia melanin) has been proposed as a standard for natural eumelanin, and a standard mild isolation and purification protocol for Sepia melanin has been developed (Zeise, doctoral dissertation, Johns Hopkins University, 1991). The goal of the present work, developed using Sepia melanin, was to quantify the bioavailable carboxylic acid groups present in melanin particles. Bioavailability is governed by the accessibility of carboxy groups to the surrounding biological milieu, and is expressed as microequivalents of carboxy group per gram of melanin. The present work was carried out using a heterogeneous slurry of melanin in a nonaqueous system. A standard acidic titrant and an automatic titrator operating in an equilibrium titration mode were used to characterize and quantify the carboxy group content of Sepia melanins and several other commonly used melanins purified by a standard method (Zeise et al., Pigment Cell Res. [Suppl] 2:48-53, 1992). PMID:1287626
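    The reported quantity, microequivalents of carboxy groups per gram of melanin, follows directly from the titration data. A minimal sketch with made-up titration numbers (the volume, normality, and sample mass below are illustrative, not values from the study):

```python
def carboxy_ueq_per_g(titrant_ml, normality_eq_per_l, sample_g):
    """Microequivalents of titratable groups per gram of sample:
    ueq/g = V (mL) * N (eq/L) * 1000 / mass (g)."""
    return titrant_ml * normality_eq_per_l * 1000.0 / sample_g

# Hypothetical: 0.45 mL of 0.01 N titrant consumed by 0.050 g of melanin
ueq = carboxy_ueq_per_g(0.45, 0.01, 0.050)
```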

  17. Methods for Analysis of Outdoor Performance Data (Presentation)

    SciTech Connect

    Jordan, D.

    2011-02-01

    The ability to accurately predict power delivery over the course of time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted to power and how this conversion develops over time. Accurate knowledge of power decline over time, also known as the degradation rate, is essential and important to all stakeholders--utility companies, integrators, investors, and scientists alike. Different methods to determine degradation rates, for both discrete and continuous data, are presented, and some general best-practice methods are outlined. In addition, historical degradation rates and some preliminary analysis with respect to climate are given.
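    The simplest of these methods is a linear fit to a performance time series, with the slope expressed in percent per year. A minimal sketch on synthetic, noise-free data; the function name and series are illustrative, not from the presentation:

```python
def degradation_rate_pct_per_year(months, performance):
    """Ordinary least-squares fit performance = a + b * month; return the
    slope b expressed in percent of the fitted intercept per year."""
    n = len(months)
    mx = sum(months) / n
    my = sum(performance) / n
    b = sum((x - mx) * (y - my) for x, y in zip(months, performance)) \
        / sum((x - mx) ** 2 for x in months)
    a = my - b * mx
    return 12.0 * b / a * 100.0

# Synthetic series: starts at 1.0 and loses 0.5 % per year, no noise
months = list(range(120))
perf = [1.0 - 0.005 / 12.0 * m for m in months]
rate = degradation_rate_pct_per_year(months, perf)
```

On real outdoor data, seasonality and noise make the choice of metric and filtering at least as important as the fit itself.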

  18. Improved analysis method for rotor stability

    SciTech Connect

    Matsumoto, Iwao; Furukawa, Toyoaki; Wani, Masafumi

    1995-12-31

    In the rotary machine field, the lack of accuracy in rotor stability analysis is unacceptable and must be improved. This lack of accuracy is mainly caused by discrepancies between the calculated and measured oil film coefficients. Therefore, the conventional analysis method, in which the Reynolds equation is solved by treating the rotor vibration as a slight whirl and by employing a fixed boundary condition for oil film formation, is flawed. The authors therefore re-evaluate the boundary condition for oil film formation and propose a new method of oil film analysis that considers the real shaft vibration amplitudes. They also compare the oil film coefficients obtained by the proposed method with those obtained by the conventional method.

  19. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  20. Moodtrack : practical methods for assembling emotion-driven music

    E-print Network

    Vercoe, G. Scott

    2006-01-01

    This thesis presents new methods designed for the deconstruction and reassembly of musical works based on a target emotional contour. Film soundtracks provide an ideal testing ground for organizing music around strict ...

  1. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  2. Documenting Elementary Teachers' Sustainability of Instructional Practices: A Mixed Method Case Study

    NASA Astrophysics Data System (ADS)

    Cotner, Bridget A.

    School reform programs focus on making educational changes; however, research on interventions past the funded implementation phase to determine what was sustained is rarely done (Beery, Senter, Cheadle, Greenwald, Pearson, et al., 2005). This study adds to the research on sustainability by determining which instructional practices, if any, of the Teaching SMART professional development program, implemented from 2005--2008 in elementary schools with teachers in grades three through eight, were continued, discontinued, or adapted five years post-implementation (in 2013). Specifically, this study sought to answer the following questions: What do teachers who participated in Teaching SMART and district administrators share about the sustainability of Teaching SMART practices in 2013? What teaching strategies do teachers who participated in the program (2005--2008) use in their science classrooms five years post-implementation (2013)? What perceptions about the roles of females in science, technology, engineering, and mathematics (STEM) do teachers who participated in the program (2005--2008) have five years later (2013)? And what classroom management techniques do the teachers who participated in the program (2005--2008) use five years post-implementation (2013)? A mixed method approach was used to answer these questions. Quantitative teacher survey data from 23 teachers who participated in 2008 and 2013 were analyzed in SAS v. 9.3. Descriptive statistics were reported, and paired t-tests were conducted to determine mean differences by survey factors identified from an exploratory factor analysis, principal axis factoring, and parallel analysis conducted with teacher survey baseline data (2005). Individual teacher change scores (2008 and 2013) for identified factors were computed using the Reliable Change Index statistic.
Qualitative data consisted of interviews with two district administrators and three teachers who responded to the survey in both years (2008 and 2013). Additionally, a classroom observation was conducted with one of the interviewed teachers in 2013. Qualitative analyses were conducted following the constant comparative method and were facilitated by ATLAS.ti v. 6.2, a qualitative analysis software program. Qualitative findings identified themes at the district level that influenced teachers' use of Teaching SMART strategies. All the themes were classified as obstacles to sustainability: economic downturn, turnover of teachers and lack of hiring, new reform policies (such as Race to the Top, the Student Success Act, and the Common Core State Standards), and mandated blocks of time for specific content. Results from the survey data showed no statistically significant difference through time in perceived instructional practices except for a perceived decrease in the use of hands-on instructional activities from 2008 to 2013. Analyses conducted at the individual teacher level found change scores were statistically significant for a few teachers, but overall, teachers reported similarly on the teacher survey at both time points. This sustainability study revealed the lack of facilitating factors to support the continuation of reform practices; however, teachers identified strategies to continue to implement some of the reform practices through time in spite of a number of system-wide obstacles. This sustainability study adds to the literature by documenting obstacles to sustainability in this specific context, which overlap with what is known in the literature. Additionally, the strategies teachers identified to overcome some of the obstacles to implement reform practices and the recommendations by district-level administrators add to the literature on how stakeholders may support sustainability of reform through time.
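    The Reliable Change Index (RCI) used for the individual change scores can be sketched as follows. This is the generic Jacobson-Truax-style formulation with hypothetical inputs, not the study's data: RCI = (x2 − x1) / SE_diff, where SE_diff = √2·SEM and SEM = SD·√(1 − r); |RCI| > 1.96 suggests change beyond measurement error.

```python
import math

def reliable_change_index(score_t1, score_t2, sd_baseline, reliability):
    """RCI = (x2 - x1) / SE_diff, with SE_diff = sqrt(2) * SEM and
    SEM = SD * sqrt(1 - r)."""
    sem = sd_baseline * math.sqrt(1.0 - reliability)
    se_diff = math.sqrt(2.0) * sem
    return (score_t2 - score_t1) / se_diff

# Hypothetical factor scores and scale statistics
rci = reliable_change_index(score_t1=3.2, score_t2=4.1,
                            sd_baseline=0.8, reliability=0.85)
reliably_changed = abs(rci) > 1.96
```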

  3. Laboratory theory and methods for sediment analysis

    USGS Publications Warehouse

    Guy, Harold P.

    1969-01-01

    The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help ensure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.
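    Whichever laboratory method is used (evaporation or filtration), the reported suspended-sediment concentration is computed from the net dry sediment mass and the mass of the water-sediment mixture. A minimal sketch with made-up masses:

```python
def concentration_ppm(dry_sediment_g, mixture_g):
    """Suspended-sediment concentration in parts per million by mass:
    ppm = (mass of dry sediment / mass of water-sediment mixture) * 1e6."""
    return dry_sediment_g / mixture_g * 1e6

# Hypothetical sample: 0.125 g of dry sediment in a 500 g mixture
c = concentration_ppm(0.125, 500.0)
```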

  4. Theory, Method and Practice of Neuroscientific Findings in Science Education

    ERIC Educational Resources Information Center

    Liu, Chia-Ju; Chiang, Wen-Wei

    2014-01-01

    This report provides an overview of neuroscience research that is applicable for science educators. It first offers a brief analysis of empirical studies in educational neuroscience literature, followed by six science concept learning constructs based on the whole brain theory: gaining an understanding of brain function; pattern recognition and…

  5. Cross-Continental Reflections on Evaluation Practice: Methods, Use, and Valuing

    ERIC Educational Resources Information Center

    Kallemeyn, Leanne M.; Hall, Jori; Friche, Nanna; McReynolds, Clifton

    2015-01-01

    The evaluation theory tree typology reflects the following three components of evaluation practice: (a) methods, (b) use, and (c) valuing. The purpose of this study was to explore how evaluation practice is conceived as reflected in articles published in the "American Journal of Evaluation" ("AJE") and "Evaluation," a…

  6. Causal Moderation Analysis Using Propensity Score Methods

    ERIC Educational Resources Information Center

    Dong, Nianbo

    2012-01-01

    This paper builds on previous studies applying propensity score methods to multiple treatment variables in order to examine causal moderator effects. The propensity score methods will be demonstrated in a case study of the causal moderator effect, where the moderators are categorical and continuous variables. Moderation analysis is an…

  7. [Analysis of heart rate variability. Mathematical description and practical application].

    PubMed

    Sammito, S; Böckelmann, I

    2015-03-01

    The analysis of heart rate variability (HRV) has recently become established as a non-invasive measure for estimating demands on the cardiovascular system. HRV reflects the interaction of the sympathetic and parasympathetic nervous systems and allows the influence of the autonomic nervous system on the regulation of the cardiovascular system to be described mathematically. This review explains HRV analysis in terms of time-domain, frequency-domain, and non-linear methods, as well as the range of parameters and the required acquisition time. The necessity and possibilities of artefact correction and advice for the selection of a reasonable acquisition period are discussed, and standard values for selected HRV parameters are presented. PMID:25298003
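    Two standard time-domain parameters covered by such reviews, SDNN and RMSSD, can be computed directly from a series of NN (normal-to-normal) intervals. A minimal sketch on made-up data (the interval values are illustrative only):

```python
import math

def sdnn(nn_ms):
    """Sample standard deviation of all NN intervals, in ms."""
    mean = sum(nn_ms) / len(nn_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in nn_ms) / (len(nn_ms) - 1))

def rmssd(nn_ms):
    """Root mean square of successive NN-interval differences, in ms."""
    diffs = [b - a for a, b in zip(nn_ms, nn_ms[1:])]
    return math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))

nn = [812, 790, 804, 821, 798, 805]  # NN intervals in ms (made up)
```

SDNN reflects overall variability over the recording, while RMSSD emphasizes beat-to-beat (parasympathetically mediated) variation, which is why the required acquisition time differs between parameters.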

  8. Evaluating participatory decision processes: which methods inform reflective practice?

    PubMed

    Kaufman, Sanda; Ozawa, Connie P; Shmueli, Deborah F

    2014-02-01

    Evaluating participatory decision processes serves two key purposes: validating the usefulness of specific interventions for stakeholders, interveners and funders of conflict management processes, and improving practice. However, evaluation design remains challenging, partly because when attempting to serve both purposes we may end up serving neither well. In fact, the better we respond to one, the less we may satisfy the other. Evaluations tend to focus on endogenous factors (e.g., stakeholder selection, BATNAs, mutually beneficial tradeoffs, quality of the intervention, etc.), because we believe that the success of participatory decision processes hinges on them, and they also seem to lend themselves to caeteris paribus statistical comparisons across cases. We argue that context matters too and that contextual differences among specific cases are meaningful enough to undermine conclusions derived solely from comparisons of process-endogenous factors implicitly rooted in the caeteris paribus assumption. We illustrate this argument with an environmental mediation case. We compare data collected about it through surveys geared toward comparability across cases to information elicited through in-depth interviews geared toward case specifics. The surveys, designed by the U.S. Institute of Environmental Conflict Resolution, feed a database of environmental conflicts that can help make the (statistical) case for intervention in environmental conflict management. Our interviews elicit case details - including context - that enable interveners to link context specifics and intervention actions to outcomes. We argue that neither approach can "serve both masters." PMID:24121657

  9. Focus Group Method And Methodology: Current Practice And Recent Debate

    ERIC Educational Resources Information Center

    Parker, Andrew; Tritter, Jonathan

    2006-01-01

    This paper considers the contemporary use of focus groups as a method of data collection within qualitative research settings. The authors draw upon their own experiences of using focus groups in educational and "community" user-group environments in order to provide an overview of recent issues and debates surrounding the deployment of focus…

  10. Practical method of diffusion-welding steel plate in air

    NASA Technical Reports Server (NTRS)

    Holko, K. H.; Moore, T. J.

    1971-01-01

    Method is ideal for critical service requirements where parent metal properties are equaled in notch toughness, stress rupture and other characteristics. Welding technique variations may be used on a variety of materials, such as carbon steels, alloy steels, stainless steels, ceramics, and reactive and refractory materials.

  11. Practice and Progression in Second Language Research Methods

    ERIC Educational Resources Information Center

    Mackey, Alison

    2014-01-01

    Since its inception, the field of second language research has utilized methods from a number of areas, including general linguistics, psychology, education, sociology, anthropology and, recently, neuroscience and corpus linguistics. As the questions and objectives expand, researchers are increasingly pushing methodological boundaries to gain a…

  12. Methods in Educational Research: From Theory to Practice

    ERIC Educational Resources Information Center

    Lodico, Marguerite G.; Spaulding Dean T.; Voegtle, Katherine H.

    2006-01-01

    Written for students, educators, and researchers, "Methods in Educational Research" offers a refreshing introduction to the principles of educational research. Designed for the real world of educational research, the book's approach focuses on the types of problems likely to be encountered in professional experiences. Reflecting the importance of…

  13. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M. (Albuquerque, NM)

    2002-01-01

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
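A minimal numerical sketch of the idea, with hypothetical Gaussian "spectral shapes" standing in for real component spectra: a classical least squares fit against only the calibrated shape is biased by an un-calibrated interferent, while augmenting the shape matrix with the interferent's shape recovers the analyte amount.

```python
import numpy as np

wavelengths = np.linspace(0.0, 1.0, 50)
# Hypothetical pure-component spectral shapes
k1 = np.exp(-((wavelengths - 0.30) ** 2) / 0.01)  # calibrated analyte
k2 = np.exp(-((wavelengths - 0.45) ** 2) / 0.02)  # un-calibrated interferent

# Measured mixture spectrum: 2.0 units of analyte + 0.5 units of interferent
a = 2.0 * k1 + 0.5 * k2

# CLS with only the calibrated shape: the estimate is biased by the interferent
c_only, *_ = np.linalg.lstsq(k1[:, None], a, rcond=None)

# "Hybrid" step: add the interferent's spectral shape to the shape matrix
K = np.column_stack([k1, k2])
c_hybrid, *_ = np.linalg.lstsq(K, a, rcond=None)

print(c_only[0], c_hybrid[0])  # biased estimate vs. recovered ~2.0
```

The same augmentation works for non-chemical shapes (temperature drift, spectrometer drift) mentioned in the patent, provided their spectral signatures are known.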

  14. Maximizing Return From Sound Analysis and Design Practices

    SciTech Connect

    Bramlette, Judith Lynn

    2002-06-01

    With today's tightening budgets, computer applications must provide "true" long-term benefit to the company. Businesses are spending large portions of their budgets "Re-Engineering" old systems to take advantage of "new" technology. But what they are really getting is simply a new interface implementing the same incomplete or poorly defined requirements as before. "True" benefit can only be gained if sound analysis and design practices are used. WHAT data and processes are required of a system is not the same as HOW the system will be implemented within a company. It is the System Analyst's responsibility to understand the difference between these two concepts. The paper discusses some simple techniques to be used during the Analysis and Design phases of projects, as well as the information gathered and recorded in each phase and how it is transformed between these phases. The paper also covers production applications generated using Oracle Designer. Applying these techniques to "real world" problems, the applications will meet the needs of today's business and adapt easily to ever-changing business environments.

  15. Maximizing Return From Sound Analysis and Design Practices

    SciTech Connect

    Bramlette, J.D.

    2002-04-22

    With today's tightening budgets, computer applications must provide "true" long-term benefit to the company. Businesses are spending large portions of their budgets "Re-Engineering" old systems to take advantage of "new" technology. But what they are really getting is simply a new interface implementing the same incomplete or poorly defined requirements as before. "True" benefit can only be gained if sound analysis and design practices are used. WHAT data and processes are required of a system is not the same as HOW the system will be implemented within a company. It is the System Analyst's responsibility to understand the difference between these two concepts. The paper discusses some simple techniques to be used during the Analysis and Design phases of projects, as well as the information gathered and recorded in each phase and how it is transformed between these phases. The paper also covers production applications generated using Oracle Designer. Applying these techniques to "real world" problems, the applications will meet the needs of today's business and adapt easily to ever-changing business environments.

  16. Method of analysis and quality-assurance practices by the U.S. Geological Survey Organic Geochemistry Research Group; determination of geosmin and methylisoborneol in water using solid-phase microextraction and gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Zimmerman, L.R.; Ziegler, A.C.; Thurman, E.M.

    2002-01-01

    A method for the determination of two common odor-causing compounds in water, geosmin and 2-methylisoborneol, was modified and verified by the U.S. Geological Survey's Organic Geochemistry Research Group in Lawrence, Kansas. The optimized method involves the extraction of odor-causing compounds from filtered water samples using a divinylbenzene-carboxen-polydimethylsiloxane cross-link coated solid-phase microextraction (SPME) fiber. Detection of the compounds is accomplished using capillary-column gas chromatography/mass spectrometry (GC/MS). Precision and accuracy were demonstrated using reagent-water, surface-water, and ground-water samples. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 35 nanograms per liter ranged from 60 to 123 percent for geosmin and from 90 to 96 percent for 2-methylisoborneol. Method detection limits were 1.9 nanograms per liter for geosmin and 2.0 nanograms per liter for 2-methylisoborneol in 45-milliliter samples. Typically, concentrations of 30 and 10 nanograms per liter of geosmin and 2-methylisoborneol, respectively, can be detected by the general public. The calibration range for the method is equivalent to concentrations from 5 to 100 nanograms per liter without dilution. The method is valuable for acquiring information about the production and fate of these odor-causing compounds in water.

  17. Understanding Online Teacher Best Practices: A Thematic Analysis to Improve Learning

    ERIC Educational Resources Information Center

    Corry, Michael; Ianacone, Robert; Stella, Julie

    2014-01-01

    The purpose of this study was to examine brick-and-mortar and online teacher best practice themes using thematic analysis and a newly developed theory-based analytic process entitled Synthesized Thematic Analysis Criteria (STAC). The STAC was developed to facilitate the meaningful thematic analysis of research based best practices of K-12…

  18. Comparative analysis of the spatial analysis methods for hotspot identification.

    PubMed

    Yu, Hao; Liu, Pan; Chen, Jun; Wang, Hao

    2014-05-01

    Spatial analysis technique has been introduced as an innovative approach for hazardous road segments identification (HRSI). In this study, the performance of two spatial analysis methods and four conventional methods for HRSI was compared against three quantitative evaluation criteria. The spatial analysis methods considered in this study include the local spatial autocorrelation method and the kernel density estimation (KDE) method. It was found that the empirical Bayesian (EB) method and the KDE method outperformed other HRSI approaches. By transferring the kernel density function into a form that was analogous to the form of the EB function, we further proved that the KDE method can eventually be considered a simplified version of the EB method in which crashes reported at neighboring spatial units are used as the reference population for estimating the EB-adjusted crashes. Theoretically, the KDE method may outperform the EB method in HRSI when the neighboring spatial units provide more useful information on the expected crash frequency than a safety performance function does. PMID:24530515
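The EB adjustment that this comparison rests on can be sketched as follows. The weighting formula is the standard empirical Bayes form for a negative-binomial safety performance function; the numbers are hypothetical.

```python
def eb_adjusted_crashes(observed, spf_expected, overdispersion):
    """Empirical Bayes estimate of expected crash frequency at one site.

    observed        -- crash count recorded at the site
    spf_expected    -- expected crashes from a safety performance function (SPF)
    overdispersion  -- negative-binomial dispersion parameter of the SPF
    """
    weight = 1.0 / (1.0 + spf_expected / overdispersion)
    return weight * spf_expected + (1.0 - weight) * observed

# Hypothetical site: 12 observed crashes, SPF expects 5; estimate shrinks
# the observed count toward the SPF prediction
print(eb_adjusted_crashes(12, 5.0, 2.0))  # ≈ 10.0
```

The KDE analogue, on the paper's argument, amounts to replacing `spf_expected` with a reference value built from crashes reported at neighboring spatial units.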

  19. Protein-protein interactions: methods for detection and analysis.

    PubMed Central

    Phizicky, E M; Fields, S

    1995-01-01

    The function and activity of a protein are often modulated by other proteins with which it interacts. This review is intended as a practical guide to the analysis of such protein-protein interactions. We discuss biochemical methods such as protein affinity chromatography, affinity blotting, coimmunoprecipitation, and cross-linking; molecular biological methods such as protein probing, the two-hybrid system, and phage display: and genetic methods such as the isolation of extragenic suppressors, synthetic mutants, and unlinked noncomplementing mutants. We next describe how binding affinities can be evaluated by techniques including protein affinity chromatography, sedimentation, gel filtration, fluorescence methods, solid-phase sampling of equilibrium solutions, and surface plasmon resonance. Finally, three examples of well-characterized domains involved in multiple protein-protein interactions are examined. The emphasis of the discussion is on variations in the approaches, concerns in evaluating the results, and advantages and disadvantages of the techniques. PMID:7708014

  20. "Movement Doesn't Lie": Teachers' Practice Choreutical Analysis

    ERIC Educational Resources Information Center

    Pastore, Serafina; Pentassuglia, Monica

    2015-01-01

    Identifying and describing teaching practice is not an easy task. Current educational research aims at explaining teachers' work by focusing on the concept of practice. Teachers' practical knowledge is a sensitive and tacit knowledge, produced and conveyed by the body. In this perspective, the teachers' work can be considered as an expressive…

  1. Bioanalytical methods for food contaminant analysis.

    PubMed

    Van Emon, Jeanette M

    2010-01-01

    Foods are complex mixtures of lipids, carbohydrates, proteins, vitamins, organic compounds, and other naturally occurring substances. Sometimes added to this mixture are residues of pesticides, veterinary and human drugs, microbial toxins, preservatives, contaminants from food processing and packaging, and other residues. This milieu of compounds can pose difficulties in the analysis of food contaminants. There is an expanding need for rapid and cost-effective residue methods for difficult food matrixes to safeguard our food supply. Bioanalytical methods are established for many food contaminants such as mycotoxins and are the method of choice for many food allergens. Bioanalytical methods are often more cost-effective and sensitive than instrumental procedures. Recent developments in bioanalytical methods may provide more applications for their use in food analysis. PMID:21313795

  2. Method and apparatus for ceramic analysis

    DOEpatents

    Jankowiak, Ryszard J. (Ames, IA); Schilling, Chris (Ames, IA); Small, Gerald J. (Ames, IA); Tomasik, Piotr (Cracow, PL)

    2003-04-01

    The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method provides for analyzing the density of a ceramic by exciting a component on a surface/subsurface of the ceramic through exposure to excitation energy. The method may further include the step of obtaining a measurement of the energy emitted from the component. The method may additionally include comparing the measurement of the emitted energy from the component with a predetermined reference measurement so as to obtain a density for said ceramic.

  3. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    SciTech Connect

    Ronald Laurids Boring

    2010-11-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  4. Practical Methods for Locating Abandoned Wells in Populated Areas

    SciTech Connect

    Veloski, G.A.; Hammack, R.W.; Lynn, R.J.

    2007-09-01

    An estimated 12 million wells have been drilled during the 150 years of oil and gas production in the United States. Many old oil and gas fields are now populated areas where the presence of improperly plugged wells may constitute a hazard to residents. Natural gas emissions from wells have forced people from their houses and businesses and have caused explosions that injured or killed people and destroyed property. To mitigate this hazard, wells must be located and properly plugged, a task made more difficult by the presence of houses, businesses, and associated utilities. This paper describes well finding methods conducted by the National Energy Technology Laboratory (NETL) that were effective at two small towns in Wyoming and in a suburb of Pittsburgh, Pennsylvania.

  5. An Overview of Longitudinal Data Analysis Methods for Neurological Research

    PubMed Central

    Locascio, Joseph J.; Atri, Alireza

    2011-01-01

    The purpose of this article is to provide a concise, broad and readily accessible overview of longitudinal data analysis methods, aimed to be a practical guide for clinical investigators in neurology. In general, we advise that older, traditional methods, including (1) simple regression of the dependent variable on a time measure, (2) analyzing a single summary subject level number that indexes changes for each subject and (3) a general linear model approach with a fixed-subject effect, should be reserved for quick, simple or preliminary analyses. We advocate the general use of mixed-random and fixed-effect regression models for analyses of most longitudinal clinical studies. Under restrictive situations or to provide validation, we recommend: (1) repeated-measure analysis of covariance (ANCOVA), (2) ANCOVA for two time points, (3) generalized estimating equations and (4) latent growth curve/structural equation models. PMID:22203825
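The second "traditional" approach listed above, a single summary number indexing change for each subject, can be sketched as follows (data and function names are hypothetical). The mixed-effects models the article advocates would instead estimate a shared slope plus per-subject random effects, e.g. with the `mixedlm` formula API in `statsmodels`.

```python
import numpy as np

def subject_slopes(times, values, subjects):
    """Per-subject least-squares slope: one summary number of change per subject."""
    slopes = {}
    for s in sorted(set(subjects)):
        t = np.array([ti for ti, si in zip(times, subjects) if si == s], float)
        y = np.array([yi for yi, si in zip(values, subjects) if si == s], float)
        slope, _intercept = np.polyfit(t, y, 1)  # fit y = slope*t + intercept
        slopes[s] = slope
    return slopes

# Hypothetical data: two subjects, three visits each
times    = [0, 1, 2, 0, 1, 2]
values   = [10.0, 11.0, 12.0, 20.0, 19.0, 18.0]
subjects = ["a", "a", "a", "b", "b", "b"]
slopes = subject_slopes(times, values, subjects)
print(slopes)  # slopes ≈ {'a': 1.0, 'b': -1.0}
```

The slope dictionary can then feed a simple between-group comparison, which is why the article reserves this approach for quick or preliminary analyses.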

  6. Simplified Analysis Methods for Primary Load Designs at Elevated Temperatures

    SciTech Connect

    Carter, Peter; Jetter, Robert I; Sham, Sam

    2011-01-01

    The use of simplified (reference stress) analysis methods is discussed and illustrated for primary load high temperature design. Elastic methods are the basis of the ASME Section III, Subsection NH primary load design procedure. There are practical drawbacks with this approach, particularly for complex geometries and temperature gradients. The paper describes an approach which addresses these difficulties through the use of temperature-dependent elastic-perfectly plastic analysis. Correction factors are defined to address difficulties traditionally associated with discontinuity stresses, inelastic strain concentrations and multiaxiality. A procedure is identified to provide insight into how this approach could be implemented but clearly there is additional work to be done to define and clarify the procedural steps to bring it to the point where it could be adapted into code language.

  7. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2004-03-23

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.

  8. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution.

    PubMed

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R

    2015-10-01

    This work intended to explain the challenges of the fingerprints based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in the sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns; but it is not able to suggest specific sources. Not all positive matrix factorization (PMF) determined sources are distinguishable due to the variability of source fingerprints. However, they constitute useful suggestions for inputs for a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). PMID:26208321
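At its core, a (non-Bayesian) chemical mass balance is a least-squares fit of source fingerprints to the receptor profile; the Bayesian version described above additionally folds in measurement error and fingerprint variability. A toy sketch with hypothetical fingerprints:

```python
import numpy as np

# Hypothetical source fingerprints: columns are sources, rows are PAH compounds
F = np.array([[0.6, 0.1],
              [0.3, 0.2],
              [0.1, 0.7]])

# Receptor (sediment) profile built from true contributions of 70% and 30%
receptor = F @ np.array([0.7, 0.3])

# Ordinary CMB: least-squares estimate of the source contributions
contrib, *_ = np.linalg.lstsq(F, receptor, rcond=None)
print(contrib)  # ≈ [0.7, 0.3]
```

With noisy measurements and uncertain fingerprints this plain fit degrades, which is the difficulty that motivates the Bayesian CMB treatment in the paper.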

  9. Advanced analysis methods in particle physics

    SciTech Connect

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider, have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  10. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2010-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  11. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2009-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  12. A practical method of estimating standard error of age in the fission track dating method

    USGS Publications Warehouse

    Johnson, N.M.; McGee, V.E.; Naeser, C.W.

    1979-01-01

    A first-order approximation formula for the propagation of error in the fission-track age equation is given by P_A = C[P_s^2 + P_i^2 + P_φ^2 - 2rP_sP_i]^(1/2), where P_A, P_s, P_i, and P_φ are the percentage errors of age, of spontaneous track density, of induced track density, and of neutron dose, respectively, and C is a constant. The correlation, r, between spontaneous and induced track densities is a crucial element in the error analysis, acting generally to improve the standard error of age. In addition, the correlation parameter r is instrumental in specifying the level of neutron dose, a controlled variable, which will minimize the standard error of age. The results from the approximation equation agree closely with the results from an independent statistical model for the propagation of errors in the fission-track dating method. © 1979.
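The formula translates directly into code. A sketch, with hypothetical percentage errors and the constant C taken as 1:

```python
import math

def percent_age_error(ps, pi, pphi, r, c=1.0):
    """First-order percentage error of a fission-track age.

    ps, pi, pphi -- percentage errors of spontaneous track density, induced
                    track density, and neutron dose; r is the correlation
                    between spontaneous and induced densities; c a constant.
    """
    return c * math.sqrt(ps**2 + pi**2 + pphi**2 - 2.0 * r * ps * pi)

# Uncorrelated track densities: sqrt(100 + 100 + 25) = 15 percent
print(percent_age_error(10.0, 10.0, 5.0, r=0.0))  # 15.0
print(percent_age_error(10.0, 10.0, 5.0, r=0.8))  # ≈ 8.06
```

The second call illustrates the paper's point: a positive correlation between spontaneous and induced track densities improves the standard error of age.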

  13. Drosophila hematopoiesis: markers and methods for molecular genetic analysis

    PubMed Central

    Evans, Cory J.; Liu, Ting; Banerjee, Utpal

    2014-01-01

    Analyses of the Drosophila hematopoietic system are becoming more and more prevalent as developmental and functional parallels with vertebrate blood cells become more evident. Investigative work on the fly blood system has, out of necessity, led to the identification of new molecular markers for blood cell types and lineages and to the refinement of useful molecular genetic tools and analytical methods. This review briefly describes the Drosophila hematopoietic system at different developmental stages, summarizes the major useful cell markers and tools for each stage, and provides basic protocols for practical analysis of circulating blood cells and of the lymph gland, the larval hematopoietic organ. PMID:24613936

  14. Chromatographic methods for analysis of triazine herbicides.

    PubMed

    Abbas, Hana Hassan; Elbashir, Abdalla A; Aboul-Enein, Hassan Y

    2015-01-01

    Gas chromatography (GC) and high-performance liquid chromatography (HPLC) coupled to different detectors, and in combination with different sample extraction methods, are most widely used for analysis of triazine herbicides in different environmental samples. Nowadays, many variations and modifications of extraction and sample preparation methods such as solid-phase microextraction (SPME), hollow fiber-liquid phase microextraction (HF-LPME), stir bar sorptive extraction (SBSE), headspace-solid phase microextraction (HS-SPME), dispersive liquid-liquid microextraction (DLLME), dispersive liquid-liquid microextraction based on solidification of floating organic droplet (DLLME-SFO), ultrasound-assisted emulsification microextraction (USAEME), and others have been introduced and developed to obtain sensitive and accurate methods for the analysis of these hazardous compounds. In this review, several analytical properties such as linearity, sensitivity, repeatability, and accuracy for each developed method are discussed, and excellent results were obtained for most of the developed methods combined with GC and HPLC techniques for the analysis of triazine herbicides. This review gives an overview of recent publications of the application of GC and HPLC for analysis of triazine herbicides residues in various samples. PMID:25849823

  15. Measuring solar reflectance - Part II: Review of practical methods

    SciTech Connect

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul

    2010-09-15

    A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R_g,0 can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R_g,0 to within 0.01 for surface slopes up to 5:12 [23°], and to within 0.02 for surface slopes up to 12:12 [45°]. An alternative pyranometer method minimizes shadow errors and can be used to measure R_g,0 of a surface as small as 1 m in diameter. The accuracy with which it can measure R_g,0 is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R*_g,0, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R*_g,0 matches R_g,0 to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R*_g,0 by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R*_g,0 to within about 0.01. (author)
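The spectrophotometer route reduces to an irradiance-weighted average of spectral reflectance. A sketch with a hypothetical three-band discretization (real use would integrate the measured reflectance spectrum against the AM1GH spectral irradiance):

```python
def weighted_solar_reflectance(reflectance, irradiance):
    """Solar reflectance as the irradiance-weighted average of spectral
    reflectance, in the spirit of the paper's R*_g,0.

    reflectance -- spectral reflectance sampled at a set of wavelengths
    irradiance  -- solar spectral irradiance at the same wavelengths
    """
    weighted = sum(r * e for r, e in zip(reflectance, irradiance))
    return weighted / sum(irradiance)

# Hypothetical 3-band surface: the middle band carries most of the irradiance,
# so the result is pulled toward that band's reflectance
print(weighted_solar_reflectance([0.9, 0.5, 0.3], [1.0, 3.0, 1.0]))  # ≈ 0.54
```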

  16. Measuring solar reflectance Part II: Review of practical methods

    SciTech Connect

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul

    2010-05-14

    A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R_g,0 can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R_g,0 to within 0.01 for surface slopes up to 5:12 [23°], and to within 0.02 for surface slopes up to 12:12 [45°]. An alternative pyranometer method minimizes shadow errors and can be used to measure R_g,0 of a surface as small as 1 m in diameter. The accuracy with which it can measure R_g,0 is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R*_g,0, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R*_g,0 matches R_g,0 to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R*_g,0 by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R*_g,0 to within about 0.01.

  17. Seasonal dimensions to rural poverty: analysis and practical implications.

    PubMed

    Chambers, R; Longhurst, R; Bradley, D; Feachem, R

    1979-08-01

    This paper reports on a conference on seasonal dimensions to rural poverty. Presentations included specialised papers on climate, energy balance, vital events, individual tropical diseases, nutrition, rural economy, and women, and also multi-disciplinary case studies of tropical rural areas from the Gambia, Nigeria, Mali, Kenya, Tanzania, India and Bangladesh. While care is needed in generalising, the evidence suggested that for agriculturalists in the tropics, the worst times of year are the wet seasons, typically marked by a concurrence of food shortages, high demands for agricultural work, high exposure to infection especially diarrhoeas, malaria, and skin diseases, loss of body weight, low birth weights, high neonatal mortality, poor child care, malnutrition, sickness and indebtedness. In this season, poor and weak people, especially women, are vulnerable to deprivation and to becoming poorer and weaker. Seasonal analysis is easily left out in rural planning. When applied, it suggests priorities in research, and indicates practical policy measures for health, for the family, for agriculture, and for government planning and administration. PMID:537128

  18. Iterative methods for design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Belegundu, A. D.; Yoon, B. G.

    1989-01-01

    A numerical method is presented for design sensitivity analysis, using an iterative-method reanalysis of the structure generated by a small perturbation in the design variable; a forward-difference scheme is then employed to obtain the approximate sensitivity. Algorithms are developed for displacement and stress sensitivity, as well as for eigenvalue and eigenvector sensitivity, and the iterative schemes are modified so that the coefficient matrices are constant and therefore decomposed only once.
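The scheme's final step, a forward-difference sensitivity from a perturbed reanalysis, can be sketched as follows. The function and values are hypothetical; in the paper the "response" would come from an iterative structural reanalysis rather than a closed-form expression.

```python
def forward_difference_sensitivity(response, x, h=1e-6):
    """Approximate d(response)/dx by re-analysis at the perturbed design x + h."""
    return (response(x + h) - response(x)) / h

# Hypothetical response: tip displacement u(x) = 1/x for a stiffness-like design
# variable x; the exact sensitivity at x = 2 is -1/x^2 = -0.25
u = lambda x: 1.0 / x
print(forward_difference_sensitivity(u, 2.0))  # ≈ -0.25
```

The forward difference is first-order accurate in h, which is why the step size must balance truncation error against round-off in the reanalysis.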

  19. Design analysis, robust methods, and stress classification

    SciTech Connect

    Bees, W.J.

    1993-01-01

    This special edition publication volume is comprised of papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25--29, 1993 in Denver, Colorado. The papers were prepared for presentations in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

  20. Thermal Analysis of AC Contactor Using Thermal Network Finite Difference Analysis Method

    NASA Astrophysics Data System (ADS)

    Niu, Chunping; Chen, Degui; Li, Xingwen; Geng, Yingsan

    To predict the thermal behavior of switchgear quickly, the Thermal Network Finite Difference Analysis (TNFDA) method is adopted for the thermal analysis of an AC contactor in this paper. The thermal network model is built from nodes, thermal resistors, and heat generators, and is solved using the finite difference method (FDM). The main circuit and the control system are connected by a network of thermal resistors, which resolves the problem of multi-source interaction in the application of TNFDA. The temperature of the conducting wires is calculated according to the heat transfer process and the fundamental equations of thermal conduction, which provides a way to handle the boundary conditions when applying TNFDA. Comparison between TNFDA results and measurements shows the feasibility and practicality of the method.
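    A minimal thermal-network sketch in the same spirit (not the paper's contactor model; all node values are invented): two nodes joined by thermal resistances, with a heat source at the first node, marched to steady state with an explicit finite difference in time.

```python
# Toy thermal network: source node --R12-- housing node --R2a-- ambient.
# Explicit time-stepping of dT/dt = (net heat flow) / C at each node.

T_amb = 20.0          # ambient temperature, deg C
P1 = 10.0             # heat generated at node 1, W
R12, R2a = 2.0, 1.0   # thermal resistances, K/W
C1, C2 = 50.0, 100.0  # thermal capacitances, J/K
dt = 1.0              # time step, s (well below both thermal time constants)

T1, T2 = T_amb, T_amb
for _ in range(100000):
    q12 = (T1 - T2) / R12      # heat flow node 1 -> node 2, W
    q2a = (T2 - T_amb) / R2a   # heat flow node 2 -> ambient, W
    T1 += dt / C1 * (P1 - q12)
    T2 += dt / C2 * (q12 - q2a)

# Steady state by hand: T2 = T_amb + P1*R2a = 30, T1 = T2 + P1*R12 = 50
print(round(T1, 2), round(T2, 2))
```

At steady state all generated heat flows through the resistor chain, so the node temperatures can be checked against the series-resistance formula in the comment.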

  1. Contraceptive Method Initiation: Using the Centers for Disease Control and Prevention Selected Practice Guidelines.

    PubMed

    Wu, Wan-Ju; Edelman, Alison

    2015-12-01

    The US Selected Practice Recommendations is a companion document to the Medical Eligibility Criteria for Contraceptive Use that focuses on how providers can use contraceptive methods most effectively as well as problem-solve common issues that may arise. These guidelines serve to help clinicians provide contraception safely as well as to decrease barriers that prevent or delay a woman from obtaining a desired method. This article summarizes the Selected Practice Recommendations on timing of contraceptive initiation, examinations, and tests needed prior to starting a method and any necessary follow-up. PMID:26598307

  2. [Application of modern thermal methods in pharmaceutical analysis].

    PubMed

    Harmathy, Z; Konkoly Thege, I

    1994-01-01

    The main applications of modern thermal methods in pharmaceutical analysis are reviewed. While these methods are widely employed in the pharmaceutical industry, they are only recommended--but not prescribed--in the general part of the VIIth Pharmacopoea Hungarica and, consequently, this field is practically unknown to pharmacists. The aim of this paper is to call attention to the advantages of modern thermal methods (DTA, TG, DSC) over the classical ones. They permit quick and accurate measurement of the melting point, determination of purity and of the eutectic point, clarification of the conditions under which polymorphic modifications develop, and monitoring of the loss of the water of crystallization. PMID:8023686

  3. Diatoms in forensic analysis: A practical approach in rats.

    PubMed

    Badu, Isaac K; Girela, Eloy; Beltrán, Cristina M; Ruz-Caracuel, Ignacio; Jimena, Ignacio

    2015-07-01

    A diagnosis of drowning is a challenge in legal medicine, as there is generally a lack of pathognomonic findings indicative of drowning. Diatom analysis has been considered very supportive for a diagnosis of drowning, although the test is still controversial for some investigators. We assessed the association of diatoms with drowning in the peripheral tissues of drowned rats and the effect of the drowning medium on the diatom yield. A modified acid digestion method was optimised for diatom recovery in water and rat tissues. Eighteen adult Wistar rats were employed for the study, subdivided into six groups of three rats. Groups 1, 3 and 5 were drowned in seawater, lake water, or river water respectively, while groups 2, 4 and 6 were controls (immersed after death in seawater, lake water or river water respectively). Water samples were taken from the sea, lake and river in Málaga and Córdoba (Spain) for the purposes of diatomological mapping and drowning of the rats. Diatoms were successfully recovered from all water samples and matched with tissues of the drowned rats. There were significant differences in diatom numbers between control and test samples for all the tissues studied, as well as within test samples. Histological investigations conducted on lung samples obtained from drowned rats provided complementary and valuable information. This study demonstrates the feasibility of the diatom test as a reliable method for the diagnosis of drowning, especially if adequate precautions are taken to avoid contamination, and if interpretation of the analysis is performed in light of other complementary investigations. PMID:24966336

  4. Best Practices in Teaching Statistics and Research Methods in the Behavioral Sciences [with CD-ROM

    ERIC Educational Resources Information Center

    Dunn, Dana S., Ed.; Smith, Randolph A., Ed.; Beins, Barney, Ed.

    2007-01-01

    This book provides a showcase for "best practices" in teaching statistics and research methods in two- and four-year colleges and universities. A helpful resource for teaching introductory, intermediate, and advanced statistics and/or methods, the book features coverage of: (1) ways to integrate these courses; (2) how to promote ethical conduct;…

  5. Proceedings of the First International Workshop on Activity Theory Based Practical Methods for IT-Design

    E-print Network

    Bertelsen, Olav W.

    ATIT 2004: Proceedings of the First International Workshop on Activity Theory Based Practical Methods for IT-Design. The aim of the workshop was to discuss and refine methods for IT design based on activity theory, and thereby stimulate the evolution of such methods for practitioners. Contributions included a short paper reflecting on each method, its basis in activity theory, its history, and its use.

  6. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M. (Philadelphia, TN); Ng, Esmond G. (Concord, TN)

    1998-01-01

    Methods and apparatus are presented for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.
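    The steps above can be illustrated on synthetic data. This is a hedged sketch under assumptions: logistic-map series stand in for the acquired data, and a map-derivative Lyapunov-exponent estimate stands in for the patent's nonlinear measures.

```python
# Compare nonlinear-measure trends between two "states" of a process:
# a chaotic logistic-map regime and a periodic one (synthetic data).
import math

def logistic_series(r, x0=0.4, n=4000):
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

def lyapunov(window, r):
    # Largest Lyapunov exponent estimate: mean of ln|f'(x)| with
    # f(x) = r x (1 - x), so f'(x) = r (1 - 2x).
    return sum(math.log(abs(r * (1.0 - 2.0 * x))) for x in window) / len(window)

WIN = 500
series_a = logistic_series(3.9)   # "state A": chaotic regime
series_b = logistic_series(3.5)   # "state B": periodic regime
trend_a = [lyapunov(series_a[i:i + WIN], 3.9) for i in range(0, len(series_a), WIN)]
trend_b = [lyapunov(series_b[i:i + WIN], 3.5) for i in range(0, len(series_b), WIN)]
mean_a = sum(trend_a) / len(trend_a)
mean_b = sum(trend_b) / len(trend_b)
# Comparison step: the two states separate clearly in the measure.
print(mean_a > 0.0, mean_b < 0.0)
```

The windowed measure gives the "time serial trend"; comparing the trends of the two records is the final detection step of the pipeline.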

  7. Methods for Chemical Analysis of Fresh Waters.

    ERIC Educational Resources Information Center

    Golterman, H. L.

    This manual, one of a series prepared for the guidance of research workers conducting studies as part of the International Biological Programme, contains recommended methods for the analysis of fresh water. The techniques are grouped in the following major sections: Sample Taking and Storage; Conductivity, pH, Oxidation-Reduction Potential,…

  8. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

    Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data are disclosed. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.

  9. Systems and methods for sample analysis

    SciTech Connect

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-10-20

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  10. Systems and methods for sample analysis

    DOEpatents

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-01-13

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  11. Analysis of nonstandard and home-made explosives and post-blast residues in forensic practice

    NASA Astrophysics Data System (ADS)

    Kotrlý, Marek; Turková, Ivana

    2014-05-01

    Nonstandard and home-made explosives may constitute a considerable threat as well as a potential material for terrorist activities. Mobile analytical devices, particularly Raman and FTIR spectrometers, are used for the initial detection. Various sorts of phlegmatizers (moderants) to decrease the sensitivity of explosives were tested; some kinds of low-viscosity lubricants yielded very good results. If the character of the substance allows it, phlegmatized samples are taken in an amount of approximately 0.3 g for laboratory analysis. Various separation methods and methods of concentrating samples from post-blast scenes were tested. A wide range of methods is used for the laboratory analysis. XRD techniques, capable of direct phase identification of crystalline substances, even in mixtures, have proved highly effective in practice for inorganic and organic phases. SEM-EDS/WDS methods are standardly employed for the inorganic phase. In analysing post-blast residues, techniques allowing analysis at the level of individual particles, rather than the overall composition of a mixed sample, are especially important.

  12. Power System Transient Stability Analysis through a Homotopy Analysis Method

    SciTech Connect

    Wang, Shaobu; Du, Pengwei; Zhou, Ning

    2014-04-01

    As an important function of energy management systems (EMSs), online contingency analysis plays an important role in providing power system security warnings of instability. At present, N-1 contingency analysis still relies on time-consuming numerical integration. To save computational cost, this paper proposes a quasi-analytical method to evaluate transient stability through the frequency sensitivities of time-domain periodic solutions with respect to initial values. First, dynamic systems described by classical models are modified into damping-free systems whose solutions are either periodic or expanded (non-convergent). Second, because the sensitivities experience sharp changes when periodic solutions vanish and turn into expanded solutions, transient stability is assessed using the sensitivity. Third, homotopy analysis is introduced to extract frequency information and evaluate the sensitivities from initial values alone, so that time-consuming numerical integration is avoided. Finally, a simple case is presented to demonstrate application of the proposed method, and simulation results show that the proposed method is promising.
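    The key observation, that sensitivities change sharply where periodic solutions vanish, can be illustrated on the undamped pendulum, whose period is T(theta0) = 4 K(sin(theta0/2)) with K the complete elliptic integral of the first kind, computed here via the arithmetic-geometric mean. This is an illustrative sketch, not the paper's homotopy algorithm.

```python
# Period sensitivity of the undamped pendulum theta'' = -sin(theta):
# the finite-difference sensitivity dT/dtheta0 explodes as theta0
# approaches the separatrix at pi, where periodic solutions vanish.
import math

def agm(a, b, tol=1e-14):
    # Arithmetic-geometric mean; K(k) = pi / (2 * agm(1, sqrt(1 - k^2))).
    while abs(a - b) > tol:
        a, b = (a + b) / 2.0, math.sqrt(a * b)
    return a

def period(theta0):
    k = math.sin(theta0 / 2.0)
    K = math.pi / (2.0 * agm(1.0, math.sqrt(1.0 - k * k)))
    return 4.0 * K

def sensitivity(theta0, d=1e-6):
    # Forward-difference sensitivity of the period to the initial angle.
    return (period(theta0 + d) - period(theta0)) / d

# Mild sensitivity far from the separatrix; sharp growth near it.
print(sensitivity(0.5), sensitivity(3.1))
```

Near theta0 = pi the period diverges logarithmically, so the sensitivity blows up; a threshold on such a sensitivity is the kind of stability indicator the abstract describes.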

  13. Multiple predictor smoothing methods for sensitivity analysis.

    SciTech Connect

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
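    A toy version of the smoothing idea, assuming a hand-rolled tricube local-linear (LOESS-style) smoother and an invented two-input model: each input is scored by the variance its smooth explains. The input x1 acts nonlinearly and non-monotonically, so a linear-regression-based sensitivity measure would miss it.

```python
# Nonparametric sensitivity screening with a minimal LOESS-style smoother.
import math, random

def loess(xs, ys, x0, frac=0.3):
    # Local linear fit at x0 using tricube weights on the nearest points.
    n = len(xs)
    k = max(2, int(frac * n))
    idx = sorted(range(n), key=lambda i: abs(xs[i] - x0))[:k]
    dmax = max(abs(xs[i] - x0) for i in idx) or 1.0
    w = {i: (1 - (abs(xs[i] - x0) / dmax) ** 3) ** 3 for i in idx}
    sw = sum(w.values())
    mx = sum(w[i] * xs[i] for i in idx) / sw
    my = sum(w[i] * ys[i] for i in idx) / sw
    sxx = sum(w[i] * (xs[i] - mx) ** 2 for i in idx)
    sxy = sum(w[i] * (xs[i] - mx) * (ys[i] - my) for i in idx)
    b = sxy / sxx if sxx else 0.0
    return my + b * (x0 - mx)

def explained_variance(xs, ys):
    # 1 - residual/total variance of the one-predictor smooth.
    fit = [loess(xs, ys, x) for x in xs]
    my = sum(ys) / len(ys)
    tot = sum((y - my) ** 2 for y in ys)
    res = sum((y - f) ** 2 for y, f in zip(ys, fit))
    return 1.0 - res / tot

random.seed(1)
x1 = [random.random() for _ in range(200)]
x2 = [random.random() for _ in range(200)]
# x1 enters through a symmetric hump (near-zero linear correlation).
y = [math.sin(math.pi * a) + 0.1 * b for a, b in zip(x1, x2)]
s1, s2 = explained_variance(x1, y), explained_variance(x2, y)
print(s1 > s2)
```

The dominant, nonlinear input x1 gets a much higher score than the weak linear input x2, mirroring the abstract's point that smoothing-based measures recover nonlinear input-output relationships that linear or rank regression can miss.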

  14. Digital Forensics Analysis of Spectral Estimation Methods

    E-print Network

    Mataracioglu, Tolga

    2011-01-01

    Steganography is the art and science of writing hidden messages in such a way that no one apart from the intended recipient knows of the existence of the message. In today's world, it is widely used to secure information. In this paper, the traditional spectral estimation methods are introduced, and the performance of each method is examined by comparing all of the spectral estimation methods. Finally, drawing on those performance analyses, the pros and cons of each spectral estimation method are briefly summarized. A steganography demo is also given: information is hidden in a sound signal, and the information (i.e., the true frequency of the information signal) is then recovered from the sound by means of the spectral estimation methods.
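    The demo scenario can be sketched with the simplest spectral estimator, the periodogram (squared magnitude of the DFT). The sample rate, hidden tone frequency, and noise level below are invented for illustration.

```python
# Recover the frequency of a tone hidden in a noisy signal
# with a brute-force DFT periodogram (O(n^2), fine for small n).
import cmath, math, random

fs, n = 500.0, 500      # sample rate (Hz) and record length
f_hidden = 137.0        # frequency of the embedded "message" tone
random.seed(7)
signal = [math.sin(2 * math.pi * f_hidden * t / fs) +
          2.0 * random.gauss(0.0, 1.0) for t in range(n)]

def periodogram(x):
    m = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / m)
                    for t in range(m))) ** 2 / m for k in range(m // 2)]

p = periodogram(signal)
peak_hz = max(range(len(p)), key=lambda k: p[k]) * fs / n
print(peak_hz)
```

Even at this low signal-to-noise ratio the tone's periodogram peak (about n/4 for a unit sine on a bin) stands well above the noise floor, so the hidden frequency is read off directly from the peak bin.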

  15. Cask crush pad analysis using detailed and simplified analysis methods

    SciTech Connect

    Uldrich, E.D.; Hawkes, B.D.

    1997-12-31

    A crush pad has been designed and analyzed to absorb the kinetic energy of a hypothetically dropped spent nuclear fuel shipping cask into a 44-ft. deep cask unloading pool at the Fluorinel and Storage Facility (FAST). This facility, located at the Idaho Chemical Processing Plant (ICPP) at the Idaho National Engineering and Environmental Laboratory (INEEL), is a US Department of Energy site. The basis for this study is an analysis by Uldrich and Hawkes. The purpose of this analysis was to evaluate various hypothetical cask drop orientations to ensure that the crush pad design was adequate and the cask deceleration at impact was less than 100 g. It is demonstrated herein that a large spent fuel shipping cask, when dropped onto a foam crush pad, can be analyzed by either hand methods or by sophisticated dynamic finite element analysis using computer codes such as ABAQUS. Results from the two methods are compared to evaluate accuracy of the simplified hand analysis approach.

  16. Common Goals for the Science and Practice of Behavior Analysis: A Response to Critchfield

    ERIC Educational Resources Information Center

    Schneider, Susan M.

    2012-01-01

    In his scholarly and thoughtful article, "Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis," Critchfield (2011) discussed the science-practice frictions to be expected in any professional organization that attempts to combine these interests. He suggested that the Association for Behavior Analysis

  17. Situational Analysis: Centerless Systems and Human Service Practices

    ERIC Educational Resources Information Center

    Newbury, Janet

    2011-01-01

    Bronfenbrenner's ecological model is a conceptual framework that continues to contribute to human service practices. In the current article, the author describes the possibilities for practice made intelligible by drawing from this framework. She then explores White's "Web of Praxis" model as an important extension of this approach, and proceeds…

  18. Researching "Practiced Language Policies": Insights from Conversation Analysis

    ERIC Educational Resources Information Center

    Bonacina-Pugh, Florence

    2012-01-01

    In language policy research, "policy" has traditionally been conceptualised as a notion separate from that of "practice". In fact, language practices were usually analysed with a view to evaluate whether a policy is being implemented or resisted to. Recently, however, Spolsky in ("Language policy". Cambridge University press, Cambridge, 2004;…

  19. Graphical methods for the sensitivity analysis in discriminant analysis

    SciTech Connect

    Kim, Youngil; Anderson-Cook, Christine M.; Dae-Heung, Jang

    2015-09-30

    Similar to regression, many measures to detect influential data points in discriminant analysis have been developed. Many follow principles similar to the diagnostic measures used in linear regression, adapted to the context of discriminant analysis. Here we focus on the impact on the predicted classification posterior probability when a data point is omitted. The new method is intuitive and easily interpretable compared to existing methods. We also propose a graphical display to show the individual movement of the posterior probability of other data points when a specific data point is omitted. This enables the summaries to capture the overall pattern of the change.
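    The omission idea can be sketched with an assumed one-dimensional two-class linear discriminant with equal priors (not the authors' setup): omit one training point, refit, and record the shift in a query point's posterior probability. An outlying point moves the posterior far more than a typical one.

```python
# Leave-one-out influence on a classification posterior (toy 1-D LDA).
import math

class0 = [0.1, 0.2, 0.3, 0.25, 0.15, 3.0]   # last point is an outlier
class1 = [2.0, 2.1, 1.9, 2.2, 2.05, 1.95]

def posterior0(c0, c1, x):
    # Posterior P(class 0 | x) with equal priors and pooled variance.
    m0, m1 = sum(c0) / len(c0), sum(c1) / len(c1)
    var = (sum((v - m0) ** 2 for v in c0) + sum((v - m1) ** 2 for v in c1)) \
          / (len(c0) + len(c1) - 2)
    l0 = math.exp(-(x - m0) ** 2 / (2 * var))
    l1 = math.exp(-(x - m1) ** 2 / (2 * var))
    return l0 / (l0 + l1)

x_query = 1.0
base = posterior0(class0, class1, x_query)
# Influence of each class-0 point: posterior shift at the query when omitted.
influence = [abs(posterior0(class0[:i] + class0[i + 1:], class1, x_query) - base)
             for i in range(len(class0))]
print(influence.index(max(influence)))   # index of the most influential point
```

Plotting such per-point shifts for every omission is the essence of the proposed graphical display.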

  20. Graphical methods for the sensitivity analysis in discriminant analysis

    DOE PAGESBeta

    Kim, Youngil; Anderson-Cook, Christine M.; Dae-Heung, Jang

    2015-09-30

    Similar to regression, many measures to detect influential data points in discriminant analysis have been developed. Many follow principles similar to the diagnostic measures used in linear regression, adapted to the context of discriminant analysis. Here we focus on the impact on the predicted classification posterior probability when a data point is omitted. The new method is intuitive and easily interpretable compared to existing methods. We also propose a graphical display to show the individual movement of the posterior probability of other data points when a specific data point is omitted. This enables the summaries to capture the overall pattern of the change.

  1. Method of analysis and quality-assurance practices for determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry at the U.S. Geological Survey California District Organic Chemistry Laboratory, 1996-99

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Baker, Lucian M.; Kuivila, Kathryn M.

    2000-01-01

    A method of analysis and quality-assurance practices were developed to study the fate and transport of pesticides in the San Francisco Bay-Estuary by the U.S. Geological Survey. Water samples were filtered to remove suspended-particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide and the pesticides were eluted with three cartridge volumes of hexane:diethyl ether (1:1) solution. The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for pesticides ranged from 0.002 to 0.025 microgram per liter for 1-liter samples. Recoveries ranged from 44 to 140 percent for 25 pesticides in samples of organic-free reagent water and Sacramento-San Joaquin Delta and Suisun Bay water fortified at 0.05 and 0.50 microgram per liter. The estimated holding time for pesticides after extraction on C-8 solid-phase extraction cartridges ranged from 10 to 257 days.

  2. A Mixed Methods Content Analysis of the Research Literature in Science Education

    ERIC Educational Resources Information Center

    Schram, Asta B.

    2014-01-01

    In recent years, more and more researchers in science education have been turning to the practice of combining qualitative and quantitative methods in the same study. This approach of using mixed methods creates possibilities to study the various issues that science educators encounter in more depth. In this content analysis, I evaluated 18…

  3. Analysis and Application of Potential Energy Smoothing and Search Methods for Global Optimization

    E-print Network

    Ponder, Jay

    Potential energy smoothing and search (PSS) methods for global optimization are analyzed and applied at various levels of smoothing. PSS methods should serve as useful tools for global energy optimization on a variety of difficult problems of practical interest.

  4. Text analysis devices, articles of manufacture, and text analysis methods

    SciTech Connect

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

    2013-05-28

    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.

  5. A Comparative Analysis of Ethnomedicinal Practices for Treating Gastrointestinal Disorders Used by Communities Living in Three National Parks (Korea)

    PubMed Central

    Kim, Hyun; Song, Mi-Jang; Brian, Heldenbrand; Choi, Kyoungho

    2014-01-01

    The purpose of this study is to comparatively analyze the ethnomedicinal practices for gastrointestinal disorders within communities in Jirisan National Park, Gayasan National Park, and Hallasan National Park of Korea. Data were collected through participant observations and in-depth interviews with semistructured questionnaires. Comparative analysis was carried out using the informant consensus factor, fidelity level, and internetwork analysis. A total of 490 ethnomedicinal practices recorded from the communities were classified into 110 families, 176 genera, and 220 species that included plants, animals, fungi, and algae. The informant consensus factor values in the disorder categories were highest for enteritis and gastralgia (1.0), followed by indigestion (0.94), constipation (0.93), and abdominal pain and gastroenteric trouble (0.92). In terms of fidelity levels, 71 plant species showed fidelity levels of 100%. In the internetwork analysis between disorders and all medicinal species, the species are grouped in the center by the four categories of indigestion, diarrhea, abdominal pain, and gastroenteric trouble, respectively. The comparative analysis methods used in this study will contribute to preserving orally transmitted ethnomedicinal knowledge. Among the methods of analysis, the use of internetwork analysis provides informative internetwork maps between gastrointestinal disorders and medicinal species. PMID:25202330
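    The two consensus measures used in the paper have standard closed forms, shown here with hypothetical counts: the informant consensus factor ICF = (Nur - Nt)/(Nur - 1) for Nur use-reports naming Nt taxa, and the fidelity level FL = 100 * Ip/Iu for a single species.

```python
# Standard ethnobotanical indices with invented example counts.

def icf(use_reports, taxa):
    # Informant consensus factor: 1.0 means every report names the same taxon.
    return (use_reports - taxa) / (use_reports - 1)

def fidelity_level(reports_for_ailment, total_reports_for_species):
    # FL = Ip / Iu * 100: share of a species' reports tied to one ailment.
    return 100.0 * reports_for_ailment / total_reports_for_species

# Hypothetical: 41 use-reports citing only 2 species for one disorder
# gives near-total consensus, like the paper's high-ICF categories.
print(icf(41, 2))
# A species cited 30 times, all for the same ailment, has FL = 100%.
print(fidelity_level(30, 30))
```

High ICF flags disorder categories where informants agree on remedies; FL = 100% flags single-purpose species, the case reported for 71 plants in the study.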

  6. The practice patterns of second trimester fetal ultrasonography: A questionnaire survey and an analysis of checklists

    PubMed Central

    Park, Hyun Soo; Hong, Joon Seok; Seol, Hyun-Joo; Hwang, Han Sung; Kim, Kunwoo; Ko, Hyun Sun; Kwak, Dong-Wook; Oh, Soo-young; Kim, Moon Young; Kim, Sa Jin

    2015-01-01

    Objective To analyze practice patterns and checklists of second trimester ultrasonography, and to investigate management plans when soft markers are detected, among Korean Society of Ultrasound in Obstetrics and Gynecology (KSUOG) members. Methods An internet-based self-administered questionnaire survey was designed, and KSUOG members were invited to the survey. Checklists for second trimester ultrasonography were also requested. The questionnaire survey asked about general practice patterns of second trimester ultrasonography and management schemes for soft markers. In the checklist analysis, the number of items was counted and compared with those recommended by other medical societies. Results A total of 101 members responded. Eighty-seven percent routinely recommended second trimester fetal anatomic surveillance. Most (91.1%) performed it between 20+0 and 23+6 weeks of gestation. Written informed consent was obtained by 15.8% of respondents. Nearly 60% recommended genetic counseling when multiple soft markers and/or advanced maternal age were found. Similar tendencies were found in the management of individual soft markers. However, practice patterns were very diverse and sometimes conflicting. Forty-eight checklists were analyzed with respect to the number and content of items. The median item number was 46.5 (range, 17 to 109). Of 49 items of checklists recommended by the International Society of Ultrasound in Obstetrics and Gynecology and/or the American Congress of Obstetricians and Gynecologists, 14 items (28.6%) were found in less than 50% of the checklists analyzed in this study. Conclusion Although general practice patterns were similar among KSUOG members, some were conflicting, and the checklists showed a very wide spectrum; there is a need for standardization of both the practice patterns and the checklists of second trimester ultrasonography. PMID:26623407

  7. Effectiveness of a Motivation and Practical Skills Development Methods on the Oral Hygiene of Orphans Children in Kaunas, Lithuania

    PubMed Central

    Narbutaite, Julija

    2015-01-01

    ABSTRACT Objectives The aim of this study was to evaluate the effect of motivation and practical skills development methods on the oral hygiene of orphans. Material and Methods Sixty-eight orphans aged between 7 and 17 years from two orphanages in Kaunas were divided into two groups: a practical application group and a motivation group. Children were clinically examined by determining their oral hygiene status using the Silness-Löe plaque index. A questionnaire was used to estimate oral hygiene knowledge and practices at baseline and after 3 months. Statistical analysis included the chi-square (χ2) test, Fisher's exact test, Student's t-test, the nonparametric Mann-Whitney test, Spearman's rho correlation coefficient and the Kappa coefficient. Results All children had plaque on at least one tooth in both groups: motivation 1.14 (SD 0.51), practical application 1.08 (SD 0.4) (P = 0.58). Girls in both groups showed significantly better oral hygiene than boys (P < 0.001). After the 3-month educational program, oral hygiene status improved significantly in both groups: 0.4 (SD 0.35) (P < 0.001). Significantly better oral hygiene was determined in the practical application group, 0.19 (SD 0.27), in comparison with the motivation group, 0.55 (SD 0.32) (P < 0.001). Comparison of the first and second questionnaire surveys showed a statistically significant decline in the use of soft drinks in both groups (P = 0.004). Conclusions Educational programs are effective in improving oral hygiene, especially when they are based on practical skills training. PMID:26539284

  8. Degradation of learned skills: Effectiveness of practice methods on visual approach and landing skill retention

    NASA Technical Reports Server (NTRS)

    Sitterley, T. E.; Zaitzeff, L. P.; Berge, W. A.

    1972-01-01

    Flight control and procedural task skill degradation, and the effectiveness of retraining methods were evaluated for a simulated space vehicle approach and landing under instrument and visual flight conditions. Fifteen experienced pilots were trained and then tested after 4 months either without the benefits of practice or with static rehearsal, dynamic rehearsal or with dynamic warmup practice. Performance on both the flight control and procedure tasks degraded significantly after 4 months. The rehearsal methods effectively countered procedure task skill degradation, while dynamic rehearsal or a combination of static rehearsal and dynamic warmup practice was required for the flight control tasks. The quality of the retraining methods appeared to be primarily dependent on the efficiency of visual cue reinforcement.

  9. A practical algorithm for static analysis of parallel programs

    SciTech Connect

    McDowell, C.E. )

    1989-06-01

    One approach to analyzing the behavior of a concurrent program requires determining the reachable program states. A program state consists of a set of task states, the values of shared variables used for synchronization, and local variables that derive their values directly from synchronization operations. However, the number of reachable states rises exponentially with the number of tasks and becomes intractable for many concurrent programs. A variation of this approach merges a set of related states into a single virtual state. Using this approach, the analysis of concurrent programs becomes feasible as the number of virtual states is often orders of magnitude less than the number of reachable states. This paper presents a method for determining the virtual states that describe the reachable program states, and the reduction in the number of states is analyzed. The algorithms given have been implemented in a static program analyzer for multitasking Fortran, and the results obtained are discussed.
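    The reachable-state enumeration underlying this approach can be sketched for two toy tasks synchronizing on one lock (the paper's virtual-state merging is not reproduced here); the search also checks mutual exclusion on the critical section.

```python
# Breadth-first enumeration of reachable program states for two tasks,
# each running: acquire(lock); critical section; release(lock).
from collections import deque

# Task program counter: 0 = before acquire, 1 = in critical section, 2 = done.
def successors(state):
    pcs, lock = state
    for i in (0, 1):
        if pcs[i] == 0 and not lock:     # acquire is enabled
            yield (tuple(1 if j == i else pcs[j] for j in (0, 1)), True)
        elif pcs[i] == 1:                # release
            yield (tuple(2 if j == i else pcs[j] for j in (0, 1)), False)

initial = ((0, 0), False)
seen, queue = {initial}, deque([initial])
while queue:
    s = queue.popleft()
    for nxt in successors(s):
        if nxt not in seen:
            seen.add(nxt)
            queue.append(nxt)

# Safety check: no state has both tasks in the critical section.
mutual_exclusion = all(pcs != (1, 1) for pcs, _ in seen)
print(len(seen), mutual_exclusion)
```

With real local variables the concrete state count explodes combinatorially, which is exactly what motivates merging related states into virtual states in the paper.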

  10. Practical post-calibration uncertainty analysis: Yucca Mountain, Nevada, USA

    NASA Astrophysics Data System (ADS)

    James, S. C.; Doherty, J.; Eddebbarh, A.

    2009-12-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is “calibrated.” Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the US’s proposed site for disposal of high-level radioactive waste. Both of these types of uncertainty analysis are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction’s uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This paper applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. Furthermore, Monte Carlo simulations confirm that hydrogeologic units thought to be flow barriers have probability distributions skewed toward lower permeabilities.
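    The linear analysis reduces to a quadratic form: with prediction sensitivities j and posterior parameter covariance C, the first-order prediction variance is j C jᵀ, and per-parameter contributions are the row sums of that form. All numbers below are hypothetical.

```python
# First-order (linear) predictive uncertainty: variance = j C j^T.

# Assumed sensitivities of a prediction (e.g., specific discharge)
# to two parameters, such as log-permeabilities of two units.
j = [2.0, 0.5]
# Assumed posterior parameter covariance, with slight correlation.
C = [[0.10, 0.02],
     [0.02, 0.40]]

var = sum(j[a] * C[a][b] * j[b] for a in range(2) for b in range(2))
# Per-parameter contribution to the prediction variance (row sums).
contrib = [sum(j[a] * C[a][b] * j[b] for b in range(2)) for a in range(2)]
print(var, contrib)
```

The contributions identify which parameter dominates the prediction's uncertainty; recomputing C with a candidate observation added and comparing variances is the basis of the data-worth calculation the abstract mentions.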

  11. Advanced Analysis Methods in High Energy Physics

    SciTech Connect

    Pushpalatha C. Bhat

    2001-10-03

    During the coming decade, high energy physics experiments at the Fermilab Tevatron and around the globe will use very sophisticated equipment to record unprecedented amounts of data in the hope of making major discoveries that may unravel some of Nature's deepest mysteries. The discovery of the Higgs boson and signals of new physics may be around the corner. The use of advanced analysis techniques will be crucial in achieving these goals. The author discusses some of the novel methods of analysis that could prove to be particularly valuable for finding evidence of any new physics, for improving precision measurements and for exploring parameter spaces of theoretical models.

  12. New Regularization Method for EXAFS Analysis

    SciTech Connect

    Reich, Tatiana Ye.; Reich, Tobias; Korshunov, Maxim E.; Antonova, Tatiana V.; Ageev, Alexander L.; Moll, Henry

    2007-02-02

    As an alternative to the analysis of EXAFS spectra by conventional shell fitting, the Tikhonov regularization method has been proposed. An improved algorithm that utilizes a priori information about the sample has been developed and applied to the analysis of U L3-edge spectra of soddyite, (UO2)2SiO4·2H2O, and of U(VI) sorbed onto kaolinite. The partial radial distribution functions g1(UU), g2(USi), and g3(UO) of soddyite agree with crystallographic values and previous EXAFS results.

  13. Characterization of polarization-independent phase modulation method for practical plug and play quantum cryptography

    NASA Astrophysics Data System (ADS)

    Kwon, Osung; Lee, Min-Soo; Woo, Min Ki; Park, Byung Kwon; Kim, Il Young; Kim, Yong-Su; Han, Sang-Wook; Moon, Sung

    2015-12-01

    We characterized a polarization-independent phase modulation method, called double phase modulation, for a practical plug and play quantum key distribution (QKD) system. Following investigation of theoretical backgrounds, we applied the method to the practical QKD system and characterized the performance through comparing single phase modulation (SPM) and double phase modulation. Consequently, we obtained repeatable and accurate phase modulation confirmed by high visibility single photon interference even for input signals with arbitrary polarization. Further, the results show that only 80% of the bias voltage required in the case of single phase modulation is needed to obtain the target amount of phase modulation.

  14. Spectroscopic Chemical Analysis Methods and Apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F.; Reid, Ray D.

    2012-01-01

    This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and the ideal wavelengths and sources needed for this analysis. It employs deep ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three achieved goals for this innovation are to reduce the size (under 20 L), reduce the weight [under 100 lb (45 kg)], and reduce the power consumption (under 100 W). This method can be used in a microscope or macroscope to provide measurement of Raman and/or native fluorescence emission spectra either by point-by-point measurement, or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations, the method is to produce incoherent radiation, while in other implementations it produces laser radiation. In some variations, this object is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. This instrument irradiates a sample with deep UV radiation, and then uses an improved filter for separating wavelengths to be detected. This provides a multi-stage analysis of the sample. To avoid the difficulties related to producing deep UV semiconductor sources, a pumping approach has been developed that uses ballistic electron beam injection directly into the active region of a wide bandgap semiconductor material.

  15. Groundwater Flooding: Practical Methods for the Estimation of Extreme Groundwater Levels

    NASA Astrophysics Data System (ADS)

    Bichler, A.; Fürst, J.

    2012-04-01

    Floods are in general recognized as a consequence of high flows in surface waters. Only recently has awareness been raised of potential flooding and flood risk from groundwater sources. In particular, information about high groundwater levels is relevant where basements of buildings or vulnerable installations might be affected. Also, the EU Floods Directive addresses the potential flood risk arising from groundwater sources. While the statistical analysis of extreme values is widely used in surface hydrology, there are currently only a few studies that consider the specific properties of extreme groundwater levels. The main objective of this investigation is the application of at-site and regional frequency analysis in the field of hydrogeology. Extreme groundwater levels with a given return period (e.g. 100 years) are estimated with the method of L-moments and their uncertainty is quantified. Moreover, software tools are developed in order to make extreme value analysis a feasible technique for practical application by the Austrian Hydrological Service. These tools address the demand for user-friendly handling as well as the integration and updating of existing and readily derivable data. Lastly, the estimates are regionalized, so that information on extreme groundwater levels and the accuracy of estimation can be retrieved at any point of the investigation area. The analysis is applied in four shallow, porous aquifers in Austria, with a total of more than 1000 time series records of groundwater levels, covering 10-50 years of observation. Firstly, local frequency analysis (LFA) is performed on a series of annual maximum peaks. The analysis of annual maxima allows for easy handling, but comes with the drawback of requiring 20-30 years of observation as a minimum sample size. Due to anthropogenic impacts, natural changes of the hydrologic system, etc., this requirement cannot be met in numerous cases. Hence, the peaks over threshold (POT) approach and regional frequency analysis (RFA) are implemented. Thus, a sufficiently large sample size can be derived from shorter time series either by selecting exceedances over a variable threshold (POT) or by accounting for data from related observations (RFA, "trading space for time"). The results show that at-site frequency analysis is applicable to 63% of the records, for which the peaks over threshold method yields more accurate estimates compared to the annual maxima. Regional frequency analysis can be applied to 51% of the samples and results in an even further reduction of uncertainty. In the four case studies, 12-45% of the investigated area is susceptible to groundwater flood risk, i.e. an event with a return period of 100 years is likely to reach the terrain surface. As one of the outcomes, maps of depth to the groundwater table make it possible to identify areas prone to groundwater flooding or suitable for development at a glance.
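A minimal sketch of at-site frequency analysis with L-moments: fit a Gumbel (EV1) distribution to annual maxima via the first two sample L-moments and read off the T-year return level. The data are invented, and the study may well have used other distributions from the L-moment family; this only illustrates the mechanics:

```python
import math

def sample_l_moments(data):
    """First two sample L-moments (lambda1, lambda2) via probability-
    weighted moments of the ascending order statistics."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum((i / (n - 1)) * x[i] for i in range(n)) / n  # i = 0..n-1
    return b0, 2 * b1 - b0

def gumbel_quantile(data, T):
    """T-year return level from a Gumbel (EV1) fit by L-moments."""
    lam1, lam2 = sample_l_moments(data)
    alpha = lam2 / math.log(2)        # Gumbel scale
    xi = lam1 - 0.5772 * alpha        # location (Euler-Mascheroni const.)
    return xi - alpha * math.log(-math.log(1 - 1 / T))

# Hypothetical annual maximum groundwater levels (m above datum)
maxima = [3.1, 2.8, 3.5, 2.9, 3.3, 3.0, 3.7, 2.7, 3.2, 3.4]
print(gumbel_quantile(maxima, 100))  # estimated 100-year level
```

The same machinery extends to POT samples (with a generalized Pareto distribution) and to RFA (pooling dimensionless L-moment ratios across sites).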

  16. Analysis of sourcing & procurement practices : a cross industry framework

    E-print Network

    Koliousis, Ioannis G

    2006-01-01

    This thesis presents and analyzes the various practices in the functional area of Sourcing and Procurement. The 21 firms that are studied operate in one of the following industries: Aerospace, Apparel/ Footwear, Automotive, ...

  17. Analysis of structural perturbations in systems via cost decomposition methods

    NASA Technical Reports Server (NTRS)

    Skelton, R. E.

    1983-01-01

    It has long been common practice to analyze linear dynamic systems by decomposing the total response in terms of individual contributions which are easier to analyze. Examples of this philosophy include the expansion of transfer functions using: (1) the superposition principle, (2) residue theory and partial fraction expansions, (3) Markov parameters, Hankel matrices, and (4) regular and singular perturbations. This paper summarizes a new and different kind of expansion designed to decompose the norm of the response vector rather than the response vector itself. This is referred to as "cost-decomposition" of the system. The notable advantages of this type of decomposition are: (a) easy application to multi-input, multi-output systems, (b) natural compatibility with Linear Quadratic Gaussian Theory, (c) applicability to the analysis of more general types of structural perturbations involving inputs, outputs, states, and parameters. Property (c) makes the method suitable for problems in model reduction, measurement/actuator selections, and sensitivity analysis.

  18. PRACTICAL SENSITIVITY AND UNCERTAINTY ANALYSIS TECHNIQUES APPLIED TO AGRICULTURAL SYSTEMS MODELS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We present a practical evaluation framework for analysis of two complex, process-based agricultural system models, WEPP and RZWQM. The evaluation framework combines sensitivity analysis and the uncertainty analysis techniques of first order error analysis (FOA) and Monte Carlo simulation with Latin ...

  19. Comparison of analysis methods for airway quantification

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.

    2012-03-01

    Diseased airways have been known for several years as a possible contributing factor to airflow limitation in Chronic Obstructive Pulmonary Disease (COPD). Quantification of disease severity through the evaluation of airway dimensions - wall thickness and lumen diameter - has gained increased attention, thanks to the availability of multi-slice computed tomography (CT). Novel approaches have focused on automated methods of measurement as a faster and more objective means than the visual assessment routinely employed in the clinic. Since the Full-Width Half-Maximum (FWHM) method of airway measurement was introduced two decades ago [1], several new techniques for quantifying airways have been detailed in the literature, but no approach has truly become a standard for such analysis. Our own research group has presented two alternative approaches for determining airway dimensions, one involving a minimum path and the other active contours [2, 3]. With an increasing number of techniques dedicated to the same goal, we decided to take a step back and analyze the differences between these methods. We consequently put to the test our two methods of analysis and the FWHM approach. We first measured a set of 5 airways from a phantom of known dimensions. Then we compared measurements from the three methods to those of two independent readers, performed on 35 airways in 5 patients. We elaborate on the differences of each approach and suggest conclusions on which one could be considered the best.
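The FWHM approach mentioned above measures the width of a profile at half its peak intensity. A minimal sketch on a synthetic 1-D profile with a single interior peak (real airway measurement works on 2-D CT cross-sections and must handle noise, which this ignores):

```python
import numpy as np

def fwhm(profile, spacing=1.0):
    """Full width at half maximum of a single-peaked 1-D profile.

    Assumes the peak is interior to the array; locates the two
    half-maximum crossings by linear interpolation.
    """
    y = np.asarray(profile, dtype=float)
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    i, j = above[0], above[-1]
    # interpolate the crossings just outside the above-half region
    left = i - (y[i] - half) / (y[i] - y[i - 1])
    right = j + (y[j] - half) / (y[j] - y[j + 1])
    return (right - left) * spacing

# Symmetric triangular "wall" profile: peak 100 at index 5, base width 10
profile = [0, 20, 40, 60, 80, 100, 80, 60, 40, 20, 0]
print(fwhm(profile))  # → 5.0 for this triangle
```

The known limitation driving the alternatives discussed in the paper is that, on CT, partial-volume blurring biases FWHM widths for small airways.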

  20. TECHNICAL REVIEW A practical guide to methods of parentage analysis

    E-print Network

    Jones, Adam

    ' (Jeffreys et al. 1985). This multi-locus DNA fingerprinting approach was rapidly adopted by avian the spread of DNA fingerprinting applications outside of birds and mammals. Several years after the development of DNA fingerprinting, the discovery of microsatellite markers (Tautz 1989), also known as simple

  1. A method for communication analysis in prosthodontics.

    PubMed

    Sondell, K; Söderfeldt, B; Palmqvist, S

    1998-02-01

    Particularly in prosthodontics, in which the issues of esthetic preferences and possibilities are abundant, improved knowledge about dentist-patient communication during clinical encounters is important. Because previous studies on communication used different methods and patient materials, the results are difficult to evaluate. There is, therefore, a need for methodologic development. One method that makes it possible to quantitatively describe different interaction behaviors during clinical encounters is the Roter Method of Interaction Process Analysis (RIAS). Since the method was developed in the USA for use in the medical context, a translation of the method into Swedish and a modification of the categories for use in prosthodontics were necessary. The revised manual was used to code 10 audio recordings of dentist-patient encounters at a specialist clinic for prosthodontics. No major alterations of the RIAS manual were made during the translation and modification. The study shows that it is possible to distinguish patterns of communication in audio-recorded dentist-patient encounters. The method also made the identification of different interaction profiles possible. These profiles distinguished well among the audio-recorded encounters. The coding procedures were tested for intra-rater reliability, which was found to be 97% for utterance classification and lambda = 0.76 for categorization definition. It was concluded that the revised RIAS method is applicable in communication studies in prosthodontics. PMID:9537735

  2. Digital dream analysis: a revised method.

    PubMed

    Bulkeley, Kelly

    2014-10-01

    This article demonstrates the use of a digital word search method designed to provide greater accuracy, objectivity, and speed in the study of dreams. A revised template of 40 word search categories, built into the website of the Sleep and Dream Database (SDDb), is applied to four "classic" sets of dreams: The male and female "Norm" dreams of Hall and Van de Castle (1966), the "Engine Man" dreams discussed by Hobson (1988), and the "Barb Sanders Baseline 250" dreams examined by Domhoff (2003). A word search analysis of these original dream reports shows that a digital approach can accurately identify many of the same distinctive patterns of content found by previous investigators using much more laborious and time-consuming methods. The results of this study emphasize the compatibility of word search technologies with traditional approaches to dream content analysis. PMID:25286125
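A word search of the kind described reduces to matching category term lists against each report's vocabulary. The categories and reports below are invented miniatures, not the SDDb's actual 40-category template:

```python
import re
from collections import Counter

# Hypothetical miniature of a word-search template; the SDDb template
# has 40 categories, these three are made up for illustration.
CATEGORIES = {
    "emotion": {"afraid", "happy", "angry", "sad"},
    "movement": {"run", "running", "fly", "flying", "fall", "falling"},
    "water": {"river", "ocean", "rain", "swimming"},
}

def category_counts(reports):
    """Count, per category, the reports containing at least one term."""
    counts = Counter()
    for text in reports:
        words = set(re.findall(r"[a-z']+", text.lower()))
        for cat, terms in CATEGORIES.items():
            if words & terms:
                counts[cat] += 1
    return counts

reports = [
    "I was running from something and felt afraid.",
    "Flying over the ocean in the rain.",
    "A quiet dream about my old house.",
]
print(category_counts(reports))
```

Counting reports rather than raw word hits mirrors the usual content-analysis convention of scoring presence per dream, which keeps long reports from dominating.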

  3. Colored particle tracking method for mixing analysis of chaotic micromixers

    NASA Astrophysics Data System (ADS)

    Kang, Tae Gon; Kwon, Tai Hun

    2004-07-01

    Micromixers have a variety of applications in chemical and biological processes, becoming an important component in microfluidic systems. The present work aims at understanding detailed mixing behaviour of micromixers by developing a numerical analysis scheme, which ultimately facilitates efficient micromixer design. A systematic numerical method has been developed, enabling visualization of detailed mixing patterns and quantification of the mixing performance in chaotic micromixers. The overall numerical scheme is named 'colored particle tracking method' (CPTM), consisting of three steps: (i) a flow analysis to obtain a periodic velocity field of a periodic mixing protocol by the Galerkin/least-squares (GLS) method; (ii) a particle tracking step, particles being labeled by a specific color at the inlet according to fluid species, to obtain a distribution of colored particles at the end of the final period; (iii) a quantification of the degree of mixing from the obtained particle distribution. For the last step we propose a new mixing measure based on the information entropy. The CPTM has successfully been applied to three examples of micromixers with patterned grooves to evaluate their mixing performance both qualitatively and quantitatively. The CPTM seems promising as a practically attractive numerical scheme for mixing analysis of chaotic micromixers.
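An information-entropy mixing measure of the kind CPTM proposes can be sketched by computing the Shannon entropy of the color composition in each cell and normalizing by its maximum (ln 2 for two species). This is a generic illustration, not the paper's exact definition:

```python
import math

def mixing_entropy(cells):
    """Mean normalized Shannon entropy of species fractions over cells:
    1.0 = perfectly mixed, 0.0 = fully segregated (two species)."""
    total = 0.0
    for counts in cells:
        n = sum(counts)
        h = 0.0
        for c in counts:
            if c > 0:
                p = c / n
                h -= p * math.log(p)
        total += h / math.log(2)   # two species -> max entropy is ln 2
    return total / len(cells)

segregated = [(10, 0), (0, 10)]   # each cell holds a single color
mixed = [(5, 5), (5, 5)]          # equal counts everywhere
print(mixing_entropy(segregated))  # → 0.0
print(mixing_entropy(mixed))       # → 1.0
```

In a CPTM-style workflow the cell counts would come from binning the tracked colored particles at the end of each mixing period.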

  4. Finite element methods for integrated aerodynamic heating analysis

    NASA Technical Reports Server (NTRS)

    Peraire, J.

    1990-01-01

    Over the past few years finite element based procedures for the solution of high speed viscous compressible flows were developed. The objective of this research is to build upon the finite element concepts which have already been demonstrated and to develop these ideas to produce a method which is applicable to the solution of large scale practical problems. The problems of interest range from three dimensional full vehicle Euler simulations to local analysis of three-dimensional viscous laminar flow. Transient Euler flow simulations involving moving bodies are also to be included. An important feature of the research is to be the coupling of the flow solution methods with thermal/structural modeling techniques to provide an integrated fluid/thermal/structural modeling capability. The progress made towards achieving these goals during the first twelve month period of the research is presented.

  5. Finite Volume Methods: Foundation and Analysis

    NASA Technical Reports Server (NTRS)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semi-conductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the obtention of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
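As a concrete instance of the building blocks reviewed (a monotone flux in a first-order scheme), here is a sketch of a finite volume update for Burgers' equation using the Lax-Friedrichs flux with periodic boundaries; it exhibits the exact local conservation the article highlights:

```python
import numpy as np

def lax_friedrichs_step(u, dx, dt):
    """One first-order finite-volume step for Burgers' equation
    u_t + (u^2/2)_x = 0 with the monotone Lax-Friedrichs flux
    and periodic boundaries."""
    f = 0.5 * u**2
    up = np.roll(u, -1)                 # right-neighbor cell averages
    fp = np.roll(f, -1)
    a = dx / dt                         # LF numerical viscosity coefficient
    flux = 0.5 * (f + fp) - 0.5 * a * (up - u)    # F_{i+1/2}
    return u - dt / dx * (flux - np.roll(flux, 1))

n = 100
x = np.linspace(0.0, 1.0, n, endpoint=False)
u = np.sin(2 * np.pi * x) + 1.5         # smooth initial data
dx, dt = 1.0 / n, 0.002                 # CFL: max|u| * dt/dx = 0.5 < 1

for _ in range(100):
    u = lax_friedrichs_step(u, dx, dt)

# Conservation: the mean cell average is preserved up to roundoff,
# because the flux differences telescope over the periodic domain.
print(u.mean())
```

Swapping in a Godunov or E-flux, or adding a limited reconstruction, upgrades this skeleton toward the higher-order non-oscillatory schemes the article reviews.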

  6. Probabilistic methods in fire-risk analysis

    SciTech Connect

    Brandyberry, M.D.

    1989-01-01

    The first part of this work outlines a method for assessing the frequency of ignition of a consumer product in a building and shows how the method would be used in an example scenario utilizing upholstered furniture as the product and radiant auxiliary heating devices (electric heaters, wood stoves) as the ignition source. Deterministic thermal models of the heat-transport processes are coupled with parameter uncertainty analysis of the models and with a probabilistic analysis of the events involved in a typical scenario. This leads to a distribution for the frequency of ignition for the product. In the second part, fire-risk analysis as currently used in nuclear plants is outlined, along with a discussion of the relevant uncertainties. The use of the computer code COMPBRN in the fire-growth analysis is discussed, along with the use of response-surface methodology to quantify uncertainties in the code's use. Generalized response surfaces are developed for temperature versus time for a cable tray, as well as a surface for the hot gas layer temperature and depth for a room of arbitrary geometry within a typical nuclear power plant compartment. These surfaces are then used to simulate the cable tray damage time in a compartment fire experiment.

  7. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)

    2013-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  8. A Simplified Mesh Deformation Method Using Commercial Structural Analysis Software

    NASA Technical Reports Server (NTRS)

    Hsu, Su-Yuen; Chang, Chau-Lyan; Samareh, Jamshid

    2004-01-01

    Mesh deformation in response to redefined or moving aerodynamic surface geometries is a frequently encountered task in many applications. Most existing methods are either mathematically too complex or computationally too expensive for use in practical design and optimization. We propose a simplified mesh deformation method based on linear elastic finite element analyses that can be easily implemented by using commercially available structural analysis software. Using a prescribed displacement at the mesh boundaries, a simple structural analysis is constructed based on a spatially varying Young's modulus to move the entire mesh in accordance with the surface geometry redefinitions. A variety of surface movements, such as translation, rotation, or incremental surface reshaping that often takes place in an optimization procedure, may be handled by the present method. We describe the numerical formulation and implementation using the NASTRAN software in this paper. The use of commercial software bypasses tedious reimplementation and takes advantage of the computational efficiency offered by the vendor. A two-dimensional airfoil mesh and a three-dimensional aircraft mesh were used as test cases to demonstrate the effectiveness of the proposed method. Euler and Navier-Stokes calculations were performed for the deformed two-dimensional meshes.
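The idea of a spatially varying stiffness can be sketched in one dimension: treat each mesh interval as a spring whose stiffness grows as the cell shrinks, prescribe the boundary displacement, and solve the resulting linear system for the interior nodes. This is a schematic analogue of the method, not the NASTRAN implementation; the power-law stiffness is an assumption for illustration:

```python
import numpy as np

def deform_1d_mesh(x, disp_left, disp_right, power=2.0):
    """Move interior nodes of a 1-D mesh in response to prescribed
    boundary displacements by solving a chain-of-springs system whose
    stiffness grows where cells are small, protecting them from inversion."""
    n = len(x)
    h = np.diff(x)
    k = 1.0 / h**power                  # stiffer springs in small cells
    # assemble tridiagonal system K d = 0 with Dirichlet displacement BCs
    A = np.zeros((n, n))
    b = np.zeros(n)
    for i in range(1, n - 1):
        A[i, i - 1] = -k[i - 1]
        A[i, i] = k[i - 1] + k[i]
        A[i, i + 1] = -k[i]
    A[0, 0] = A[-1, -1] = 1.0
    b[0], b[-1] = disp_left, disp_right
    d = np.linalg.solve(A, b)           # nodal displacements
    return x + d

x = np.linspace(0.0, 1.0, 6)
x_new = deform_1d_mesh(x, disp_left=0.0, disp_right=0.1)
print(x_new)  # nodes shift progressively toward the moved right end
```

In two or three dimensions the same structure appears, with the spatially varying Young's modulus playing the role of the per-interval stiffness and a commercial FE solver assembling and solving the system.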

  9. Review of Computational Stirling Analysis Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal to electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in its current designs could be better understood. However, they are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of Stirling loss understanding may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra Hi-Fi technique, is presented in detail.

  10. Optical methods for the analysis of dermatopharmacokinetics

    NASA Astrophysics Data System (ADS)

    Lademann, Juergen; Weigmann, Hans-Juergen; von Pelchrzim, R.; Sterry, Wolfram

    2002-07-01

    The method of tape stripping in combination with spectroscopic measurements is a simple and noninvasive method for the analysis of dermatopharmacokinetics of cosmetic products and topically applied drugs. The absorbance at 430 nm was used for the characterization of the amount of corneocytes on the tape strips. It was compared to the increase of weight of the tapes after removing them from the skin surface. The penetration profiles of two UV filter substances used in sunscreens were determined. The combined method of tape stripping and spectroscopic measurements can be also used for the investigation of the dermatopharmacokinetics of topically applied drugs passing through the skin. Differences in the penetration profiles of the steroid compound clobetasol, applied in the same concentration in different formulations on the skin are presented.

  11. Stirling Analysis Comparison of Commercial Versus High-Order Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako

    2005-01-01

    Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model, although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.

  12. Stirling Analysis Comparison of Commercial vs. High-Order Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako

    2007-01-01

    Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model, although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.

  13. What Informs Practice and What Is Valued in Corporate Instructional Design? A Mixed Methods Study

    ERIC Educational Resources Information Center

    Thompson-Sellers, Ingrid N.

    2012-01-01

    This study used a two-phased explanatory mixed-methods design to explore in-depth what factors are perceived by Instructional Design and Technology (IDT) professionals as impacting instructional design practice, how these factors are valued in the field, and what differences in perspectives exist between IDT managers and non-managers. For phase 1…

  14. Investigation of acute gastroenteritis in general practice — relevance of newer laboratory methods

    PubMed Central

    Rousseau, S. A.

    1983-01-01

    Over a nine-month period, all patients suffering from acute gastroenteritis, with diarrhoea as an essential component, who presented to a group practice in southern England were investigated using conventional laboratory methods, and also newer techniques of electron microscopy and search for species of Campylobacter. Rotavirus and Campylobacter were the two most commonly encountered pathogens. PMID:6887127

  15. Texas A & M University-Central Texas CPSY 557 Methods and Practices in Counseling and Psychology

    E-print Network

    Diestel, Geoff

    Texas A & M University-Central Texas, CPSY 557 Methods and Practices in Counseling and Psychology. Overview: This course is designed to introduce Counseling and Psychology pre

  16. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 1 2012-04-01 2012-04-01 false Imported articles involving unfair methods of competition or practices. 12.39 Section 12.39 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY SPECIAL CLASSES OF MERCHANDISE Unfair Competition § 12.39 Imported articles involving...

  17. Passive Sampling Methods for Contaminated Sediments: Practical Guidance for Selection, Calibration, and Implementation

    EPA Science Inventory

    This article provides practical guidance on the use of passive sampling methods(PSMs) that target the freely dissolved concentration (Cfree) for improved exposure assessment of hydrophobic organic chemicals in sediments. Primary considerations for selecting a PSM for a specific a...

  18. Best Practices and Design Patterns for Product Development a method to transfer design knowledge

    E-print Network

    Salustri, Filippo A.

    Best Practices and Design Patterns for Product Development: a method to transfer design knowledge. Best Practice: a measurable factor that impacts the health of an organisation. Design Pattern: a three-part rule expressing a relation between a context, a problem, and a solution (example: supermarket checkouts). Best really only means better; patterns capture principles; benchmarking is key.

  19. Self-Assessment Methods in Writing Instruction: A Conceptual Framework, Successful Practices and Essential Strategies

    ERIC Educational Resources Information Center

    Nielsen, Kristen

    2014-01-01

    Student writing achievement is essential to lifelong learner success, but supporting writing can be challenging for teachers. Several large-scale analyses of publications on writing have called for further study of instructional methods, as the current literature does not sufficiently address the need to support best teaching practices.…

  20. Comparison and cost analysis of drinking water quality monitoring requirements versus practice in seven developing countries.

    PubMed

    Crocker, Jonny; Bartram, Jamie

    2014-07-01

    Drinking water quality monitoring programs aim to support provision of safe drinking water by informing water quality management. Little evidence or guidance exists on best monitoring practices for low resource settings. A lack of financial, human, and technological resources reduces a country's ability to monitor water supply. Monitoring activities were characterized in Cambodia, Colombia, India (three states), Jordan, Peru, South Africa, and Uganda according to water sector responsibilities, monitoring approaches, and marginal cost. The seven study countries were selected to represent a range of low resource settings. The focus was on monitoring of microbiological parameters, such as E. coli, coliforms, and H2S-producing microorganisms. Data collection involved qualitative and quantitative methods. Across the seven study countries, few distinct approaches to monitoring were observed, and in all but one country all monitoring relied on fixed laboratories for sample analysis. Compliance with monitoring requirements was highest for operational monitoring of large water supplies in urban areas. Sample transport and labor for sample collection and analysis together constitute approximately 75% of marginal costs, which exclude capital costs. There is potential for substantive optimization of monitoring programs by considering field-based testing and by fundamentally reconsidering monitoring approaches for non-piped supplies. This is the first study to look quantitatively at water quality monitoring practices in multiple developing countries. PMID:25046632
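The reported cost structure can be made concrete with a back-of-the-envelope breakdown. All figures below are invented for illustration, chosen only so that transport plus labor lands near the study's reported ~75% share of marginal cost:

```python
# Hypothetical per-sample marginal cost breakdown (all figures invented;
# the study reports transport + labor at roughly 75% of marginal costs).
costs = {
    "sample_transport": 4.00,            # USD per sample
    "labor_collection_analysis": 5.00,
    "consumables_reagents": 2.00,
    "equipment_maintenance": 1.00,
}

total = sum(costs.values())
shares = {item: cost / total for item, cost in costs.items()}
transport_and_labor = (shares["sample_transport"]
                       + shares["labor_collection_analysis"])

print(f"total marginal cost per sample: ${total:.2f}")
print(f"transport + labor share: {transport_and_labor:.0%}")  # 75%
```

A breakdown of this form makes the paper's optimization argument visible: field-based testing attacks exactly the two dominant line items.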

  1. A practical approach to fire hazard analysis for offshore structures.

    PubMed

    Krueger, Joel; Smith, Duncan

    2003-11-14

    Offshore quantitative risk assessments (QRA) have historically been complex and costly. For large offshore design projects, the level of detail required for a QRA is often not available until well into the detailed design phase of the project. In these cases, the QRA may be unable to provide timely hazard understanding. As a result, the risk reduction measures identified often come too late to allow for cost effective changes to be implemented. This forces project management to make a number of difficult or costly decisions. This paper demonstrates how a scenario-based approach to fire risk assessment can be effectively applied early in a project's development. The scenario or design basis fire approach calculates the consequence of a select number of credible fire scenarios, determines the potential impact on the platform process equipment, structural members, egress routes, and safety systems, and evaluates the effectiveness of potential options for mitigation. The early provision of hazard data allows the project team to select an optimum design that is safe and will meet corporate or regulatory risk criteria later in the project cycle. The focus of this paper is on the application of the scenario-based approach to gas jet fires. This paper draws on recent experience in the Gulf of Mexico (GOM) and other areas to outline an approach to fire hazard analysis and fire hazard management for deep-water structures. The methods presented include discussions from the June 2002 International Workshop for Fire Loading and Response. PMID:14602403

  2. Incorporating Computer-Aided Language Sample Analysis into Clinical Practice

    ERIC Educational Resources Information Center

    Price, Lisa Hammett; Hendricks, Sean; Cook, Colleen

    2010-01-01

    Purpose: During the evaluation of language abilities, the needs of the child are best served when multiple types and sources of data are included in the evaluation process. Current educational policies and practice guidelines further dictate the use of authentic assessment data to inform diagnosis and treatment planning. Language sampling and…

  3. Honesty in Critically Reflective Essays: An Analysis of Student Practice

    ERIC Educational Resources Information Center

    Maloney, Stephen; Tai, Joanna Hong-Meng; Lo, Kristin; Molloy, Elizabeth; Ilic, Dragan

    2013-01-01

    In health professional education, reflective practice is seen as a potential means for self-improvement from everyday clinical encounters. This study aims to examine the level of student honesty in critical reflection, and barriers and facilitators for students engaging in honest reflection. Third year physiotherapy students, completing summative…

  4. An Analysis of Teacher Practices with Toddlers during Social Conflicts

    ERIC Educational Resources Information Center

    Gloeckler, Lissy R.; Cassell, Jennifer M.; Malkus, Amy J.

    2014-01-01

    Employing a quasi-experimental design, this pilot study on teacher practices with toddlers during social conflicts was conducted in the southeastern USA. Four child-care classrooms, teachers (n = 8) and children (n = 51) were assessed with the Classroom Assessment Scoring System -- Toddler [CLASS-Toddler; La Paro, K., Hamre, B. K., & Pianta,…

  5. Professional Learning in Rural Practice: A Sociomaterial Analysis

    ERIC Educational Resources Information Center

    Slade, Bonnie

    2013-01-01

    Purpose: This paper aims to examine the professional learning of rural police officers. Design/methodology/approach: This qualitative case study involved interviews and focus groups with 34 police officers in Northern Scotland. The interviews and focus groups were transcribed and analysed, drawing on practice-based and sociomaterial learning…

  6. A cross-sectional mixed methods study protocol to generate learning from patient safety incidents reported from general practice

    PubMed Central

    Carson-Stevens, Andrew; Hibbert, Peter; Avery, Anthony; Butlin, Amy; Carter, Ben; Cooper, Alison; Evans, Huw Prosser; Gibson, Russell; Luff, Donna; Makeham, Meredith; McEnhill, Paul; Panesar, Sukhmeet S; Parry, Gareth; Rees, Philippa; Shiels, Emma; Sheikh, Aziz; Ward, Hope Olivia; Williams, Huw; Wood, Fiona; Donaldson, Liam; Edwards, Adrian

    2015-01-01

    Introduction Incident reports contain descriptions of errors and harms that occurred during clinical care delivery. Few observational studies have characterised incidents from general practice, and none of these have been from the England and Wales National Reporting and Learning System. This study aims to describe incidents reported from a general practice care setting. Methods and analysis A general practice patient safety incident classification will be developed to characterise patient safety incidents. A weighted-random sample of 12,500 incidents describing no harm, low harm and moderate harm of patients, and all incidents describing severe harm and death of patients, will be classified. Insights from exploratory descriptive statistics and thematic analysis will be combined to identify priority areas for future interventions. Ethics and dissemination The need for ethical approval was waived by the Aneurin Bevan University Health Board research risk review committee given the anonymised nature of the data (ABHB R&D Ref number: SA/410/13). The authors will submit the results of the study to relevant journals and undertake national and international oral presentations to researchers, clinicians and policymakers. PMID:26628526

  7. Text analysis devices, articles of manufacture, and text analysis methods

    DOEpatents

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

    2015-03-31

    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes a display configured to depict visible images, and processing circuitry coupled with the display and wherein the processing circuitry is configured to access a first vector of a text item and which comprises a plurality of components, to access a second vector of the text item and which comprises a plurality of components, to weight the components of the first vector providing a plurality of weighted values, to weight the components of the second vector providing a plurality of weighted values, and to combine the weighted values of the first vector with the weighted values of the second vector to provide a third vector.
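    The weighted-vector combination described in this abstract can be read as a simple per-component operation. The sketch below is an illustrative interpretation, not the patented implementation; all names and weight values are hypothetical:

```python
def combine_weighted(v1, w1, v2, w2):
    """Combine two vectors of a text item after per-component weighting.

    v1, v2: component vectors of the text item
    w1, w2: weights applied component-wise to each vector
    Returns the third vector: the sum of the two weighted vectors.
    """
    if not (len(v1) == len(v2) == len(w1) == len(w2)):
        raise ValueError("vectors and weights must have equal length")
    weighted1 = [x * w for x, w in zip(v1, w1)]
    weighted2 = [x * w for x, w in zip(v2, w2)]
    return [a + b for a, b in zip(weighted1, weighted2)]

# e.g. combine_weighted([1, 2], [0.5, 0.5], [3, 4], [1.0, 1.0]) -> [3.5, 5.0]
```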

  8. Entropy Methods For Univariate Distributions in Decision Analysis

    NASA Astrophysics Data System (ADS)

    Abbas, Ali E.

    2003-03-01

    One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process is a task that involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice, however, decision-makers cannot readily provide moments for the distribution, and are much more comfortable providing information about the fractiles of the distribution of interest or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds and provide an interpretation for the shape of the maximum entropy distribution subject to fractile constraints (FMED). We also discuss the limitations of the FMED, namely that it is discontinuous and flat over each fractile interval. We present a heuristic approximation for cases where, in addition to its fractiles, the distribution is known to be continuous, and work through full examples to illustrate the approach.
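    The flatness the abstract mentions follows from a standard result: the maximum entropy density subject to fractile (cumulative probability) constraints is piecewise uniform, with density on each interval equal to its probability mass divided by its width. A minimal sketch of that construction (function and variable names are illustrative, not from the paper):

```python
def fmed_density(xs, ps):
    """Piecewise-uniform maximum entropy density from fractile constraints.

    xs: strictly increasing fractile points x0 < x1 < ... < xn
    ps: cumulative probabilities at those points, p0 < ... < pn
    Returns a list of ((lo, hi), density) pairs; the density is constant
    on each interval, which is why the FMED is flat and discontinuous.
    """
    if len(xs) != len(ps):
        raise ValueError("xs and ps must have the same length")
    return [((lo, hi), (p1 - p0) / (hi - lo))
            for lo, hi, p0, p1 in zip(xs, xs[1:], ps, ps[1:])]
```

    For example, fractiles at 0, 1, 3 with cumulative probabilities 0, 0.5, 1 give a density of 0.5 on (0, 1) and 0.25 on (1, 3).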

  9. Influence of Analysis Methods on Interpretation of Hazard Maps

    PubMed Central

    Koehler, Kirsten A.

    2013-01-01

    Exposure or hazard mapping is becoming increasingly popular among industrial hygienists. Direct-reading instruments used for hazard-mapping data collection are steadily increasing in reliability and portability while decreasing in cost. Exposure measurements made with these instruments generally require no laboratory analysis, although hazard mapping can be a time-consuming process. To inform decision making by industrial hygienists and management, it is crucial that the maps generated from mapping data are as accurate and representative as possible. Currently, it is unclear how many sampling locations are necessary to produce a representative hazard map. As such, researchers typically collect as many points as can be sampled in several hours, and interpolation methods are used to produce higher resolution maps. We have reanalyzed hazard-mapping data sets from three industrial settings to determine which interpolation methods yield the most accurate results. The goal is to provide practicing industrial hygienists with some practical guidelines to generate accurate hazard maps with ‘off-the-shelf’ mapping software. Visually verifying the fit of the variogram model is crucial for accurate interpolation. Exponential and spherical variogram models performed better than Gaussian models. It was also necessary to diverge from some of the default interpolation parameters, such as the number of bins used for the experimental variogram and whether or not to allow for a nugget effect, to achieve reasonable accuracy of the interpolation for some data sets. PMID:23258453
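    The exponential and spherical variogram models the abstract compares have standard closed forms, parameterized by the conventional nugget, sill, and range. A minimal sketch of both (generic textbook forms, not code or values from the study):

```python
import math

def exponential_variogram(h, nugget, sill, vrange):
    """Exponential model: approaches the sill asymptotically; the factor 3
    makes vrange the 'practical range' where ~95% of the sill is reached."""
    return nugget + (sill - nugget) * (1.0 - math.exp(-3.0 * h / vrange))

def spherical_variogram(h, nugget, sill, vrange):
    """Spherical model: cubic rise that reaches the sill exactly at h = vrange."""
    if h >= vrange:
        return sill
    r = h / vrange
    return nugget + (sill - nugget) * (1.5 * r - 0.5 * r ** 3)
```

    Both models start at the nugget at lag h = 0; the choice between them, as the abstract notes, is best checked visually against the experimental variogram.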

  10. Empirical Analysis of Green Supply Chain Management Practices in Indian Automobile Industry

    NASA Astrophysics Data System (ADS)

    Luthra, S.; Garg, D.; Haleem, A.

    2014-04-01

    Environmental sustainability and green environmental issues enjoy increasing popularity among researchers and supply chain practitioners. An attempt has been made to identify and empirically analyze green supply chain management (GSCM) practices in the Indian automobile industry. Six main GSCM practices (comprising 37 sub-practices) and four expected performance outcomes (comprising 16 performance measures) were identified from a literature review. A questionnaire-based survey was conducted to validate these practices and performance outcomes. 123 complete questionnaires were collected from Indian automobile organizations and used for the empirical analysis of GSCM practices in the Indian automobile industry. Descriptive statistics were used to establish the current implementation status of GSCM practices, and multiple regression analysis was carried out to assess the impact of currently adopted GSCM practices on expected organizational performance outcomes. The results suggest that environmental, economic, social and operational performance improves with the implementation of GSCM practices. This paper may play an important role in understanding various GSCM implementation issues and may help practicing managers improve performance in the supply chain.

  11. The exergy method of thermal plant analysis

    SciTech Connect

    Kotas, T.J.

    1985-01-01

    The Exergy Method, also known as ''availability analysis,'' is a technique of thermodynamic analysis which uses the Second Law of Thermodynamics as its basis of assessment. Recent developments in the technique, combined with the increasing need to conserve fuel, have attracted much attention. Its advantages over traditional techniques using the First Law are now generally recognized. The book introduces the subject in a manner that can be understood by anyone familiar with the fundamentals of applied thermodynamics. Numerous examples help the reader to understand the basic concepts and master the techniques. There are also many tables and charts for calculations in thermoeconomics, refrigeration, cryogenic processes, combustion, power generation and various aspects of chemical and process engineering.

  12. Language Ideology or Language Practice? An Analysis of Language Policy Documents at Swedish Universities

    ERIC Educational Resources Information Center

    Björkman, Beyza

    2014-01-01

    This article presents an analysis and interpretation of language policy documents from eight Swedish universities with regard to intertextuality, authorship and content analysis of the notions of language practices and English as a lingua franca (ELF). The analysis is then linked to Spolsky's framework of language policy, namely language…

  13. Factor Analysis in Counseling Psychology Research, Training, and Practice: Principles, Advances, and Applications

    ERIC Educational Resources Information Center

    Kahn, Jeffrey H.

    2006-01-01

    Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) have contributed to test development and validation in counseling psychology, but additional applications have not been fully realized. The author presents an overview of the goals, terminology, and procedures of factor analysis; reviews best practices for extracting,…

  14. Practice of Physical Activity among Future Doctors: A Cross Sectional Analysis

    PubMed Central

    Rao, Chythra R; Darshan, BB; Das, Nairita; Rajan, Vinaya; Bhogun, Meemansha; Gupta, Aditya

    2012-01-01

    Background: Non-communicable diseases (NCD) will account for 73% of deaths and 60% of the global disease burden by 2020. Physical activity plays a major role in the prevention of these non-communicable diseases. The stress involved in meeting the responsibilities of becoming a physician may adversely affect the exercise habits of students. So, the current study aimed to study the practice of physical activity among undergraduate medical students. Methods: A cross sectional study was conducted among 240 undergraduate medical students. A quota sampling method was used to identify 60 students from each of the four even semesters. A pre-tested, semi-structured questionnaire was used to collect the data. Statistical Package for Social Sciences (SPSS) version 16 was used for data entry and analysis, and results are expressed as percentages and proportions. Results: In our study, 55% were 20 to 22 years old. Over half of the students were utilizing the sports facilities provided by the university in the campus. The majority of students, 165 (69%), had a normal body mass index (BMI), 51 (21%) were overweight, while 7 (3%) were obese. Of the 62% who were currently exercising, the practice of physical activity was more common among boys as compared to girls (62% v/s 38%). Lack of time (46; 60.5%), laziness (61.8%), and exhaustion from academic activities (42%) were identified as important hindering factors among medical students who did not exercise. Conclusion: A longitudinal study to follow-up student behavior throughout their academic life is needed to identify the factors promoting the practice of physical activity among students. PMID:22708033

  15. Method and apparatus for simultaneous spectroelectrochemical analysis

    DOEpatents

    Chatterjee, Sayandev; Bryan, Samuel A; Schroll, Cynthia A; Heineman, William R

    2013-11-19

    An apparatus and method of simultaneous spectroelectrochemical analysis is disclosed. A transparent surface is provided. An analyte solution on the transparent surface is contacted with a working electrode and at least one other electrode. Light from a light source is focused on either a surface of the working electrode or the analyte solution. The light reflected from either the surface of the working electrode or the analyte solution is detected. The potential of the working electrode is adjusted, and spectroscopic changes of the analyte solution that occur with changes in thermodynamic potentials are monitored.

  16. Analysis method for Fourier transform spectroscopy

    NASA Technical Reports Server (NTRS)

    Park, J. H.

    1983-01-01

    A fast Fourier transform technique is given for the simulation of those distortion effects in the instrument line shape of the interferometric spectrum that are due to errors in the measured interferogram. The technique is applied to analyses of atmospheric absorption spectra and laboratory spectra. It is shown that the nonlinear least squares method can retrieve the correct information from the distorted spectrum. Analyses of HF absorption spectra obtained in a laboratory and solar CO absorption spectra gathered by a balloon-borne interferometer indicate that the retrieved amount of absorbing gas is less than the correct value in most cases, if the interferogram distortion effects are not included in the analysis.

  17. Apparatus and method for fluid analysis

    DOEpatents

    Wilson, Bary W.; Peters, Timothy J.; Shepard, Chester L.; Reeves, James H.

    2004-11-02

    The present invention is an apparatus and method for analyzing a fluid used in a machine or in an industrial process line. The apparatus has at least one meter placed proximate the machine or process line and in contact with the machine or process fluid for measuring at least one parameter related to the fluid. The at least one parameter is a standard laboratory analysis parameter. The at least one meter includes but is not limited to viscometer, element meter, optical meter, particulate meter, and combinations thereof.

  18. Scanning methods applied to bitemark analysis

    NASA Astrophysics Data System (ADS)

    Bush, Peter J.; Bush, Mary A.

    2010-06-01

    The 2009 National Academy of Sciences report on forensics focused criticism on pattern evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified based on dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutia and to investigate replication of detail in human skin. Results indicated that being a visco-elastic substrate, skin effectively reduces the resolution of measurement of dental detail. Conclusions indicate caution in individualization statements.

  19. Thermal Analysis Methods for Aerobraking Heating

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.; Gasbarre, Joseph F.; Dec, John A.

    2005-01-01

    As NASA begins exploration of other planets, a method of non-propulsively slowing vehicles at the planet, aerobraking, may become a valuable technique for managing vehicle design mass and propellant. An example of this is Mars Reconnaissance Orbiter (MRO), which will launch in late 2005 and reach Mars in March of 2006. In order to save propellant, MRO will use aerobraking to modify the initial orbit at Mars. The spacecraft will dip into the atmosphere briefly on each orbit, and during the drag pass, the atmospheric drag on the spacecraft will slow it, thus lowering the orbit apoapsis. The largest area on the spacecraft, and that most affected by the heat generated during the aerobraking process, is the solar arrays. A thermal analysis of the solar arrays was conducted at NASA Langley, to simulate their performance throughout the entire roughly 6-month period of aerobraking. Several interesting methods were used to make this analysis more rapid and robust. Two separate models were built for this analysis, one in Thermal Desktop for radiation and orbital heating analysis, and one in MSC.Patran for thermal analysis. The results from the radiation model were mapped in an automated fashion to the Patran thermal model that was used to analyze the thermal behavior during the drag pass. A high degree of automation in file manipulation as well as other methods for reducing run time were employed, since toward the end of the aerobraking period the orbit period is short, and in order to support flight operations the runs must be computed rapidly. All heating within the Patran Thermal model was combined in one section of logic, such that data mapped from the radiation model and aeroheating model, as well as skin temperature effects on the aeroheating and surface radiation, could be incorporated easily. This approach calculates the aeroheating at any given node, based on its position and temperature as well as the density and velocity at that trajectory point. 
Run times on several different processors, computer hard drives, and operating systems (Windows versus Linux) were evaluated.

  20. Selective spectroscopic methods for water analysis

    SciTech Connect

    Vaidya, B.

    1997-06-24

    This dissertation explores the development of several types of spectroscopic methods for the analysis of water. Methods for the determination of some of the most important properties of water, such as pH, metal ion content, and chemical oxygen demand, are investigated in detail. This report contains a general introduction to the subject and the conclusions. Four chapters and an appendix have been processed separately. They are: chromogenic and fluorogenic crown ether compounds for the selective extraction and determination of Hg(II); selective determination of cadmium in water using a chromogenic crown ether in a mixed micellar solution; reduction of chloride interference in chemical oxygen demand determination without using mercury salts; structural orientation patterns for a series of anthraquinone sulfonates adsorbed at an aminophenol thiolate monolayer chemisorbed at gold; and the role of chemically modified surfaces in the construction of miniaturized analytical instrumentation.

  1. 78 FR 48636 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-09

    ...Practice and Hazard Analysis and Risk- Based...Controls for Human Food; Extension of Comment...Practice and Hazard Analysis and Risk-Based...Controls for Human Food,'' that appeared...Practice and Hazard Analysis and Risk-Based...Controls for Human Food'' with a...

  2. 78 FR 11611 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-19

    ...Practice and Hazard Analysis and Risk- Based...Controls for Human Food; Extension of Comment...Practice and Hazard Analysis and Risk-Based...Controls for Human Food'' that appeared...Practice and Hazard Analysis and Risk-Based...Controls for Human Food.'' FOR...

  3. 78 FR 69604 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ...Practice and Hazard Analysis and Risk- Based...Controls for Human Food; Extension of Comment...Practice and Hazard Analysis and Risk- Based...Controls for Human Food'' and its information...Practice and Hazard Analysis and Risk-Based...Controls for Human Food.'' The...

  4. 78 FR 24691 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-26

    ...Practice and Hazard Analysis and Risk- Based...Controls for Human Food; Extension of Comment...Practice and Hazard Analysis and Risk-Based...Controls for Human Food'' that appeared...Practice and Hazard Analysis and Risk-Based...Controls for Human Food'' with a...

  5. Practical Performance Analysis of Parallel Applications (06-07 October 2014, CSCS, Lugano) Introduction to

    E-print Network

    "…were executed to the shortest which is possible." Charles Babbage, 1791-1871, Difference Engine. "…premature optimization is the root of all evil." Charles A. R. Hoare. Practical Performance Analysis

  6. Research Article A practical map-analysis tool for detecting potential dispersal corridors*

    E-print Network

    Hoffman, Forrest M.

    Research Article: A practical map-analysis tool for detecting potential dispersal corridors. William W. Hargrove, Forrest M. Hoffman and Rebecca A. Efroymson, Environmental Sciences Division

  7. Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis

    PubMed Central

    Critchfield, Thomas S

    2011-01-01

    Neither practitioners nor scientists appear to be fully satisfied with the world's largest behavior-analytic membership organization. Each community appears to believe that initiatives that serve the other will undermine the association's capacity to serve their own needs. Historical examples suggest that such discord is predicted when practitioners and scientists cohabit the same association. This is true because all professional associations exist to address guild interests, and practice and science are different professions with different guild interests. No association, therefore, can succeed in being all things to all people. The solution is to assure that practice and science communities are well served by separate professional associations. I comment briefly on how this outcome might be promoted. PMID:22532750

  8. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... latest edition (13th Ed., 1980) of their publication “Official Methods of Analysis of the Association of... 21 Food and Drugs 1 2011-04-01 2011-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of...

  9. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Service, Dairy Programs, or Official Methods of Analysis of the Association of Analytical Chemists or... 7 Agriculture 3 2014-01-01 2014-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods...

  10. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... latest edition (13th Ed., 1980) of their publication “Official Methods of Analysis of the Association of... 21 Food and Drugs 1 2012-04-01 2012-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of...

  11. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Service, Dairy Programs, or Official Methods of Analysis of the Association of Analytical Chemists or... 7 Agriculture 3 2013-01-01 2013-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods...

  12. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Service, Dairy Programs, or Official Methods of Analysis of the Association of Analytical Chemists or... 7 Agriculture 3 2012-01-01 2012-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods...

  13. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... latest edition (13th Ed., 1980) of their publication “Official Methods of Analysis of the Association of... 21 Food and Drugs 1 2013-04-01 2013-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of...

  14. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... latest edition (13th Ed., 1980) of their publication “Official Methods of Analysis of the Association of... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of...

  15. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... latest edition (13th Ed., 1980) of their publication “Official Methods of Analysis of the Association of... 21 Food and Drugs 1 2014-04-01 2014-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of...

  16. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Service, Dairy Programs, or Official Methods of Analysis of the Association of Analytical Chemists or... 7 Agriculture 3 2011-01-01 2011-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods...

  17. [Standardized patients in general practice--a new method for quality assurance in Norway].

    PubMed

    Saebø, L; Rethans, J J; Johannessen, T; Westin, S

    1995-10-20

    Standardized patients were sent to general practitioners who use the patient list system in Trondheim in order to register daily clinical practice without the patient being unmasked. The authors explain what a standardized patient is, how they are taught to present a disease, and how they report on the consultation in a valid and reliable way. They also describe how the standardized patients were introduced into the doctors' patient list system. The doctors were informed about the project in advance. Twenty-three doctors were visited twice and one doctor was visited once by a standardized patient. At two of the visits the patient was unmasked. The conclusion is that the use of standardized patients is a valid, reliable and practical method for quality assurance in general practice in Norway. PMID:8539691

  18. A practical method to determine the heating and cooling curves of x-ray tube assemblies

    SciTech Connect

    Bottaro, M.; Moralles, M.; Viana, V.; Donatiello, G. L.; Silva, E. P.

    2007-10-15

    A practical method to determine the heating and cooling curves of x-ray tube assemblies with rotating-anode x-ray tubes is proposed. Available procedures to obtain these curves, as described in the literature, are performed during operation of the equipment, and the precision of the method depends on knowledge of the total energy applied to the system. In the present work we describe procedures which use a calorimetric system and do not require the operation of the x-ray equipment. The method was applied successfully to an x-ray tube assembly that was under test in our laboratory.

  19. Structural and practical identifiability analysis of S-system.

    PubMed

    Zhan, Choujun; Li, Benjamin Yee Shing; Yeung, Lam Fat

    2015-12-01

    In the field of systems biology, biological reaction networks are usually modelled by ordinary differential equations. A sub-class, the S-system representation, is a widely used form of modelling. Existing S-system identification techniques assume that the system itself is always structurally identifiable. However, due to practical limitations, biological reaction networks are often only partially measured. In addition, the captured data only covers a limited trajectory, and therefore can only be considered a local snapshot of the system responses with respect to the complete set of state trajectories over the entire state space. Hence the estimated model can only reflect partial system dynamics and may not be unique. To improve the identification quality, the structural and practical identifiability of the S-system are studied. The S-system is shown to be identifiable under a set of assumptions. Then, an application on the yeast fermentation pathway was conducted. Two case studies were chosen, where the first case is based on a larger set of state trajectories and the second on a smaller one. By expanding the dataset to span a relatively larger state space, the uncertainty of the estimated system can be reduced. The results indicated that initial concentration is related to practical identifiability. PMID:26577163
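    The S-system form referenced above is conventionally a set of power-law ODEs, dx_i/dt = alpha_i * prod_j x_j^g_ij - beta_i * prod_j x_j^h_ij, and identification means estimating the rate constants (alpha, beta) and kinetic orders (g, h) from trajectory data. A minimal sketch of the right-hand side, as a generic illustration rather than the authors' code:

```python
def s_system_rhs(x, alpha, beta, g, h):
    """Evaluate dx/dt for an S-system.

    x: current state values x_1..x_n
    alpha, beta: production and degradation rate constants
    g, h: n x n matrices of kinetic orders (exponents)
    """
    n = len(x)
    dxdt = []
    for i in range(n):
        production = alpha[i]
        degradation = beta[i]
        for j in range(n):
            production *= x[j] ** g[i][j]
            degradation *= x[j] ** h[i][j]
        dxdt.append(production - degradation)
    return dxdt

# One state variable with constant production and linear degradation:
# dx/dt = 2 - x, so at x = 1 the derivative is 1.
```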

  20. Physical activity assessment in practice: a mixed methods study of GPPAQ use in primary care

    PubMed Central

    2014-01-01

    Background Insufficient physical activity (PA) levels which increase the risk of chronic disease are reported by almost two-thirds of the population. More evidence is needed about how PA promotion can be effectively implemented in general practice (GP), particularly in socio-economically disadvantaged communities. One tool recommended for the assessment of PA in GP and supported by NICE (National Institute for Health and Care Excellence) is the General Practice Physical Activity Questionnaire (GPPAQ) but details of how it may be used and of its acceptability to practitioners and patients are limited. This study aims to examine aspects of GPPAQ administration in non-urgent patient contacts using different primary care electronic recording systems and to explore the views of health professionals regarding its use. Methods Four general practices, selected because of their location within socio-economically disadvantaged areas, were invited to administer GPPAQs to patients, aged 35-75 years, attending non-urgent consultations, over two-week periods. They used different methods of administration and different electronic medical record systems (EMIS, Premiere, Vision). Participants’ (general practitioners (GPs), nurses and receptionists) views regarding GPPAQ use were explored via questionnaires and focus groups. Results Of 2,154 eligible consultations, 192 (8.9%) completed GPPAQs; of these, 83 (43%) were categorised as inactive. All practices were located within areas ranked as being in the tertile of greatest socio-economic deprivation in Northern Ireland. GPs/nurses in two practices invited completion of the GPPAQ; receptionists did so in the other two. One practice used an electronic template; three used paper copies of the questionnaires. End-of-study questionnaires, completed by 11 GPs, 3 nurses and 2 receptionists, and two focus groups, with GPs (n = 8) and nurses (n = 4), indicated that practitioners considered the GPPAQ easy to use but not in every consultation. 
Its use extended consultation time, particularly for patients with complex problems who could potentially benefit from PA promotion. Conclusions GPs and nurses reported that the GPPAQ itself was an easy tool with which to assess PA levels in general practice and feasible to use in a range of electronic record systems but integration within routine practice is constrained by time and complex consultations. Further exploration of ways to facilitate PA promotion into practice is needed. PMID:24422666

  1. Reform-based science teaching: A mixed-methods approach to explaining variation in secondary science teacher practice

    NASA Astrophysics Data System (ADS)

    Jetty, Lauren E.

    The purpose of this two-phase, sequential explanatory mixed-methods study was to understand and explain the variation seen in secondary science teachers' enactment of reform-based instructional practices. Utilizing teacher socialization theory, this mixed-methods analysis was conducted to determine the relative influence of secondary science teachers' characteristics, backgrounds and experiences across their teacher development to explain the range of teaching practices exhibited by graduates from three reform-oriented teacher preparation programs. Data for this study were obtained from the Investigating the Meaningfulness of Preservice Programs Across the Continuum of Teaching (IMPPACT) Project, a multi-university, longitudinal study funded by NSF. In the first, quantitative phase of the study, data for the sample (N=120) were collected from three surveys from the IMPPACT Project database. Hierarchical multiple regression analysis was used to examine the separate as well as the combined influence of factors such as teachers' personal and professional background characteristics, beliefs about reform-based science teaching, feelings of preparedness to teach science, school context, school culture and climate of professional learning, and influences of the policy environment on the teachers' use of reform-based instructional practices. Findings indicate that three blocks of variables (professional background, beliefs/efficacy, and local school context) made significant contributions to explaining nearly 38% of the variation in secondary science teachers' use of reform-based instructional practices. The five variables that significantly contributed to explaining variation in teachers' use of reform-based instructional practices in the full model were university of teacher preparation, sense of preparation for teaching science, the quality of professional development, science content focused professional development, and the perceived level of professional autonomy. 
Using the results from phase one, the second qualitative phase selected six case study teachers based on their levels of reform-based teaching practices to highlight teachers across the range of practices from low, average, to high levels of implementation. Using multiple interview sources, phase two helped to further explain the variation in levels of reform-based practices. Themes related to teachers' backgrounds, local contexts, and state policy environments were developed as they related to teachers' socialization experiences across these contexts. The results of the qualitative analysis identified the following factors differentiating teachers who enacted reform-based instructional practices from those who did not: 1) extensive science research experiences prior to their preservice teacher preparation; 2) the structure and quality of their field placements; 3) developing and valuing a research-based understanding of teaching and learning as a result of their preservice teacher preparation experiences; 4) the professional culture of their school context where there was support for a high degree of professional autonomy and receiving support from "educational companions" with a specific focus on teacher pedagogy to support student learning; and 5) a greater sense of agency to navigate their districts' interpretation and implementation of state polices. Implications for key stakeholders as well as directions for future research are discussed.
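Hierarchical multiple regression of the kind used in phase one enters predictor blocks in sequence and attributes the increase in R² to each added block. A minimal sketch with synthetic data (the variable names, coefficients and block structure are hypothetical illustrations, not the IMPPACT data):

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with an intercept term."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(0)
n = 120
background = rng.normal(size=(n, 2))   # block 1: e.g. program, experience (synthetic)
beliefs = rng.normal(size=(n, 2))      # block 2: e.g. efficacy, preparedness (synthetic)
practice = background @ [0.5, 0.2] + beliefs @ [0.7, 0.3] + rng.normal(scale=0.5, size=n)

# Enter blocks hierarchically; the R^2 change is the variance block 2 adds.
r2_block1 = r_squared(background, practice)
r2_block2 = r_squared(np.column_stack([background, beliefs]), practice)
delta_r2 = r2_block2 - r2_block1
```

The same pattern extends to further blocks (school context, policy environment), each reported as an increment in explained variance.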

  2. Concurrent implementation of the Crank-Nicolson method for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Ransom, J. B.; Fulton, R. E.

    1985-01-01

    To exploit the significant gains in computing speed provided by Multiple Instruction Multiple Data (MIMD) computers, concurrent methods for practical problems need to be investigated and test problems implemented on actual hardware. One such problem class is heat transfer analysis which is important in many aerospace applications. This paper compares the efficiency of two alternate implementations of heat transfer analysis on an experimental MIMD computer called the Finite Element Machine (FEM). The implicit Crank-Nicolson method is used to solve concurrently the heat transfer equations by both iterative and direct methods. Comparison of actual timing results achieved for the two methods and their significance relative to more complex problems are discussed.

  3. A Comparison of Low and High Structure Practice for Learning Interactional Analysis Skills

    ERIC Educational Resources Information Center

    Davis, Matthew James

    2011-01-01

    Innovative training approaches in work domains such as professional athletics, aviation, and the military have shown that specific types of practice can reliably lead to higher levels of performance for the average professional. This study describes the development of an initial effort toward creating a similar practice method for psychotherapy…

  4. Green analytical method development for statin analysis.

    PubMed

    Assassi, Amira Louiza; Roy, Claude-Eric; Perovitch, Philippe; Auzerie, Jack; Hamon, Tiphaine; Gaudin, Karen

    2015-02-01

    A green analytical chemistry method was developed for pravastatin, fluvastatin and atorvastatin analysis. An HPLC/DAD method using an ethanol-based mobile phase with octadecyl-grafted silica with various grafting and related column parameters such as particle size, core-shell and monolith was studied. Retention, efficiency and detector linearity were optimized. Even for columns with particle sizes under 2 μm, the benefit of keeping efficiency over a large range of flow rates was not obtained with the ethanol-based mobile phase compared to an acetonitrile-based one. Therefore the strategy of shortening analysis by increasing the flow rate induced a decrease of efficiency with the ethanol-based mobile phase. An ODS-AQ YMC column, 50 mm × 4.6 mm, 3 μm, was selected, which showed the best compromise between analysis time, statin separation, and efficiency. HPLC conditions were 1 mL/min, ethanol/formic acid (pH 2.5, 25 mM) (50:50, v/v), thermostated at 40°C. To reduce solvent consumption for sample preparation, a 0.5 mg/mL concentration of each statin was found to be the highest which respected detector linearity. These conditions were validated for each statin for content determination in highly concentrated hydro-alcoholic solutions. Solubility higher than 100 mg/mL was found for pravastatin and fluvastatin, whereas for atorvastatin calcium salt the maximum concentration was 2 mg/mL for hydro-alcoholic binary mixtures between 35% and 55% of ethanol in water. Using atorvastatin instead of its calcium salt, solubility was improved. Highly concentrated solutions of statins offer a potential fluid for per Buccal Per-Mucous(®) administration with the advantages of rapid and easy passage of drugs. PMID:25582487

  5. Chapter 11. Community analysis-based methods

    SciTech Connect

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  6. Comparing the Effect of Concept Mapping and Conventional Methods on Nursing Students’ Practical Skill Score

    PubMed Central

    Rasoul Zadeh, Nasrin; Sadeghi Gandomani, Hamidreza; Delaram, Masoumeh; Parsa Yekta, Zohre

    2015-01-01

    Background: Development of practical skills has remained a serious and considerable challenge in nursing education. Moreover, newly graduated nurses may have weak practical skills, which can be a threat to patients’ safety. Objectives: The present study was conducted to compare the effect of concept mapping and conventional methods on nursing students’ practical skills. Patients and Methods: This quasi-experimental study was conducted on 70 nursing students randomly assigned into two groups of 35 people. The intervention group was taught through the concept mapping method, while the control group was taught using the conventional method. A two-part instrument was used, including a demographic information form and a checklist for direct observation of procedural skills. Descriptive statistics, chi-square, independent samples t-tests and paired t-tests were used to analyze the data. Results: Before the education, no significant differences were observed between the two groups in the three skills of cleaning (P = 0.251), injection (P = 0.185) and sterilizing (P = 0.568). The students’ mean scores increased significantly after the education, and the differences between pre- and post-intervention mean scores were significant in both groups (P < 0.001). However, after the education, in all three skills the mean scores of the intervention group were significantly higher than those of the control group (P < 0.001). Conclusions: Concept mapping was superior to conventional skill-teaching methods. It is suggested to use concept mapping in teaching practical courses such as fundamentals of nursing. PMID:26576441
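The pre/post comparisons reported above rest on the paired t-test, which tests whether the mean within-student score change differs from zero. A minimal sketch with made-up checklist scores (not the study's data):

```python
import math
import statistics

def paired_t(pre, post):
    """Paired t statistic for pre/post skill scores of the same students.

    Returns the t statistic and its degrees of freedom (n - 1).
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    sd = statistics.stdev(diffs)               # sample SD of the differences
    t = statistics.mean(diffs) / (sd / math.sqrt(n))
    return t, n - 1

# Hypothetical checklist scores for five students before/after teaching:
t, df = paired_t([12, 10, 14, 11, 13], [18, 15, 19, 16, 17])
```

The t statistic is then compared against the t distribution with df degrees of freedom to obtain the P value.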

  7. Analysis of methods. [information systems evolution environment

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J. (editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. 
Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity Relationship, Data Flow Diagrams, and Structure Charts, for inclusion in an integrated development support environment.

  8. Limitations in simulator time-based human reliability analysis methods

    SciTech Connect

    Wreathall, J.

    1989-01-01

    Developments in human reliability analysis (HRA) methods have evolved slowly. Current methods are little changed from those of almost a decade ago, particularly in the use of time-reliability relationships. While these methods were suitable as an interim step, the time (and the need) has come to specify the next evolution of HRA methods. As with any performance-oriented data source, power plant simulator data have no direct connection to HRA models. Errors reported in data are normal deficiencies observed in human performance; failures are events modeled in probabilistic risk assessments (PRAs). Not all errors cause failures; not all failures are caused by errors. Second, the times at which actions are taken provide no measure of the likelihood of failures to act correctly within an accident scenario. Inferences can be made about human reliability, but they must be made with great care. Specific limitations are discussed. Simulator performance data are useful in providing qualitative evidence of the variety of error types and their potential influences on operating systems. More work is required to combine recent developments in the psychology of error with the qualitative data collected at simulators. Until data become openly available, however, such an advance will not be practical.

  9. EMQN Best Practice Guidelines for molecular and haematology methods for carrier identification and prenatal diagnosis of the haemoglobinopathies

    PubMed Central

    Traeger-Synodinos, Joanne; Harteveld, Cornelis L; Old, John M; Petrou, Mary; Galanello, Renzo; Giordano, Piero; Angastioniotis, Michael; De la Salle, Barbara; Henderson, Shirley; May, Alison

    2015-01-01

    Haemoglobinopathies constitute the commonest recessive monogenic disorders worldwide, and the treatment of affected individuals presents a substantial global disease burden. Carrier identification and prenatal diagnosis represent valuable procedures that identify couples at risk for having affected children, so that they can be offered options to have healthy offspring. Molecular diagnosis facilitates prenatal diagnosis and definitive diagnosis of carriers and patients (especially ‘atypical' cases who often have complex genotype interactions). However, the haemoglobin disorders are unique among all genetic diseases in that identification of carriers is preferable by haematological (biochemical) tests rather than DNA analysis. These Best Practice guidelines offer an overview of recommended strategies and methods for carrier identification and prenatal diagnosis of haemoglobinopathies, and emphasize the importance of appropriately applying and interpreting haematological tests in supporting the optimum application and evaluation of globin gene DNA analysis. PMID:25052315

  10. Application of The Method of Elastic Maps In Analysis of Genetic Texts A. N. GORBAN

    E-print Network

    Gorban, Alexander N.

    Application of the Method of Elastic Maps in Analysis of Genetic Texts, A. N. Gorban. [Extraction snippet, partially garbled: modern molecular biology collects huge amounts of information that needs intelligent data mining; figure captions include “Fig 1. Node, edge and rib” and “Fig 2. Elastic nets used in practice”.]

  11. New methods for sensitivity analysis of chaotic dynamical systems

    E-print Network

    Blonigan, Patrick Joseph

    2013-01-01

    Computational methods for sensitivity analysis are invaluable tools for fluid dynamics research and engineering design. These methods are used in many applications, including aerodynamic shape optimization and adaptive ...

  12. Determination of rate constants for trifluoromethyl radical addition to various alkenes via a practical method.

    PubMed

    Hartmann, M; Li, Y; Studer, A

    2016-01-01

    A simple and practical method for the determination of rate constants for trifluoromethyl radical addition to various alkenes by applying competition kinetics is introduced. In the kinetic experiments the trifluoromethyl radicals are generated in situ from a commercially available hypervalent-iodine-CF3 reagent (Togni reagent) by SET-reduction with TEMPONa in the presence of TEMPO and a π-acceptor. From the relative ratio of TEMPOCF3 and CF3-addition product formed, which is readily determined by (19)F-NMR spectroscopy, rate constants for trifluoromethyl radical addition to the π-acceptor can be calculated. The practical method is also applicable to measure rate constants for the addition of other perfluoroalkyl radicals to alkenes as documented for CF3CF2-radical addition reactions. PMID:26574882
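In competition kinetics of this kind, the ratio of the two products reflects the ratio of the two trapping rates, so the unknown addition rate constant follows from the known TEMPO trapping rate constant. A sketch under the usual assumption that both traps remain in excess (all numbers below are illustrative, not values from the paper):

```python
def k_addition(ratio_products, conc_tempo, conc_alkene, k_tempo):
    """Rate constant for CF3 radical addition to an alkene from competition kinetics.

    ratio_products : [CF3-addition product] / [TEMPO-CF3], e.g. from 19F NMR
    conc_tempo, conc_alkene : concentrations of the two competing traps (M)
    k_tempo : known rate constant for CF3 trapping by TEMPO (M^-1 s^-1)

    Assumes both traps are in excess so their concentrations stay roughly
    constant (competition between two pseudo-first-order pathways).
    """
    return k_tempo * ratio_products * conc_tempo / conc_alkene

# Illustrative numbers only (not from the paper):
k = k_addition(ratio_products=0.5, conc_tempo=0.10, conc_alkene=1.0, k_tempo=1e8)
```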

  13. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    NASA Technical Reports Server (NTRS)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
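For orientation, acceptance sampling by variables compares a statistic of the measured sample against an acceptability constant from the sampling plan, rather than counting defectives as the attributes method does. A minimal s-method sketch for a single upper specification limit (the measurements and plan constant are hypothetical):

```python
import statistics

def accept_lot(measurements, usl, k):
    """Variables acceptance sampling against an upper specification limit (USL).

    Accept the lot if (USL - xbar) / s >= k, where k is the acceptability
    constant taken from the sampling plan. Illustrative s-method check only.
    """
    xbar = statistics.mean(measurements)
    s = statistics.stdev(measurements)
    return (usl - xbar) / s >= k

# Hypothetical sample with USL = 10.0 and plan constant k = 1.5:
ok = accept_lot([8.1, 8.4, 7.9, 8.6, 8.2], usl=10.0, k=1.5)
```

Plans of this form trade sample size against the producer's and consumer's risks; the empirical testing in the report assesses how accurately calculators for such plans hit their nominal risks.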

  14. Safeguards systems analysis research and development and the practice of safeguards at DOE facilities

    SciTech Connect

    Zack, N.R.; Thomas, K.E.; Markin, J.T.; Tape, J.W.

    1991-01-01

    Los Alamos Safeguards Systems Group personnel interact with Department of Energy (DOE) nuclear materials processing facilities in a number of ways. Among them are training courses, formal technical assistance such as developing information management or data analysis software, and informal ad hoc assistance, especially in reviewing and commenting on existing facility safeguards technology and procedures. These activities are supported by the DOE Office of Safeguards and Security, DOE Operations Offices, and contractor organizations. Because of the relationships with the Operations Office and facility personnel, the Safeguards Systems Group research and development (R&D) staff have developed an understanding of the needs of the entire complex. Improved safeguards are needed in areas such as materials control activities, accountability procedures and techniques, systems analysis and evaluation methods, and material handling procedures. This paper surveys the generic needs for efficient and cost-effective enhancements in safeguards technologies and procedures at DOE facilities, identifies areas where existing safeguards R&D products are being applied or could be applied, and sets a direction for future systems analysis R&D to address practical facility safeguards needs.

  15. Analysis of two methods to evaluate antioxidants.

    PubMed

    Tomasina, Florencia; Carabio, Claudio; Celano, Laura; Thomson, Leonor

    2012-07-01

    This exercise is intended to introduce undergraduate biochemistry students to the analysis of antioxidants as a biotechnological tool. In addition, some statistical resources will also be used and discussed. Antioxidants play an important metabolic role, preventing oxidative stress-mediated cell and tissue injury. Knowing the antioxidant content of nutritional components can help make informed decisions about diet design, and increase the commercial value of antioxidant-rich natural products. As a reliable and convenient technique to evaluate the whole spectrum of antioxidants present in biological samples is lacking, the general consensus is to use more than one technique. We have chosen two widely used and inexpensive methods, Trolox-equivalent antioxidant capacity and the ferric reducing antioxidant power assays, to evaluate the antioxidant content of several fruits, and to compare and analyze the correlation between both assays. PMID:22807430
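Comparing two antioxidant assays as described above typically comes down to the correlation between their readings on the same samples. A minimal sketch with invented TEAC and FRAP values (units and numbers are illustrative only, not the exercise's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two antioxidant measures of the same samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical TEAC and FRAP readings (mmol Trolox eq. / 100 g) for five fruits:
teac = [1.2, 0.8, 2.5, 1.9, 0.5]
frap = [1.4, 0.9, 2.8, 2.0, 0.6]
r = pearson_r(teac, frap)
```

A correlation near 1 suggests the two assays rank the samples consistently even when their absolute scales differ.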

  16. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton (Albuquerque, NM); Phillips, Cynthia A. (Albuquerque, NM)

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
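Finding lowest-cost routes through a weighted attack graph is a shortest-path problem. A minimal sketch using Dijkstra's algorithm over a hypothetical attack graph (the state names and effort weights are invented; the patented tool additionally enumerates paths within epsilon of the optimum):

```python
import heapq

def cheapest_attack_path(graph, start, goal):
    """Dijkstra's algorithm: lowest total attacker-effort path in an attack graph.

    graph maps each state to a list of (next_state, effort) edges.
    Returns (total_effort, path); a fuller tool would also collect the
    near-minimal "epsilon optimal" paths around this optimum.
    """
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, effort in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (cost + effort, nxt, path + [nxt]))
    return float("inf"), []

# Hypothetical attack states and effort weights:
attack_graph = {
    "outside": [("dmz_host", 3), ("phish_user", 1)],
    "phish_user": [("workstation", 2)],
    "dmz_host": [("db_server", 4)],
    "workstation": [("db_server", 5)],
}
cost, path = cheapest_attack_path(attack_graph, "outside", "db_server")
```

Weights could equally encode likelihood of success or time to succeed, as the abstract notes; the path with the smallest total is the one most worth defending.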

  17. Mapping Cigarettes Similarities using Cluster Analysis Methods

    PubMed Central

    Bolboacă, Sorana D.; Jäntschi, Lorentz

    2007-01-01

    The aim of the research was to investigate the relationships and/or occurrences in and between chemical composition information (tar, nicotine, carbon monoxide), market information (brand, manufacturer, price), and public health information (class, health warning), as well as clustering of a sample of cigarette data. A number of thirty cigarette brands were analyzed. Six categorical (cigarette brand, manufacturer, health warnings, class) and four continuous (tar, nicotine, carbon monoxide concentrations and package price) variables were collected for investigation of chemical composition, market information and public health information. Multiple linear regression and two clusterization techniques were applied. The study revealed interesting remarks. The carbon monoxide concentration proved to be linked with tar and nicotine concentration. The applied clusterization methods identified groups of cigarette brands that showed similar characteristics. The tar and carbon monoxide concentrations were the main criteria used in clusterization. An analysis of a larger sample could reveal more relevant and useful information regarding the similarities between cigarette brands. PMID:17911663
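Cluster analyses like the one above group brands by distance in feature space. A minimal single-linkage agglomerative sketch over invented (tar, carbon monoxide) readings (the study's actual variables, brands and clustering techniques differ):

```python
import math

def single_linkage(points, k):
    """Agglomerative clustering (single linkage) down to k clusters.

    Repeatedly merges the two clusters whose closest members are nearest,
    until only k clusters remain. O(n^3) brute force, fine for tiny data.
    """
    clusters = [[i] for i in range(len(points))]
    def dist(a, b):
        return math.dist(points[a], points[b])
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(a, b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return clusters

# Hypothetical (tar, CO) mg/cigarette readings for six brands:
brands = [(10.0, 10.5), (9.5, 10.0), (1.0, 1.5), (1.2, 1.8), (10.2, 11.0), (0.8, 1.2)]
groups = single_linkage(brands, 2)
```

On this toy data the high-tar and low-tar brands fall into two clean clusters, mirroring the abstract's finding that tar and carbon monoxide drove the clusterization.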

  18. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R. (Albuquerque, NM)

    2010-11-23

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.

  19. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-05-25

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.

  20. Effect of practice management softwares among physicians of developing countries with special reference to Indian scenario by Mixed Method Technique

    PubMed Central

    Davey, Sanjeev; Davey, Anuradha

    2015-01-01

    Introduction: Currently, many cheaper “practice management software” (PMS) packages are available in developing countries including India; despite their availability and benefits, their penetration and usage vary from low to moderate, justifying the importance of this study area. Materials and Methods: First, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA, 2009) guidelines were considered, followed by an extensive systematic review of studies in the literature related to developing countries, on key search terms, from the main abstracting databases: PubMed, EMBASE, EBSCO, BioMed Central, Cochrane Library and the WorldCat library, up to 15 June 2014; any kind of article, whether published or unpublished, in any form or language, indicating software usage was included. Thereafter, a meta-analysis of the Indian studies, revealing the magnitude of usage in the Indian scenario, was done with OpenMeta[Analyst] software using a binary random-effects (RE) model. Studies from developed countries were excluded from our study. Results: Of 57 studies included in the systematic review from developing countries, only 4 Indian studies were found eligible for meta-analysis. The RE model revealed non-significant results (total participants = 243,526; range: 100–226,228, overall odds ratio = 2.85, 95% confidence interval = P < 0.05 and tests for heterogeneity: Q [df = 3] = 0.8, Het. P = 0.85). The overall magnitude of usage of PMS in Indian physicians' practice was, however, found to be between 10% and 45%. Conclusion: Although a variable and non-significant effect of usage of PM software on the practice of physicians in developing countries like India was found, there is a need to recognize the hidden potential of this system. Hence, more in-depth research needs to be done in future, in order to find the real impact of this system. PMID:25949969
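A binary random-effects meta-analysis of the kind reported here is commonly computed with the DerSimonian-Laird estimator, which inflates each study's variance by the between-study variance τ² before pooling. A minimal sketch with hypothetical study effects (not the four Indian studies' data; the study itself used dedicated meta-analysis software):

```python
import math

def random_effects_pool(log_ors, variances):
    """DerSimonian-Laird random-effects pooling of study log odds ratios.

    Returns (pooled log OR, Cochran's Q, tau^2). Illustrative only.
    """
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))   # heterogeneity
    df = len(log_ors) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)          # between-study variance estimate
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_re, log_ors)) / sum(w_re)
    return pooled, q, tau2

# Hypothetical log odds ratios and within-study variances for four studies:
pooled, q, tau2 = random_effects_pool(
    [0.9, 1.1, 1.0, 1.2], [0.04, 0.09, 0.05, 0.16])
odds_ratio = math.exp(pooled)
```

When Q falls below its degrees of freedom, as in this toy example, τ² is truncated to zero and the random-effects pool coincides with the fixed-effect one.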

  1. Practical applications of nondestructive evaluation for airport pavement analysis

    NASA Astrophysics Data System (ADS)

    McQueen, Roy D.; Guo, Edward

    1995-07-01

    This paper discusses the equipment and methodologies currently used for nondestructive testing (NDT) and nondestructive evaluation (NDE) of the structural capacity of military and civil airport pavements, including: (1) commonly used equipment and test methods for measuring pavement response to dynamic loads; (2) qualitative and quantitative evaluation of NDT data; (3) methods for back-calculating layer properties from NDT data; (4) layered elastic methods for evaluating pavement performance using processed NDT data; and (5) application of analytical results for developing pavement rehabilitation and management strategies.

  2. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Marketing Service, Dairy Programs, or the Official Methods of Analysis of the Association of Official... 7 Agriculture 3 2011-01-01 2011-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable...

  3. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Marketing Service, Dairy Programs, or the Official Methods of Analysis of the Association of Official... 7 Agriculture 3 2010-01-01 2010-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable...

  4. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... “Official Methods of Analysis of the Association of Official Analytical Chemists,” 13th ed., 1980, which is... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Methods of analysis. 133.5 Section 133.5 Food and... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis....

  5. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Marketing Service, Dairy Programs, or the Official Methods of Analysis of the Association of Official... 7 Agriculture 3 2013-01-01 2013-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable...

  6. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Marketing Service, Dairy Programs, or the Official Methods of Analysis of the Association of Official... 7 Agriculture 3 2012-01-01 2012-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable...

  7. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Marketing Service, Dairy Programs, or the Official Methods of Analysis of the Association of Official... 7 Agriculture 3 2014-01-01 2014-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable...

  8. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... “Official Methods of Analysis of the Association of Official Analytical Chemists,” 13th ed., 1980, which is... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Methods of analysis. 133.5 Section 133.5 Food and... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis....

  9. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... “Official Methods of Analysis of the Association of Official Analytical Chemists,” 13th ed., 1980, which is... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Methods of analysis. 133.5 Section 133.5 Food and... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis....

  10. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... “Official Methods of Analysis of the Association of Official Analytical Chemists,” 13th ed., 1980, which is... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Methods of analysis. 133.5 Section 133.5 Food and... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis....

  11. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...false Methods of analysis. 2.19 Section 2.19 Food and Drugs FOOD...Where the method of analysis is not prescribed...the policy of the Food and Drug Administration...utilize the methods of analysis of the AOAC...

  12. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...false Methods of analysis. 2.19 Section 2.19 Food and Drugs FOOD...Where the method of analysis is not prescribed...the policy of the Food and Drug Administration...utilize the methods of analysis of the AOAC...

  13. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...false Methods of analysis. 2.19 Section 2.19 Food and Drugs FOOD...Where the method of analysis is not prescribed...the policy of the Food and Drug Administration...utilize the methods of analysis of the AOAC...

  14. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...false Methods of analysis. 2.19 Section 2.19 Food and Drugs FOOD...Where the method of analysis is not prescribed...the policy of the Food and Drug Administration...utilize the methods of analysis of the AOAC...

  15. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...false Methods of analysis. 2.19 Section 2.19 Food and Drugs FOOD...Where the method of analysis is not prescribed...the policy of the Food and Drug Administration...utilize the methods of analysis of the AOAC...

  16. MATH 5620 NUMERICAL ANALYSIS II PRACTICE MIDTERM EXAM

    E-print Network

    Guevara-Vasquez, Fernando

Excerpt: a vector U = [U1, . . . , Um]^T where the matrix A comes from the usual three-point stencil finite-difference approximation to u_xx; (b) the linear shooting method for solving a nonlinear BVP with mixed-type boundary conditions y'' = f(t, y, y'); grid points x_i = ih, i = 0, . . . , m + 1, where h = 1/(m + 1) = Δx, and U_i(t) ≈ u(x_i, t). Use the method
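The three-point stencil named in the excerpt can be illustrated on a linear model problem; this is a minimal sketch, assuming the Dirichlet problem u'' = f(x) on (0, 1), with the resulting tridiagonal system solved by the Thomas algorithm.

```python
def solve_bvp(f, alpha, beta, m):
    """Solve u'' = f(x) on (0,1) with u(0)=alpha, u(1)=beta using the
    three-point stencil (U[i-1] - 2U[i] + U[i+1]) / h^2 = f(x_i)."""
    h = 1.0 / (m + 1)
    x = [i * h for i in range(1, m + 1)]          # interior grid points
    # Tridiagonal system A U = rhs with A = tridiag(1, -2, 1)
    a = [1.0] * m                                  # sub-diagonal
    b = [-2.0] * m                                 # diagonal
    c = [1.0] * m                                  # super-diagonal
    rhs = [h * h * f(xi) for xi in x]
    rhs[0] -= alpha                                # fold boundary values in
    rhs[-1] -= beta
    # Forward elimination (Thomas algorithm)
    for i in range(1, m):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        rhs[i] -= w * rhs[i - 1]
    # Back substitution
    U = [0.0] * m
    U[-1] = rhs[-1] / b[-1]
    for i in range(m - 2, -1, -1):
        U[i] = (rhs[i] - c[i] * U[i + 1]) / b[i]
    return x, U
```

Against the manufactured solution u = sin(πx) (so f = -π² sin(πx)), the scheme exhibits its expected second-order accuracy.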

  17. Alignment of patient and primary care practice member perspectives of chronic illness care: a cross-sectional analysis

    PubMed Central

    2014-01-01

Background Little is known as to whether primary care teams’ perceptions of how well they have implemented the Chronic Care Model (CCM) correspond with their patients’ own experience of chronic illness care. We examined the extent to which practice members’ perceptions of how well they organized to deliver care consistent with the CCM were associated with their patients’ perceptions of the chronic illness care they have received. Methods Analysis of baseline measures from a cluster randomized controlled trial testing a practice facilitation intervention to implement the CCM in small, community-based primary care practices. All practice “members” (i.e., physician providers, non-physician providers, and staff) completed the Assessment of Chronic Illness Care (ACIC) survey and adult patients with 1 or more chronic illnesses completed the Patient Assessment of Chronic Illness Care (PACIC) questionnaire. Results Two sets of hierarchical linear regression models accounting for nesting of practice members (N = 283) and patients (N = 1,769) within 39 practices assessed the association between practice member perspectives of CCM implementation (ACIC scores) and patients’ perspectives of CCM (PACIC). The ACIC summary score was not significantly associated with the PACIC summary score or most of the PACIC subscale scores, but four of the ACIC subscales [Self-management Support (p …)] … practice member perspectives when evaluating quality of chronic illness care. Trial registration NCT00482768 PMID:24678983

  18. Bearing capacity analysis using the method of characteristics

    NASA Astrophysics Data System (ADS)

    Sun, Jian-Ping; Zhao, Zhi-Ye; Cheng, Yi-Pik

    2013-04-01

Using the method of characteristics, the bearing capacity of a strip footing is analyzed. The method of characteristics leads to an exact limit load when the calculations of the three terms in the bearing capacity formula are consistent with one collapse mechanism and the soil satisfies the associated flow rule. At the same time, the method of characteristics avoids the assumption of arbitrary slip surfaces, and produces zones within which equilibrium and plastic yield are simultaneously satisfied for given boundary stresses. The exact solution without the superposition approximation can still be expressed by Terzaghi's equation of bearing capacity, in which the bearing capacity factor N_γ depends on a dimensionless parameter λ and the friction angle φ. The influence of groundwater on the bearing capacity of the shallow strip footing is considered, which indicates that when the groundwater effect is taken into account, the error induced by the superposition approximation is reduced compared with the dry-soil condition. The results are presented in the form of charts which give the modified value (N_γ^W / N_γ) of the bearing capacity factor. Finally, an approximate analytical expression, which provides results in close agreement with those obtained by the numerical analysis in this paper, is suggested for practical application purposes.
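Terzaghi's superposed bearing capacity equation referenced above can be sketched as follows. The N_c and N_q factors are the classical Prandtl/Reissner expressions; for N_γ this sketch uses Vesić's closed form, one of several published alternatives, not the exact characteristics solution computed in the paper.

```python
import math

def bearing_capacity_factors(phi_deg):
    """Prandtl/Reissner factors N_c and N_q, plus the Vesic closed form
    for N_gamma (an assumption; several alternative expressions exist)."""
    phi = math.radians(phi_deg)
    Nq = math.exp(math.pi * math.tan(phi)) * math.tan(math.pi/4 + phi/2) ** 2
    Nc = (Nq - 1.0) / math.tan(phi)
    Ng = 2.0 * (Nq + 1.0) * math.tan(phi)
    return Nc, Nq, Ng

def terzaghi_qu(c, q, gamma, B, phi_deg):
    """Superposed ultimate bearing capacity for a strip footing:
    q_u = c*N_c + q*N_q + 0.5*gamma*B*N_gamma."""
    Nc, Nq, Ng = bearing_capacity_factors(phi_deg)
    return c * Nc + q * Nq + 0.5 * gamma * B * Ng
```

For φ = 30° this gives the familiar textbook values N_q ≈ 18.4 and N_c ≈ 30.1, which is a convenient sanity check on the implementation.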

  19. Practical method using superposition of individual magnetic fields for initial arrangement of undulator magnets

    SciTech Connect

    Tsuchiya, K.; Shioya, T.

    2015-04-15

    We have developed a practical method for determining an excellent initial arrangement of magnetic arrays for a pure-magnet Halbach-type undulator. In this method, the longitudinal magnetic field distribution of each magnet is measured using a moving Hall probe system along the beam axis with a high positional resolution. The initial arrangement of magnetic arrays is optimized and selected by analyzing the superposition of all distribution data in order to achieve adequate spectral quality for the undulator. We applied this method to two elliptically polarizing undulators (EPUs), called U#16-2 and U#02-2, at the Photon Factory storage ring (PF ring) in the High Energy Accelerator Research Organization (KEK). The measured field distribution of the undulator was demonstrated to be excellent for the initial arrangement of the magnet array, and this method saved a great deal of effort in adjusting the magnetic fields of EPUs.
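The idea of ranking candidate arrangements by superposing individually measured distributions can be caricatured in a few lines. The cost function here (squared peak-strength mismatch between neighbouring magnets) is an invented stand-in for the paper's spectral figure of merit, and exhaustive search is feasible only for small arrays.

```python
import itertools

def best_arrangement(strengths):
    """Brute-force search for the magnet ordering that minimizes
    peak-strength mismatch between neighbours -- a toy proxy for
    judging superposed field quality. strengths: measured peak field
    of each magnet (arbitrary units)."""
    def cost(order):
        return sum((strengths[a] - strengths[b]) ** 2
                   for a, b in zip(order, order[1:]))
    return min(itertools.permutations(range(len(strengths))), key=cost)
```

For this particular cost, the optimum simply sorts the magnets by measured strength; a realistic undulator figure of merit would instead score the full superposed field profile, as the paper does.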

  20. Initial analysis of space target's stealth methods at laser wavelengths

    NASA Astrophysics Data System (ADS)

    Du, Haitao; Han, Yi; Sun, Huayan; Zhang, Tinghua

    2014-12-01

Laser stealth for space targets is of practical use, importance, and urgency. This paper introduces the defining expression of the laser radar cross section (LRCS) and the general laws governing the factors that influence a space target's LRCS, including surface material types and the target's shape and size. It then discusses possible laser stealth methods for space targets in practical applications from two viewpoints: material-based and shape-based stealth methods. These conclusions and suggestions can serve as references for future research directions and methods in target laser stealth.

  1. Visceral fat estimation method by bioelectrical impedance analysis and causal analysis

    NASA Astrophysics Data System (ADS)

    Nakajima, Hiroshi; Tasaki, Hiroshi; Tsuchiya, Naoki; Hamaguchi, Takehiro; Shiga, Toshikazu

    2011-06-01

It has been clarified that abdominal visceral fat accumulation is closely associated with lifestyle diseases and metabolic syndrome. The gold standard in medical fields is the visceral fat area measured by an X-ray computed tomography (CT) scan or magnetic resonance imaging. However, these measurements are highly invasive and costly, and a CT scan in particular causes X-ray exposure. This is why medical fields need an instrument for visceral fat measurement that is minimally invasive, easy to use, and inexpensive. The article proposes a simple and practical method of visceral fat estimation employing bioelectrical impedance analysis and causal analysis. In the method, the abdominal shape and the dual impedances of the abdominal surface and the whole body are measured to estimate the visceral fat area based on a cause-effect structure. The structure is designed according to the nature of abdominal body composition and fine-tuned by statistical analysis. Experiments were conducted to investigate the proposed model: 180 subjects were recruited and measured by both a CT scan and the proposed method. The acquired model explained the measurement principle well, and the correlation coefficient with the CT scan measurements is 0.88.
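The statistical fine-tuning step can be sketched as an ordinary least-squares fit of CT-measured visceral fat area against impedance and shape features. The feature choice and the synthetic data below are illustrative assumptions, not the authors' actual model.

```python
def fit_linear(X, y):
    """Ordinary least squares via the normal equations (X^T X) b = X^T y.
    X: list of feature rows, e.g. [1.0, surface_impedance, waist_area]
    (the leading 1.0 provides an intercept). Returns coefficient list b."""
    n = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(XtX[r][col]))
        XtX[col], XtX[piv] = XtX[piv], XtX[col]
        Xty[col], Xty[piv] = Xty[piv], Xty[col]
        for r in range(col + 1, n):
            f = XtX[r][col] / XtX[col][col]
            for c in range(col, n):
                XtX[r][c] -= f * XtX[col][c]
            Xty[r] -= f * Xty[col]
    # Back substitution
    b = [0.0] * n
    for i in range(n - 1, -1, -1):
        b[i] = (Xty[i] - sum(XtX[i][j] * b[j] for j in range(i + 1, n))) / XtX[i][i]
    return b
```

In the paper's setting, such a fit would be evaluated by the correlation between predicted and CT-measured visceral fat area, as in the reported 0.88.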

  2. Newborn Hearing Screening: An Analysis of Current Practices

    ERIC Educational Resources Information Center

    Houston, K. Todd; Bradham, Tamala S.; Munoz, Karen F.; Guignard, Gayla Hutsell

    2011-01-01

    State coordinators of early hearing detection and intervention (EHDI) programs completed a strengths, weaknesses, opportunities, and threats, or SWOT, analysis that consisted of 12 evaluative areas of EHDI programs. For the newborn hearing screening area, a total of 293 items were listed by 49 EHDI coordinators, and themes were identified within…

  3. Suspension, Race, and Disability: Analysis of Statewide Practices and Reporting

    ERIC Educational Resources Information Center

    Krezmien, Michael P.; Leone, Peter E.; Achilles, Georgianna M.

    2006-01-01

This analysis of statewide suspension data from 1995 to 2003 in Maryland investigated disproportionate suspensions of minority students and students with disabilities. We found substantial increases in overall rates of suspensions from 1995 to 2003, as well as disproportionate rates of suspensions for African American students, American Indian…

  4. The Analysis of Athletic Performance: Some Practical and Philosophical Considerations

    ERIC Educational Resources Information Center

    Nelson, Lee J.; Groom, Ryan

    2012-01-01

    This article presents a hypothetical dialogue between a notational analyst (NA) recently schooled in the positivistic assessment of athletic performance, an "old-school" traditional coach (TC) who favours subjective analysis, and a pragmatic educator (PE). The conversation opens with NA and TC debating the respective value of quantitative and…

  5. Integrated force method versus displacement method for finite element analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Berke, Laszlo; Gallagher, Richard H.

    1990-01-01

A novel formulation termed the integrated force method (IFM) has been developed in recent years for analyzing structures. In this method all the internal forces are taken as independent variables, and the system equilibrium equations (EE's) are integrated with the global compatibility conditions (CC's) to form the governing set of equations. In IFM the CC's are obtained from the strain formulation of St. Venant, and no choices of redundant load systems have to be made, in contrast to the standard force method (SFM). This property of IFM allows the generation of the governing equation to be automated straightforwardly, as it is in the popular stiffness method (SM). In this report IFM and SM are compared relative to the structure of their respective equations, their conditioning, required solution methods, overall computational requirements, and convergence properties as these factors influence the accuracy of the results. Overall, this new version of the force method produces more accurate results than the stiffness method for comparable computational cost.
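The force-method pattern of stacking equilibrium and compatibility rows into one system with internal forces as unknowns can be shown on a one-node spring example (an illustrative toy, not taken from the report): a node loaded by P between two grounded springs of stiffness k1 and k2.

```python
def ifm_spring_node(k1, k2, P):
    """Force-method solution of a 1-DOF example with spring forces
    F1, F2 as the unknowns, combining:
      row 1 (equilibrium):    F1 - F2 = P
      row 2 (compatibility):  F1/k1 + F2/k2 = 0   (zero net elongation)
    Solved here by Cramer's rule on the 2x2 system."""
    a11, a12, b1 = 1.0, -1.0, P
    a21, a22, b2 = 1.0 / k1, 1.0 / k2, 0.0
    det = a11 * a22 - a12 * a21
    F1 = (b1 * a22 - a12 * b2) / det
    F2 = (a11 * b2 - b1 * a21) / det
    return F1, F2
```

A cross-check against the displacement (stiffness) method: the node displacement recovered as u = F1/k1 must equal P/(k1 + k2), the stiffness-method answer.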

  6. Three-dimensional Stress Analysis Using the Boundary Element Method

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Banerjee, P. K.

    1984-01-01

    The boundary element method is to be extended (as part of the NASA Inelastic Analysis Methods program) to the three-dimensional stress analysis of gas turbine engine hot section components. The analytical basis of the method (as developed in elasticity) is outlined, its numerical implementation is summarized, and the approaches to be followed in extending the method to include inelastic material response indicated.

  7. [Bloodstain pattern analysis on examples from practice: Are calculations with application parabolic trajectory usable?].

    PubMed

    Makovický, Peter; Matlach, Radek; Pokorná, Olga; Mošna, František; Makovický, Pavol

    2015-01-01

Bloodstain pattern analysis (BPA) is useful in forensic medicine, yet the method is not commonly used in Czech and Slovak criminology. The objective of this work is to calculate the impact length, height, and splashing distance of blood drops, and to compare the results with the real values for specific cases. The calculation of the angle of incidence of blood drops using sin α is also compared with a form using tan α. For this purpose we used two cases of different character from practice, in well-preserved condition and with readable bloodstains. Selected bloodstains were documented in order to calculate the angle of incidence of the blood drops and the origin of the splashes: for each drop, the impact distance (x), the height of the sprayed blood drop (y), and the length of its flight path (l). The obtained data were retrospectively analysed under two models: a straight-line model represented by a triangle (M1) and a parabolic model (M2). The formulae were derived using the Euler substitution. The results show that the angle of incidence of a blood drop can be calculated with either sin α or tan α: the triangle model is appropriate with sin α, whereas the parabolic model requires calculating the angle of incidence with tan α. The parabola is useful for BPA. Workplace training seminars on BPA, intended primarily for forensic investigators, should be provided in the Czech and Slovak Republics. We recommend the use of this method during investigations and for the verification of acts in forensic practice. PMID:26585307
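The two angle relations being compared can be sketched generically (this is standard BPA geometry, not the paper's Euler-substitution formulae): the impact angle follows from the stain's width-to-length ratio, and the triangle (straight-line) model then places the origin height with the tangent.

```python
import math

def impact_angle_deg(width, length):
    """Classic BPA relation: an elliptical stain of width w and
    length l gives impact angle alpha = arcsin(w / l)."""
    return math.degrees(math.asin(width / length))

def straight_line_height(distance, alpha_deg):
    """Triangle (M1) model: the drop is assumed to travel in a
    straight line, so origin height = horizontal distance * tan(alpha)."""
    return distance * math.tan(math.radians(alpha_deg))
```

A circular stain (width equal to length) corresponds to a 90° impact; a stain twice as long as it is wide corresponds to 30°.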

  8. Integration of Formal Job Hazard Analysis & ALARA Work Practice

    SciTech Connect

    NELSEN, D.P.

    2002-09-01

    ALARA work practices have traditionally centered on reducing radiological exposure and controlling contamination. As such, ALARA policies and procedures are not well suited to a wide range of chemical and human health issues. Assessing relative risk, identifying appropriate engineering/administrative controls and selecting proper Personal Protective Equipment (PPE) for non nuclear work activities extends beyond the limitations of traditional ALARA programs. Forging a comprehensive safety management program in today's (2002) work environment requires a disciplined dialog between health and safety professionals (e.g. safety, engineering, environmental, quality assurance, industrial hygiene, ALARA, etc.) and personnel working in the field. Integrating organizational priorities, maintaining effective pre-planning of work and supporting a team-based approach to safety management represents today's hallmark of safety excellence. Relying on the mandates of any single safety program does not provide industrial hygiene with the tools necessary to implement an integrated safety program. The establishment of tools and processes capable of sustaining a comprehensive safety program represents a key responsibility of industrial hygiene. Fluor Hanford has built integrated safety management around three programmatic attributes: (1) Integration of radiological, chemical and ergonomic issues under a single program. (2) Continuous improvement in routine communications among work planning/scheduling, job execution and management. (3) Rapid response to changing work conditions, formalized work planning and integrated worker involvement.

  9. Methods for analysis of fluoroquinolones in biological fluids

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Methods for analysis of 10 selected fluoroquinolone antibiotics in biological fluids are reviewed. Approaches for sample preparation, detection methods, limits of detection and quantitation and recovery information are provided for both single analyte and multi-analyte fluoroquinolone methods....

  10. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.

  11. Benthic macroinvertebrates in lake ecological assessment: A review of methods, intercalibration and practical recommendations.

    PubMed

Poikane, Sandra; Johnson, Richard K; Sandin, Leonard; Schartau, Ann Kristin; Solimini, Angelo G; Urbanič, Gorazd; Arbačiauskas, Kęstutis; Aroviita, Jukka; Gabriels, Wim; Miler, Oliver; Pusch, Martin T; Timm, Henn; Böhmer, Jürgen

    2016-02-01

Legislation in Europe has been adopted to determine and improve the ecological integrity of inland and coastal waters. Assessment is based on four biotic groups, including benthic macroinvertebrate communities. For lakes, benthic invertebrates have been recognized as one of the most difficult organism groups to use, and hitherto their use in ecological assessment has been limited. In this study, we review and intercalibrate 13 benthic invertebrate-based tools across Europe. These assessment tools address different human impacts: acidification (3 methods), eutrophication (3 methods), morphological alterations (2 methods), and a combination of the last two (5 methods). For intercalibration, the methods were grouped into four intercalibration groups according to the habitat sampled and the putative pressure. Boundaries of 'good ecological status' were compared and harmonized using direct or indirect comparison approaches. To enable indirect comparison of the methods, three common pressure indices and two common biological multimetric indices were developed for larger geographical areas. Additionally, we identified the best-performing methods based on their responsiveness to different human impacts. Based on these experiences, we provide practical recommendations for the development and harmonization of benthic invertebrate assessment methods in lakes and similar habitats. PMID:26580734

  12. Measuring Racial/Ethnic Disparities in Health Care: Methods and Practical Issues

    PubMed Central

Cook, Benjamin Lê; McGuire, Thomas G; Zaslavsky, Alan M

    2012-01-01

    Objective To review methods of measuring racial/ethnic health care disparities. Study Design Identification and tracking of racial/ethnic disparities in health care will be advanced by application of a consistent definition and reliable empirical methods. We have proposed a definition of racial/ethnic health care disparities based in the Institute of Medicine's (IOM) Unequal Treatment report, which defines disparities as all differences except those due to clinical need and preferences. After briefly summarizing the strengths and critiques of this definition, we review methods that have been used to implement it. We discuss practical issues that arise during implementation and expand these methods to identify sources of disparities. We also situate the focus on methods to measure racial/ethnic health care disparities (an endeavor predominant in the United States) within a larger international literature in health outcomes and health care inequality. Empirical Application We compare different methods of implementing the IOM definition on measurement of disparities in any use of mental health care and mental health care expenditures using the 2004–2008 Medical Expenditure Panel Survey. Conclusion Disparities analysts should be aware of multiple methods available to measure disparities and their differing assumptions. We prefer a method concordant with the IOM definition. PMID:22353147

  13. The influence of deliberate practice on musical achievement: a meta-analysis.

    PubMed

    Platz, Friedrich; Kopiez, Reinhard; Lehmann, Andreas C; Wolf, Anna

    2014-01-01

Deliberate practice (DP) is a task-specific structured training activity that plays a key role in understanding skill acquisition and explaining individual differences in expert performance. Relevant activities that qualify as DP have to be identified in every domain. For example, for training in classical music, solitary practice is a typical training activity during skill acquisition. To date, no meta-analysis on the quantifiable effect size of deliberate practice on attained performance in music has been conducted. Yet the identification of a quantifiable effect size could be relevant for the current discussion on the role of various factors on individual difference in musical achievement. Furthermore, a research synthesis might enable new computational approaches to musical development. Here we present the first meta-analysis on the role of deliberate practice in the domain of musical performance. A final sample size of 13 studies (total N = 788) was carefully extracted to satisfy the following criteria: reported durations of task-specific accumulated practice as predictor variables and objectively assessed musical achievement as the target variable. We identified an aggregated effect size of rc = 0.61; 95% CI [0.54, 0.67] for the relationship between task-relevant practice (which by definition includes DP) and musical achievement. Our results corroborate the central role of long-term (deliberate) practice for explaining expert performance in music. PMID:25018742

  14. The influence of deliberate practice on musical achievement: a meta-analysis

    PubMed Central

    Platz, Friedrich; Kopiez, Reinhard; Lehmann, Andreas C.; Wolf, Anna

    2014-01-01

    Deliberate practice (DP) is a task-specific structured training activity that plays a key role in understanding skill acquisition and explaining individual differences in expert performance. Relevant activities that qualify as DP have to be identified in every domain. For example, for training in classical music, solitary practice is a typical training activity during skill acquisition. To date, no meta-analysis on the quantifiable effect size of deliberate practice on attained performance in music has been conducted. Yet the identification of a quantifiable effect size could be relevant for the current discussion on the role of various factors on individual difference in musical achievement. Furthermore, a research synthesis might enable new computational approaches to musical development. Here we present the first meta-analysis on the role of deliberate practice in the domain of musical performance. A final sample size of 13 studies (total N = 788) was carefully extracted to satisfy the following criteria: reported durations of task-specific accumulated practice as predictor variables and objectively assessed musical achievement as the target variable. We identified an aggregated effect size of rc = 0.61; 95% CI [0.54, 0.67] for the relationship between task-relevant practice (which by definition includes DP) and musical achievement. Our results corroborate the central role of long-term (deliberate) practice for explaining expert performance in music. PMID:25018742
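An aggregated correlation such as the rc = 0.61 above can be reproduced in outline with Fisher's z transform; this is a fixed-effect sketch with inverse-variance weights, and the study's own aggregation procedure may differ.

```python
import math

def pooled_correlation(rs, ns):
    """Aggregate per-study correlations via Fisher's z transform,
    weighting each study by n - 3 (the inverse variance of z),
    then back-transforming the weighted mean to the r scale.

    rs: per-study correlation coefficients
    ns: per-study sample sizes
    """
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]   # r -> z
    ws = [n - 3 for n in ns]                               # inverse-variance weights
    zbar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    return math.tanh(zbar)                                 # z -> r
```

Because the transform is nonlinear, the pooled r generally differs slightly from the naive weighted mean of the raw correlations, which is why meta-analyses work on the z scale.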

  15. New practice guidelines aim to put teeth in the root cause analysis process.

    PubMed

    2015-09-01

    While hospitals have been using root cause analysis (RCA) to identify the reasons for problems and errors for many years, experts note that the results of these efforts have been uneven at best. To improve the RCA process, a team of experts from the National Patient Safety Foundation (NPSF) have assembled best-practice guidelines to both standardize the RCA process and guide organizations in their improvement efforts. Further, they have renamed the process RCA squared or RCA2 to emphasize the need for action steps once an analysis is completed. Report authors say prioritization methods need to be devised so that near-misses and close calls receive more attention from RCA2 teams. RCA2 teams should be nimble, including four to six members, one of whom is a patient representative. When problems or errors emerge, the RCA2 process should commence within 72 hours, and the RCA2 team should complete its investigative work in 30 to 45 days. Experts say causal statements should outline what the solutions to a problem or error should be. PMID:26389153

  16. Practical exergy analysis of centrifugal compressor performance using ASME-PTC-10 data

    SciTech Connect

    Carranti, F.J.

    1997-07-01

It has been shown that the measures of performance currently in use for industrial and process compressors do not give a true measure of energy utilization, and that the required assumptions of isentropic or adiabatic behavior are not always valid. A better indication of machine or process performance can be achieved using exergetic (second-law) efficiencies and by employing the second law of thermodynamics to indicate the nature of irreversibilities and entropy generation in the compression process. In this type of analysis, performance is related to an environmental equilibrium condition, or dead state. Often, the differences between avoidable and unavoidable irreversibilities can be interpreted from these results. A general overview of the techniques involved in exergy analysis as applied to compressors and blowers is presented. A practical method that allows the calculation of exergetic efficiencies by manufacturers and end users is demonstrated using data from ASME Power Test Code input. These data are often readily available from compressor manufacturers for both design and off-design conditions, or can sometimes be obtained from field measurements. The calculations involved are simple and straightforward, and can demonstrate the energy usage situation for a variety of conditions. Here, off-design means at different rates of flow as well as at different environmental states. The techniques presented are also applicable to many other equipment and process types.
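The exergetic efficiency described here can be sketched for an adiabatic compressor on an ideal-gas basis. The air properties (cp, R in kJ/kg·K) and the simple dead-state treatment are assumptions for illustration, not the Power Test Code procedure itself.

```python
import math

def exergetic_efficiency(T1, T2, p1, p2, T0, cp=1.004, R=0.287):
    """Second-law efficiency of an adiabatic ideal-gas compressor
    (per-kg basis): flow-exergy gain of the gas over actual work input.

    T1, T2: inlet/outlet temperatures [K]; p1, p2: pressures [kPa or any
    consistent unit]; T0: dead-state (environment) temperature [K]."""
    w_actual = cp * (T2 - T1)                              # energy balance, adiabatic
    ds = cp * math.log(T2 / T1) - R * math.log(p2 / p1)    # specific entropy change
    dpsi = w_actual - T0 * ds                              # flow-exergy increase
    return dpsi / w_actual
```

For a reversible (isentropic) compression the entropy change vanishes and the efficiency tends to one; any extra outlet temperature at the same pressure ratio reflects entropy generation and pushes the efficiency below one.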

  17. Standard practices for dissolving glass containing radioactive and mixed waste for chemical and radiochemical analysis

    E-print Network

    American Society for Testing and Materials. Philadelphia

    2000-01-01

    1.1 These practices cover techniques suitable for dissolving glass samples that may contain nuclear wastes. These techniques used together or independently will produce solutions that can be analyzed by inductively coupled plasma atomic emission spectroscopy (ICP-AES), inductively coupled plasma mass spectrometry (ICP-MS), atomic absorption spectrometry (AAS), radiochemical methods and wet chemical techniques for major components, minor components and radionuclides. 1.2 One of the fusion practices and the microwave practice can be used in hot cells and shielded hoods after modification to meet local operational requirements. 1.3 The user of these practices must follow radiation protection guidelines in place for their specific laboratories. 1.4 Additional information relating to safety is included in the text. 1.5 The dissolution techniques described in these practices can be used for quality control of the feed materials and the product of plants vitrifying nuclear waste materials in glass. 1.6 These pr...

  18. Who's in and why? A typology of stakeholder analysis methods for natural resource management.

    PubMed

    Reed, Mark S; Graves, Anil; Dandy, Norman; Posthumus, Helena; Hubacek, Klaus; Morris, Joe; Prell, Christina; Quinn, Claire H; Stringer, Lindsay C

    2009-04-01

    Stakeholder analysis means many things to different people. Various methods and approaches have been developed in different fields for different purposes, leading to confusion over the concept and practice of stakeholder analysis. This paper asks how and why stakeholder analysis should be conducted for participatory natural resource management research. This is achieved by reviewing the development of stakeholder analysis in business management, development and natural resource management. The normative and instrumental theoretical basis for stakeholder analysis is discussed, and a stakeholder analysis typology is proposed. This consists of methods for: i) identifying stakeholders; ii) differentiating between and categorising stakeholders; and iii) investigating relationships between stakeholders. The range of methods that can be used to carry out each type of analysis is reviewed. These methods and approaches are then illustrated through a series of case studies funded through the Rural Economy and Land Use (RELU) programme. These case studies show the wide range of participatory and non-participatory methods that can be used, and discuss some of the challenges and limitations of existing methods for stakeholder analysis. The case studies also propose new tools and combinations of methods that can more effectively identify and categorise stakeholders and help understand their inter-relationships. PMID:19231064
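    One widely used tool for the "differentiating and categorising" step is an interest-influence matrix. The sketch below is an illustrative implementation with hypothetical stakeholder names and an arbitrary 0.5 threshold, not a method taken from the paper:

```python
def categorise(stakeholders, threshold=0.5):
    """Classify stakeholders on an interest-influence matrix.
    `stakeholders` is a list of (name, interest, influence) tuples
    with scores in [0, 1]; the quadrant labels follow the common
    key players / context setters / subjects / crowd scheme."""
    out = {}
    for name, interest, influence in stakeholders:
        if interest >= threshold and influence >= threshold:
            out[name] = "key player"
        elif influence >= threshold:
            out[name] = "context setter"
        elif interest >= threshold:
            out[name] = "subject"
        else:
            out[name] = "crowd"
    return out

# Hypothetical scores for a natural resource management case:
groups = categorise([
    ("farmers", 0.9, 0.3),
    ("regulator", 0.7, 0.8),
    ("tourists", 0.2, 0.1),
])
```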

  19. Degradation of learned skills. Effectiveness of practice methods on simulated space flight skill retention

    NASA Technical Reports Server (NTRS)

    Sitterley, T. E.; Berge, W. A.

    1972-01-01

    Manual flight control and emergency procedure task skill degradation was evaluated after time intervals of from 1 to 6 months. The tasks were associated with a simulated launch through the orbit insertion flight phase of a space vehicle. The results showed that acceptable flight control performance was retained for 2 months, rapidly deteriorating thereafter by a factor of 1.7 to 3.1 depending on the performance measure used. Procedural task performance showed unacceptable degradation after only 1 month, and exceeded an order of magnitude after 4 months. The effectiveness of static rehearsal (checklists and briefings) and dynamic warmup (simulator practice) retraining methods were compared for the two tasks. Static rehearsal effectively countered procedural skill degradation, while some combination of dynamic warmup appeared necessary for flight control skill retention. It was apparent that these differences between methods were not solely a function of task type or retraining method, but were a function of the performance measures used for each task.

  20. The uniform asymptotic swallowtail approximation - Practical methods for oscillating integrals with four coalescing saddle points

    NASA Technical Reports Server (NTRS)

    Connor, J. N. L.; Curtis, P. R.; Farrelly, D.

    1984-01-01

    Methods that can be used in the numerical implementation of the uniform swallowtail approximation are described. An explicit expression for that approximation is presented to the lowest order, showing that there are three problems which must be overcome in practice before the approximation can be applied to any given problem. It is shown that a recently developed quadrature method can be used for the accurate numerical evaluation of the swallowtail canonical integral and its partial derivatives. Isometric plots of these are presented to illustrate some of their properties. The problem of obtaining the arguments of the swallowtail integral from an analytical function of its argument is considered, describing two methods of solving this problem. The asymptotic evaluation of the butterfly canonical integral is addressed.

  1. Echinacea purpurea: Pharmacology, phytochemistry and analysis methods

    PubMed Central

    Manayi, Azadeh; Vazirian, Mahdi; Saeidnia, Soodabeh

    2015-01-01

    Echinacea purpurea (Asteraceae) is a perennial medicinal herb with important immunostimulatory and anti-inflammatory properties, especially the alleviation of cold symptoms. The plant also attracted scientists’ attention to assess other aspects of its beneficial effects. For instance, antianxiety, antidepression, cytotoxicity, and antimutagenicity as induced by the plant have been revealed in various studies. The findings of the clinical trials are controversial in terms of side effects. While some studies revealed the beneficial effects of the plant on the patients and no severe adverse effects, some others have reported serious side effects including abdominal pain, angioedema, dyspnea, nausea, pruritus, rash, erythema, and urticaria. Other biological activities of the plant such as antioxidant, antibacterial, antiviral, and larvicidal activities have been reported in previous experimental studies. Different classes of secondary metabolites of the plant such as alkamides, caffeic acid derivatives, polysaccharides, and glycoproteins are believed to be biologically and pharmacologically active. Concurrent determination and single-compound analysis of cichoric acid and alkamides have been successfully developed, mainly by using high-performance liquid chromatography (HPLC) coupled with different detectors including UV spectrophotometric, coulometric electrochemical, and electrospray ionization mass spectrometric detectors. These partly contradictory results show that, despite the many experiments already performed on E. purpurea, many questions remain unanswered, and future investigations should aim for a complete understanding of the plant's mechanism of action using new, complementary methods. PMID:26009695

  2. Why and How Do Nursing Homes Implement Culture Change Practices? Insights from Qualitative Interviews in a Mixed Methods Study

    PubMed Central

    Shield, Renée R.; Looze, Jessica; Tyler, Denise; Lepore, Michael; Miller, Susan C.

    2015-01-01

    Objective To understand the process of instituting culture change (CC) practices in nursing homes (NHs). Methods NH Directors of Nursing (DONs) and Administrators (NHAs) at 4,149 United States NHs were surveyed about CC practices. Follow-up interviews with 64 NHAs were conducted and analyzed by a multidisciplinary team which reconciled interpretations recorded in an audit trail. Results The themes include: 1) Reasons for implementing CC practices vary; 2) NH approaches to implementing CC practices are diverse; 3) NHs consider resident mix in deciding to implement practices; 4) NHAs note benefits and few implementation costs of implementing CC practices; 5) Implementation of changes is challenging and strategies for change are tailored to the challenges encountered; 6) Education and communication efforts are vital ways to institute change; and 7) NHA and other staff leadership is key to implementing changes. Discussion Diverse strategies and leadership skills appear to help NHs implement reform practices, including CC innovations. PMID:24652888

  3. [The prospects for the application of the immunohistochemical methods for the establishment of intravitality and prescription of the mechanical injuries in forensic medical practice].

    PubMed

    Bogomolov, D V; Bogomolova, I N; Zavalishina, L É; Kovalev, A V; Kul'bitskii, B N; Fedulova, M V

    2014-01-01

    The objective of the present work was the analysis of the literature concerning the application of the immunohistochemical methods for the improvement of diagnostics of intravitality and prescription of the mechanical injuries in forensic medical practice. Special attention is given to the examples of publication dealing with the methods for addressing this issue. The most promising areas of the application of immunohistochemical methods are considered. They are exemplified by the use of specific antibodies for the establishment of intravitality and prescription of the mechanical injuries. The possibility of using the presence of fibrinogen in the pulmonary alveoli as the marker of prolonged strangulation is illustrated. The results of this literature review provided a basis for the conclusion about good prospects of the application of the immunohistochemical methods with the purpose of establishing intravitality and prescription of the mechanical injuries in forensic medical practice. PMID:25764882

  4. Primary prevention in general practice – views of German general practitioners: a mixed-methods study

    PubMed Central

    2014-01-01

    Background Policy efforts focus on a reorientation of health care systems towards primary prevention. To guide such efforts, we analyzed the role of primary prevention in general practice and general practitioners’ (GPs) attitudes toward primary prevention. Methods Mixed-method study including a cross-sectional survey of all community-based GPs and focus groups in a sample of GPs who collaborated with the Institute of General Practice in Berlin, Germany in 2011. Of 1168 GPs 474 returned the mail survey. Fifteen GPs participated in focus group discussions. Survey and interview guidelines were developed and tested to assess and discuss beliefs, attitudes, and practices regarding primary prevention. Results Most respondents considered primary prevention within their realm of responsibility (70%). Primary prevention, especially physical activity, healthy eating, and smoking cessation, was part of the GPs’ health care recommendations if they thought it was indicated. Still a quarter of survey respondents discussed reduction of alcohol consumption with their patients infrequently even when they thought it was indicated. Similarly 18% claimed that they discuss smoking cessation only sometimes. The focus groups revealed that GPs were concerned about the detrimental effects an uninvited health behavior suggestion could have on patients and were hesitant to take on the role of “health policing”. GPs saw primary prevention as the responsibility of multiple actors in a network of societal and municipal institutions. Conclusions The mixed-method study showed that primary prevention approaches such as lifestyle counseling is not well established in primary care. GPs used a selective approach to offer preventive advice based upon indication. GPs had a strong sense that a universal prevention approach carried the potential to destroy a good patient-physician relationship. Other approaches to public health may be warranted such as a multisectoral approach to population health. 
This type of restructuring of the health care sector may benefit patients who are unable to afford specific prevention programmes and who have competing demands that hinder their ability to focus on behavior change. PMID:24885100

  5. Best Practices for Finite Element Analysis of Spent Nuclear Fuel Transfer, Storage, and Transportation Systems

    SciTech Connect

    Bajwa, Christopher S.; Piotter, Jason; Cuta, Judith M.; Adkins, Harold E.; Klymyshyn, Nicholas A.; Fort, James A.; Suffield, Sarah R.

    2010-08-11

    Storage casks and transportation packages for spent nuclear fuel (SNF) are designed to confine SNF in sealed canisters or casks, provide structural integrity during accidents, and remove decay heat through a storage or transportation overpack. The transfer, storage, and transportation of SNF in dry storage casks and transport packages are regulated under 10 CFR Part 72 and 10 CFR Part 71, respectively. Finite Element Analysis (FEA) is used with increasing frequency in Safety Analysis Reports and other regulatory technical evaluations related to SNF casks and packages and their associated systems. Advances in computing power have made increasingly sophisticated FEA models more feasible, and as a result, the need for careful review of such models has also increased. This paper identifies best practice recommendations that stem from recent NRC review experience. The scope covers issues common to all commercially available FEA software, and the recommendations are applicable to any FEA software package. Three specific topics are addressed: general FEA practices, issues specific to thermal analyses, and issues specific to structural analyses. General FEA practices covers appropriate documentation of the model and results, which is important for an efficient review process. The thermal analysis best practices are related to cask analysis for steady state conditions and transient scenarios. The structural analysis best practices are related to the analysis of casks and associated payload during standard handling and drop scenarios. The best practices described in this paper are intended to identify FEA modeling issues and provide insights that can help minimize associated uncertainties and errors, in order to facilitate the NRC licensing review process.
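    A general FEA best practice implied by such recommendations is verifying a model against a case with a known closed-form solution. As an illustrative sketch (not taken from the paper), a minimal 1-D steady-state conduction model with fixed end temperatures should reproduce the exact linear temperature profile:

```python
def solve_1d_conduction(n_elem, length, k, T_left, T_right):
    """Steady 1-D heat conduction with linear two-node elements.
    Assembles the tridiagonal global conductance matrix, imposes
    Dirichlet end temperatures, and solves with the Thomas algorithm.
    For uniform conductivity the exact solution is linear, so nodal
    temperatures provide a clean verification check."""
    n = n_elem + 1
    ke = k * n_elem / length           # element conductance k/h
    main = [0.0] * n
    lower = [0.0] * (n - 1)            # sub-diagonal
    upper = [0.0] * (n - 1)            # super-diagonal
    rhs = [0.0] * n
    for e in range(n_elem):            # element matrix [[ke, -ke], [-ke, ke]]
        main[e] += ke
        main[e + 1] += ke
        lower[e] -= ke
        upper[e] -= ke
    # fixed-temperature (Dirichlet) boundary nodes by row replacement
    main[0], upper[0], rhs[0] = 1.0, 0.0, T_left
    main[-1], lower[-1], rhs[-1] = 1.0, 0.0, T_right
    # Thomas algorithm (tridiagonal solve)
    for i in range(1, n):
        m = lower[i - 1] / main[i - 1]
        main[i] -= m * upper[i - 1]
        rhs[i] -= m * rhs[i - 1]
    T = [0.0] * n
    T[-1] = rhs[-1] / main[-1]
    for i in range(n - 2, -1, -1):
        T[i] = (rhs[i] - upper[i] * T[i + 1]) / main[i]
    return T

# Hypothetical uniform bar: FEA should reproduce the exact linear profile.
temps = solve_1d_conduction(n_elem=10, length=1.0, k=50.0,
                            T_left=400.0, T_right=300.0)
```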

  6. Method Development for Analysis of Aspirin Tablets.

    ERIC Educational Resources Information Center

    Street, Kenneth W., Jr.

    1988-01-01

    Develops a lab experiment for introductory instrumental analysis that requires interference studies and optimizing of conditions. Notes the analysis of the aspirin is by visible spectrophotometric assay. Gives experimental details and discussion. (MVL)
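    A visible spectrophotometric assay of this kind typically rests on a Beer's-law calibration line fitted to standards. The sketch below uses hypothetical absorbance data, not values from the experiment:

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = slope*x + intercept."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Hypothetical standards: concentration (mM) vs absorbance (Beer's law
# predicts A = eps*b*C, i.e. a line through roughly zero).
conc = [0.0, 0.2, 0.4, 0.6, 0.8]
absb = [0.01, 0.21, 0.40, 0.62, 0.80]
m, b = fit_line(conc, absb)
unknown = (0.50 - b) / m   # concentration of an unknown with A = 0.50
```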

  7. Uncertainty Analysis by the "Worst Case" Method.

    ERIC Educational Resources Information Center

    Gordon, Roy; And Others

    1984-01-01

    Presents a new method of uncertainty propagation which concentrates on the calculation of upper and lower limits (the "worst cases"), bypassing absolute and relative uncertainties. Includes advantages of this method and its use in freshmen laboratories, advantages of the traditional method, and a numerical example done by both methods. (JN)
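    The worst-case idea can be sketched directly: evaluate the result at every corner of the input intervals and report the extremes. This is exact only for functions monotonic in each input over the interval; the numbers below are hypothetical:

```python
from itertools import product

def worst_case(f, *intervals):
    """Propagate uncertainty by evaluating f at every corner of the
    input intervals and returning the extreme results (the 'worst
    cases').  Valid when f is monotonic in each argument over its
    interval; otherwise interior extrema can be missed."""
    values = [f(*corner) for corner in product(*intervals)]
    return min(values), max(values)

# Hypothetical freshman-lab example: density = m / V with
# m = 25.4 +/- 0.1 g and V = 10.0 +/- 0.2 mL.
lo, hi = worst_case(lambda m, V: m / V, (25.3, 25.5), (9.8, 10.2))
```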

  8. Practical estimates of field-saturated hydraulic conductivity of bedrock outcrops using a modified bottomless bucket method

    USGS Publications Warehouse

    Mirus, Benjamin B.; Perkins, Kim S.

    2012-01-01

    The bottomless bucket (BB) approach (Nimmo et al., 2009a) is a cost-effective method for rapidly characterizing field-saturated hydraulic conductivity Kfs of soils and alluvial deposits. This practical approach is of particular value for quantifying infiltration rates in remote areas with limited accessibility. A similar approach for bedrock outcrops is also of great value for improving quantitative understanding of infiltration and recharge in rugged terrain. We develop a simple modification to the BB method for application to bedrock outcrops, which uses a non-toxic, quick-drying silicone gel to seal the BB to the bedrock. These modifications to the field method require only minor changes to the analytical solution for calculating Kfs on soils. We investigate the reproducibility of the method with laboratory experiments on a previously studied calcarenite rock and conduct a sensitivity analysis to quantify uncertainty in our predictions. We apply the BB method on both bedrock and soil for sites on Pahute Mesa, which is located in a remote area of the Nevada National Security Site. The bedrock BB tests may require monitoring over several hours to days, depending on infiltration rates, which necessitates a cover to prevent evaporative losses. Our field and laboratory results compare well to Kfs values inferred from independent reports, which suggests the modified BB method can provide useful estimates and facilitate simple hypothesis testing. The ease with which the bedrock BB method can be deployed should facilitate more rapid in-situ data collection than is possible with alternative methods for quantitative characterization of infiltration into bedrock.
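    For orientation only, a textbook falling-head estimate looks like the sketch below; it ignores the lateral spreading and capillarity corrections of the published BB analysis, so it is not the Nimmo et al. solution, and the field values are hypothetical:

```python
import math

def falling_head_K(h0, hf, dt, L):
    """Textbook falling-head estimate of saturated hydraulic
    conductivity: K = (L/dt) * ln(h0/hf), where h0 and hf are the
    initial and final ponded heads (m), dt the elapsed time (s), and
    L the flow-path length (m).  A rough sketch only: it neglects
    lateral flow and sorptivity, which the BB analysis accounts for."""
    return (L / dt) * math.log(h0 / hf)

# Hypothetical field test: head falls from 20 cm to 5 cm in one hour
# through a 10 cm flow path.
K = falling_head_K(h0=0.20, hf=0.05, dt=3600.0, L=0.10)  # m/s
```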

  9. A practical adaptive-grid method for complex fluid-flow problems

    NASA Technical Reports Server (NTRS)

    Nakahashi, K.; Deiwert, G. S.

    1984-01-01

    A practical solution-adaptive grid method utilizing a tension and torsion spring analogy is proposed for multidimensional fluid flow problems. The tension spring, which connects adjacent grid points to each other, controls grid spacings. The torsion spring, which is attached to each grid node, controls inclinations of coordinate lines and grid skewness. A marching procedure was used that results in a simple tridiagonal system of equations at each coordinate line to determine grid-point distribution. Multidirectional adaptation is achieved by successive applications of one-dimensional adaptation. Examples of applications for axisymmetric afterbody flow fields and two-dimensional transonic airfoil flow fields are shown.
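    In one dimension, the tension-spring part of the analogy reduces to exactly the kind of tridiagonal system the abstract mentions: each interior node sits where the spring tensions balance, so stiffer intervals (e.g. high solution gradient) end up shorter. The sketch below, with hypothetical stiffness weights, is an illustration rather than the paper's multidimensional scheme:

```python
def adapt_grid(x_ends, weights):
    """1-D tension-spring grid adaptation.  weights[i] is the spring
    stiffness of interval i; interior node positions satisfy
    k[i-1]*(x[i]-x[i-1]) = k[i]*(x[i+1]-x[i]), a tridiagonal system
    solved here with the Thomas algorithm."""
    k = weights
    n = len(k) - 1                           # number of interior nodes
    a, b, c = [0.0] * n, [0.0] * n, [0.0] * n  # sub / main / super diagonals
    rhs = [0.0] * n
    for i in range(n):
        b[i] = k[i] + k[i + 1]
        if i > 0:
            a[i] = -k[i]
        if i < n - 1:
            c[i] = -k[i + 1]
    rhs[0] += k[0] * x_ends[0]               # fixed boundary nodes
    rhs[-1] += k[-1] * x_ends[1]
    # Thomas algorithm
    for i in range(1, n):
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        rhs[i] -= m * rhs[i - 1]
    x = [0.0] * n
    x[-1] = rhs[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (rhs[i] - c[i] * x[i + 1]) / b[i]
    return [x_ends[0]] + x + [x_ends[1]]

# Uniform stiffness gives a uniform grid (a handy sanity check).
grid = adapt_grid((0.0, 1.0), [1.0, 1.0, 1.0, 1.0])
```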

  10. Meta-research: Evaluation and Improvement of Research Methods and Practices.

    PubMed

    Ioannidis, John P A; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N

    2015-10-01

    As the scientific enterprise has grown in size and diversity, we need empirical evidence on the research process to test and apply interventions that make it more efficient and its results more reliable. Meta-research is an evolving scientific discipline that aims to evaluate and improve research practices. It includes thematic areas of methods, reporting, reproducibility, evaluation, and incentives (how to do, report, verify, correct, and reward science). Much work is already done in this growing field, but efforts to-date are fragmented. We provide a map of ongoing efforts and discuss plans for connecting the multiple meta-research efforts across science worldwide. PMID:26431313

  11. Meta-research: Evaluation and Improvement of Research Methods and Practices

    PubMed Central

    Ioannidis, John P. A.; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N.

    2015-01-01

    As the scientific enterprise has grown in size and diversity, we need empirical evidence on the research process to test and apply interventions that make it more efficient and its results more reliable. Meta-research is an evolving scientific discipline that aims to evaluate and improve research practices. It includes thematic areas of methods, reporting, reproducibility, evaluation, and incentives (how to do, report, verify, correct, and reward science). Much work is already done in this growing field, but efforts to-date are fragmented. We provide a map of ongoing efforts and discuss plans for connecting the multiple meta-research efforts across science worldwide. PMID:26431313

  12. Thermal Analysis Methods For Earth Entry Vehicle

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.; Dec, John A.; Lindell, Michael C.

    2000-01-01

    Thermal analysis of a vehicle designed to return samples from another planet, such as the Earth Entry vehicle for the Mars Sample Return mission, presents several unique challenges. The Earth Entry Vehicle (EEV) must contain Martian material samples after they have been collected and protect them from the high heating rates of entry into the Earth's atmosphere. This requirement necessitates inclusion of detailed thermal analysis early in the design of the vehicle. This paper will describe the challenges and solutions for a preliminary thermal analysis of an Earth Entry Vehicle. The aeroheating on the vehicle during entry would be the main driver for the thermal behavior, and is a complex function of time, spatial position on the vehicle, vehicle temperature, and trajectory parameters. Thus, the thermal analysis must be closely tied to the aeroheating analysis in order to make accurate predictions. Also, the thermal analysis must account for the material response of the ablative thermal protection system (TPS). For the exo-atmospheric portion of the mission, the thermal analysis must include the orbital radiation fluxes on the surfaces. The thermal behavior must also be used to predict the structural response of the vehicle (the thermal stress and strains) and whether they remain within the capability of the materials. Thus, the thermal analysis requires ties to the three-dimensional geometry, the aeroheating analysis, the material response analysis, the orbital analysis, and the structural analysis. The goal of this paper is to describe to what degree that has been achieved.

  13. International Commercial Remote Sensing Practices and Policies: A Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Stryker, Timothy

    In recent years, there has been much discussion about U.S. commercial remote sensing policies and how effectively they address U.S. national security, foreign policy, commercial, and public interests. This paper will provide an overview of U.S. commercial remote sensing laws, regulations, and policies, and describe recent NOAA initiatives. It will also address related foreign practices, and the overall legal context for trade and investment in this critical industry. Licensing and Regulation: The 1992 Land Remote Sensing Policy Act ("the Act"), and the 1994 policy on Foreign Access to Remote Sensing Space Capabilities (known as Presidential Decision Directive-23, or PDD-23) put into place an ambitious legal and policy framework for the U.S. Government's licensing of privately-owned, high-resolution satellite systems. Under the Act, the Secretary of Commerce licenses the operations of private U.S. remote sensing satellite systems, in consultation with the Secretaries of Defense, State, and Interior. PDD-23 provided further details concerning the operation of advanced systems, as well as criteria for the export of turnkey systems and/or components. In July 2000, pursuant to the authority delegated to it by the Secretary of Commerce, NOAA issued new regulations for the industry. Licensees must: preserve the national security and observe the international obligations of the United States; maintain positive control of spacecraft operations; maintain a tasking record in conjunction with other record-keeping requirements; provide U.S. Government access to and use of data when required for national security or foreign policy purposes; provide for U.S. Government review of all significant foreign agreements; obtain U.S. Government approval for any encryption devices used; make available unenhanced data to a "sensed state" as soon as such data are available and on reasonable cost terms and conditions; make available unenhanced data as requested by the U.S. Government Archive; and obtain a priori U.S. Government approval of all plans and procedures to deal with safe disposition of the satellite. Further information on NOAA's regulations and NOAA's licensing program is available at www.licensing.noaa.gov.
    Monitoring and Enforcement: NOAA's enforcement mission is focused on the legislative mandate which states that the Secretary of Commerce has a continuing obligation to ensure that licensed imaging systems are operated lawfully to preserve the national security and foreign policies of the United States. NOAA has constructed an end-to-end monitoring and compliance program to review the activities of licensed companies. This program includes a pre-launch review, an operational baseline audit, and an annual comprehensive national security audit. If at any time there is suspicion or concern that a system is being operated unlawfully, a no-notice inspection may be initiated. Despite setbacks, three U.S. companies are now operational, with more firms expected to become so in the future. While NOAA does not disclose specific systems capabilities for proprietary reasons, its current licensing resolution thresholds for general commercial availability are as follows: 0.5 meter Ground Sample Distance (GSD) for panchromatic systems, 2 meter GSD for multi-spectral systems, 3 meter Impulse Response (IPR) for Synthetic Aperture Radar systems, and 20 meter GSD for hyperspectral systems (with certain 8-meter hyperspectral derived products also licensed for commercial distribution). These thresholds are subject to change based upon foreign availability and other considerations.
    It should also be noted that license applications are reviewed and granted on a case-by-case basis, pursuant to each system's technology and concept of operations. In 2001, NOAA, along with the Department of Commerce's International Trade Administration, commissioned a study by the RAND Corporation to assess the risks faced by the U.S. commercial remote sensing satellite industry. In commissioning this study, NOAA's goal was to bette

  14. Statistical Method for Integrative Platform Analysis: Application to Integration of Proteomic and Microarray Data.

    PubMed

    Gao, Xin

    2016-01-01

    To perform integrative analysis on multiple genomic data sources, we propose to use Fisher's combined probability test for consolidated inference. The method combines the individual p-values from different data sources and constructs a chi-square test statistic for the overall significance. This method is valid for combining results across independent data sources. We further improve the method to accommodate scenarios in which the data sources are dependent or the data samples are too small to obtain valid p-values through exact distributions. The proposed method is convenient to use in practice and is robust to distributional assumptions and small sample sizes. PMID:26519179
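    Fisher's combined probability test is straightforward to sketch: the statistic is X^2 = -2 * sum(ln p_i), chi-square distributed with 2k degrees of freedom under the null when the k p-values are independent, and for an even number of degrees of freedom the chi-square survival function has a closed form. The p-values below are hypothetical:

```python
import math

def fisher_combine(pvalues):
    """Fisher's combined probability test for k independent p-values.
    Returns (X2, combined_p).  Uses the exact survival function for a
    chi-square with even df = 2k:
        P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!"""
    k = len(pvalues)
    x2 = -2.0 * sum(math.log(p) for p in pvalues)
    half = x2 / 2.0
    term, total = 1.0, 1.0          # i = 0 term of the series
    for i in range(1, k):
        term *= half / i
        total += term
    return x2, math.exp(-half) * total

# Hypothetical p-values from three independent data sources:
x2, p = fisher_combine([0.04, 0.10, 0.03])
```

    Note how three individually modest p-values combine into a much stronger overall significance, the behaviour the abstract relies on.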

  15. A practical approach to object based requirements analysis

    NASA Technical Reports Server (NTRS)

    Drew, Daniel W.; Bishop, Michael

    1988-01-01

    Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain oriented analysis and development concept is based on entity relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object oriented design is also promoted by having requirements described in terms of objects. Presented is a five step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a base line requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model and a brief discussion on how this approach might be used in a large scale development effort.

  16. Musical Practices and Methods in Music Lessons: A Comparative Study of Estonian and Finnish General Music Education

    ERIC Educational Resources Information Center

    Sepp, Anu; Ruokonen, Inkeri; Ruismäki, Heikki

    2015-01-01

    This article reveals the results of a comparative study of Estonian and Finnish general music education. The aim was to find out what music teaching practices and approaches/methods were mostly used, what music education perspectives supported those practices. The data were collected using questionnaires and the results of 107 Estonian and 50…

  17. Spelling Practice Intervention: A Comparison of Tablet PC and Picture Cards as Spelling Practice Methods for Students with Developmental Disabilities

    ERIC Educational Resources Information Center

    Seok, Soonhwa; DaCosta, Boaventura; Yu, Byeong Min

    2015-01-01

    The present study compared a spelling practice intervention using a tablet personal computer (PC) and picture cards with three students diagnosed with developmental disabilities. An alternating-treatments design with a non-concurrent multiple-baseline across participants was used. The aims of the present study were: (a) to determine if…

  18. Looking beyond borders: integrating best practices in benefit-risk analysis into the field of food and nutrition.

    PubMed

    Tijhuis, M J; Pohjola, M V; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken-Schröder, G; Poto, M; Tuomisto, J T; Ueland, O; White, B C; Holm, F; Verhagen, H

    2012-01-01

    An integrated benefit-risk analysis aims to give guidance in decision situations where benefits do not clearly prevail over risks, and explicit weighing of benefits and risks is thus indicated. The BEPRARIBEAN project aims to advance benefit-risk analysis in the area of food and nutrition by learning from other fields. This paper constitutes the final stage of the project, in which commonalities and differences in benefit-risk analysis are identified between the Food and Nutrition field and other fields, namely Medicines, Food Microbiology, Environmental Health, Economics and Marketing-Finance, and Consumer Perception. From this, ways forward are characterized for benefit-risk analysis in Food and Nutrition. Integrated benefit-risk analysis in Food and Nutrition may advance in the following ways: Increased engagement and communication between assessors, managers, and stakeholders; more pragmatic problem-oriented framing of assessment; accepting some risk; pre- and post-market analysis; explicit communication of the assessment purpose, input and output; more human (dose-response) data and more efficient use of human data; segmenting populations based on physiology; explicit consideration of value judgments in assessment; integration of multiple benefits and risks from multiple domains; explicit recognition of the impact of consumer beliefs, opinions, views, perceptions, and attitudes on behaviour; and segmenting populations based on behaviour. The opportunities proposed here do not provide ultimate solutions; rather, they define a collection of issues to be taken account of in developing methods, tools, practices and policies, as well as refining the regulatory context, for benefit-risk analysis in Food and Nutrition and other fields. Thus, these opportunities will now need to be explored further and incorporated into benefit-risk practice and policy. 
If accepted, incorporation of these opportunities will also involve a paradigm shift in Food and Nutrition benefit-risk analysis towards conceiving the analysis as a process of creating shared knowledge among all stakeholders. PMID:22142687

  19. A method for obtaining practical flutter-suppression control laws using results of optimal control theory

    NASA Technical Reports Server (NTRS)

    Newson, J. R.

    1979-01-01

    The results of optimal control theory are used to synthesize a feedback filter. The feedback filter is used to force the output of the filtered frequency response to match that of a desired optimal frequency response over a finite frequency range. This matching is accomplished by employing a nonlinear programing algorithm to search for the coefficients of the feedback filter that minimize the error between the optimal frequency response and the filtered frequency response. The method is applied to the synthesis of an active flutter-suppression control law for an aeroelastic wind-tunnel model. It is shown that the resulting control law suppresses flutter over a wide range of subsonic Mach numbers. This is a promising method for synthesizing practical control laws using the results of optimal control theory.
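    As a much-simplified illustration of matching a filter's frequency response to a desired one: the paper applies nonlinear programming to a feedback filter, whereas the hypothetical sketch below instead fits a 3-tap FIR filter by linear least squares, which is enough to show the response-matching idea:

```python
import cmath

def fit_fir(desired, freqs, n_taps=3):
    """Least-squares fit of real FIR taps h so that
    H(w) = sum_n h[n] * e^{-j w n} matches `desired` at `freqs`.
    Real and imaginary parts are stacked into one real least-squares
    problem, solved via the normal equations with Gaussian elimination."""
    rows, rhs = [], []
    for w, d in zip(freqs, desired):
        e = [cmath.exp(-1j * w * n) for n in range(n_taps)]
        rows.append([z.real for z in e]); rhs.append(d.real)
        rows.append([z.imag for z in e]); rhs.append(d.imag)
    # normal equations N h = g
    N = [[sum(r[i] * r[j] for r in rows) for j in range(n_taps)]
         for i in range(n_taps)]
    g = [sum(r[i] * y for r, y in zip(rows, rhs)) for i in range(n_taps)]
    # Gaussian elimination with partial pivoting
    for col in range(n_taps):
        piv = max(range(col, n_taps), key=lambda r: abs(N[r][col]))
        N[col], N[piv] = N[piv], N[col]
        g[col], g[piv] = g[piv], g[col]
        for r in range(col + 1, n_taps):
            f = N[r][col] / N[col][col]
            for c in range(col, n_taps):
                N[r][c] -= f * N[col][c]
            g[r] -= f * g[col]
    h = [0.0] * n_taps
    for r in range(n_taps - 1, -1, -1):
        h[r] = (g[r] - sum(N[r][c] * h[c]
                           for c in range(r + 1, n_taps))) / N[r][r]
    return h

# Sanity check: recover known (hypothetical) taps from their own response.
true_taps = [0.5, 0.3, 0.2]
freqs = [0.1, 0.5, 1.0, 1.5, 2.0]
desired = [sum(t * cmath.exp(-1j * w * n) for n, t in enumerate(true_taps))
           for w in freqs]
taps = fit_fir(desired, freqs)
```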

  20. Practice Makes Perfect: Improving Students' Skills in Understanding and Avoiding Plagiarism with a Themed Methods Course

    ERIC Educational Resources Information Center

    Estow, Sarah; Lawrence, Eva K.; Adams, Kathrynn A.

    2011-01-01

    To address the issue of plagiarism, students in two undergraduate Research Methods and Analysis courses conducted, analyzed, and wrote up original research on the topic of plagiarism. Students in an otherwise identical course completed the same assignments but examined a different research topic. At the start and end of the semester, all students…

  1. Reforming High School Science for Low-Performing Students Using Inquiry Methods and Communities of Practice

    NASA Astrophysics Data System (ADS)

    Bolden, Marsha Gail

    Some schools fall short of the high demand to increase science scores on state exams because low-performing students enter high school unprepared for high school science. Low-performing students are not successful in high school for many reasons. However, using inquiry methods has improved students' understanding of science concepts. The purpose of this qualitative research study was to investigate teachers' lived experiences with using inquiry methods to motivate low-performing high school science students in an inquiry-based program called Xtreem Science. Fifteen teachers were selected from the Xtreem Science program, a program designed to assist teachers in motivating struggling science students. The research questions involved understanding (a) teachers' experiences in using inquiry methods, (b) challenges teachers face in using inquiry methods, and (c) how teachers describe students' responses to inquiry methods. The data collection and analysis strategy involved capturing the teachers' feelings, perceptions, and attitudes in their lived experience of teaching with inquiry methods and of motivating struggling students. Analysis of interview responses revealed that teachers had positive experiences with inquiry, that inquiry shaped their teaching style and approach to topics, and that, in their view, inquiry methods improved student learning. Inquiry gave low-performing students opportunities to catch up and learn information that moved them to the next level of science courses. Implications for positive social change include providing teachers and school district leaders with information to help improve the performance of low-performing science students.

  2. Practical analysis of tide gauges records from Antarctica

    NASA Astrophysics Data System (ADS)

    Galassi, Gaia; Spada, Giorgio

    2015-04-01

    We have collected and analyzed in a basic way the currently available time series from tide gauges deployed along the coasts of Antarctica. The database of the Permanent Service for Mean Sea Level (PSMSL) holds relative sea level information for 17 stations, which are mostly concentrated in the Antarctic Peninsula (8 out of 17). For 7 of the PSMSL stations, Revised Local Reference (RLR) monthly and yearly observations are available, spanning from year 1957.79 (Almirante Brown) to 2013.95 (Argentine Islands). For the remaining 11 stations, only metric monthly data can be obtained during the time window 1957-2013. The record length of the available time series generally does not exceed 20 years. Remarkable exceptions are the RLR station of Argentine Islands, located in the Antarctic Peninsula (AP) (time span: 1958-2013, record length: 54 years, completeness=98%), and the metric station of Syowa in East Antarctica (1975-2012, 37 years, 92%). The general quality (geographical coverage and length of record) of the time series hinders a coherent geophysical interpretation of the relative sea-level data along the coasts of Antarctica. However, in an attempt to characterize the available relative sea-level signals, we have stacked (i.e., averaged) the RLR time series for the AP and for the whole of Antarctica. The stacked time series were analyzed using simple regression in order to estimate a trend and a possible sea-level acceleration. For the AP the trend is 1.8 ± 0.2 mm/yr, and for the whole of Antarctica it is 2.1 ± 0.1 mm/yr (both during 1957-2013). The modeled values of Glacial Isostatic Adjustment (GIA) obtained with ICE-5G(VM2) using program SELEN range between -0.7 and -1.6 mm/yr, showing that the sea-level trend recorded by tide gauges is strongly influenced by GIA.
    Subtracting the average GIA contribution (-1.1 mm/yr) from the observed sea-level trends of the two stacks, we obtain 3.2 and 2.9 mm/yr for Antarctica and the AP respectively, which we interpret as the combined effect of current ice melting and steric ocean contributions. Using the Ensemble Empirical Mode Decomposition method, we have detected different oscillations embedded in the sea-level signals for Antarctica and the AP. This confirms previously recognized connections between sea-level variations in Antarctica and ocean modes such as ENSO.
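
    The core of the stack analysis, a linear trend fit followed by subtraction of the modelled GIA rate, can be sketched as follows. The yearly series below is simulated; only the -1.1 mm/yr GIA value comes from the text.

```python
import numpy as np

# Fit a linear trend to a (synthetic) stacked tide-gauge series and subtract
# the modelled GIA contribution. The series is simulated with a built-in
# 2.1 mm/yr trend plus noise; it is not the actual PSMSL stack.
rng = np.random.default_rng(0)
years = np.arange(1957, 2014)
sea_level = 2.1 * (years - years[0]) + rng.normal(0.0, 5.0, years.size)  # mm

trend, intercept = np.polyfit(years, sea_level, 1)   # observed trend, mm/yr
gia = -1.1                                           # modelled GIA, mm/yr
melt_plus_steric = trend - gia                       # GIA-corrected trend
```

    A quadratic term could be added to the fit to estimate a possible acceleration, as mentioned in the abstract.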

  3. Causal Network Methods for Integrated Project Portfolio Risk Analysis 

    E-print Network

    Govan, Paul

    2014-08-06

    Corporate portfolio risk analysis is of primary concern for many organizations, as the success of strategic objectives greatly depends on an accurate risk assessment. Current risk analysis methods typically involve statistical models of risk...

  4. Peering inside the Clock: Using Success Case Method to Determine How and Why Practice-Based Educational Interventions Succeed

    ERIC Educational Resources Information Center

    Olson, Curtis A.; Shershneva, Marianna B.; Brownstein, Michelle Horowitz

    2011-01-01

    Introduction: No educational method or combination of methods will facilitate implementation of clinical practice guidelines in all clinical contexts. To develop an empirical basis for aligning methods to contexts, we need to move beyond "Does it work?" to also ask "What works for whom and under what conditions?" This study employed Success Case…

  5. Methods for Mediation Analysis with Missing Data

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Wang, Lijuan

    2013-01-01

    Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis including listwise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…

  6. Abstract: Using System Dynamics Analysis for Evaluating the Sustainability of “Complete Streets” Practices

    EPA Science Inventory

    Abstract: Using System Dynamics Analysis for Evaluating the Sustainability of “Complete Streets” Practices Primary Author: Nicholas R. Flanders 109 T.W. Alexander Drive Mail Code: E343-02 Research Triangle Park, NC 27709 919-541-3660 Flanders.nick@Epa.gov Topic categ...

  7. The Community College and a Rising Global Imaginary: An Analysis of Practical Reasoning, 1950-2013

    ERIC Educational Resources Information Center

    Ayers, David F.; Palmadessa, Allison L.

    2015-01-01

    Through an analysis of 245 issues of the "Community College Journal" published between 1950 and 2013, we show how three discourses--international understanding and geopolitics, economic competitiveness, and global citizenship--informed practical reasoning about a rising global imaginary and its implications for the community college. By…

  8. A Discourse Analytic Approach to Video Analysis of Teaching: Aligning Desired Identities with Practice

    ERIC Educational Resources Information Center

    Schieble, Melissa; Vetter, Amy; Meacham, Mark

    2015-01-01

    The authors present findings from a qualitative study of an experience that supports teacher candidates to use discourse analysis and positioning theory to analyze videos of their practice during student teaching. The research relies on the theoretical concept that learning to teach is an identity process. In particular, teachers construct and…

  9. Using Video Analysis or Data Loggers during Practical Work in First Year Physics.

    ERIC Educational Resources Information Center

    Rodrigues, Susan; Pearce, Jon; Livett, Michelle

    2001-01-01

    Reports on a project investigating students' learning processes when video analysis and data logging practical work were used in a first-year undergraduate physics course. Suggests that students were motivated by the tasks and believed that these tasks helped them overall to understand physics concepts. Includes references. (CMK)

  10. AAMFT Master Series Tapes: An Analysis of the Inclusion of Feminist Principles into Family Therapy Practice.

    ERIC Educational Resources Information Center

    Haddock, Shelley A.; MacPhee, David; Zimmerman, Toni Schindler

    2001-01-01

    Content analysis of 23 American Association for Marriage and Family Therapy Master Series tapes was used to determine how well feminist behaviors have been incorporated into ideal family therapy practice. Feminist behaviors were infrequent, being evident in fewer than 3% of time blocks in event sampling and 10 of 39 feminist behaviors of the…

  11. A PROBABILISTIC APPROACH FOR ANALYSIS OF UNCERTAINTY IN THE EVALUATION OF WATERSHED MANAGEMENT PRACTICES

    EPA Science Inventory

    A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (<10 km2) watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...

  12. Identifying Evidence-Based Practices in Special Education through High Quality Meta-Analysis

    ERIC Educational Resources Information Center

    Friedt, Brian

    2012-01-01

    The purpose of this study was to determine if meta-analysis can be used to enhance efforts to identify evidence-based practices (EBPs). In this study, the quality of included studies acted as the moderating variable. I used the quality indicators for experimental and quasi-experimental research developed by Gersten, Fuchs, Coyne, Greenwood, and…

  13. A Quantitative Analysis and Natural History of B. F. Skinner's Coauthoring Practices

    PubMed Central

    McKerchar, Todd L; Morris, Edward K; Smith, Nathaniel G

    2011-01-01

    This paper describes and analyzes B. F. Skinner's coauthoring practices. After identifying his 35 coauthored publications and 27 coauthors, we analyze his coauthored works by their form (e.g., journal articles) and kind (e.g., empirical); identify the journals in which he published and their type (e.g., data-type); describe his overall and local rates of publishing with his coauthors (e.g., noting breaks in the latter); and compare his coauthoring practices with his single-authoring practices (e.g., form, kind, journal type) and with those in the scientometric literature (e.g., majority of coauthored publications are empirical). We address these findings in the context of describing the natural history of Skinner's coauthoring practices. Finally, we describe some limitations in our methods and offer suggestions for future research. PMID:22532732

  14. Analysis of poverty data by small area methods

    E-print Network

    Opsomer, Jean

    Analysis of poverty data by small area methods. Covers nonparametric regression methods, with a particular focus on issues that are relevant in the estimation of poverty indices.

  15. Common cause analysis : a review and extension of existing methods

    E-print Network

    Heising, Carolyn D.

    1982-01-01

    The quantitative common cause analysis code, MOBB, is extended to include uncertainties arising from modelling uncertainties and data uncertainties. Two methods, Monte Carlo simulation and the Method-of-Moments are used ...
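
    The Monte Carlo side of such an analysis can be illustrated with a toy beta-factor model: sample the uncertain component failure rate and propagate it into a common-cause failure rate. The distribution parameters and the beta value below are invented and unrelated to the MOBB code.

```python
import numpy as np

# Toy Monte Carlo propagation of data uncertainty into a common-cause
# failure-rate estimate (beta-factor model); all numbers are invented.
rng = np.random.default_rng(1)
lam = rng.lognormal(np.log(1e-3), 0.5, 100_000)  # uncertain failure rate, /hr
beta = 0.1                                       # assumed common-cause fraction
ccf = beta * lam                                 # common-cause failure rate

mean_ccf = float(np.mean(ccf))                   # point estimate
p95_ccf = float(np.percentile(ccf, 95))          # upper uncertainty bound
```

    The Method-of-Moments alternative mentioned in the abstract would instead propagate the mean and variance of lam analytically.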

  16. Methods for sampling and inorganic analysis of coal

    USGS Publications Warehouse

    Golightly, D. W., (Edited By); Simon, Frederick Otto

    1989-01-01

    Methods used by the U.S. Geological Survey for the sampling, comminution, and inorganic analysis of coal are summarized in this bulletin. Details, capabilities, and limitations of the methods are presented.

  17. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in cacao products shall be determined by the following methods of...

  18. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in cacao products shall be determined by the following methods of...

  19. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in cacao products shall be determined by the following methods of...

  20. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in cacao products shall be determined by the following methods of...

  1. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in cacao products shall be determined by the following methods of...

  2. Adaptive Nodal Transport Methods for Reactor Transient Analysis

    SciTech Connect

    Thomas Downar; E. Lewis

    2005-08-31

    Develop methods for adaptively treating the angular, spatial, and time dependence of the neutron flux in reactor transient analysis. These methods were demonstrated in the DOE transport nodal code VARIANT and the US NRC spatial kinetics code, PARCS.

  3. Addiction profile in probation practices in Turkey: 5-year data analysis

    PubMed Central

    Yazici, Ahmet Bülent; Yazici, Esra; Akkisi Kumsar, Neslihan; Erol, Atila

    2015-01-01

    Introduction While drug addiction is a global problem, it is important for every region to know the general features of its own addicts in order to develop effective treatment programs. This study presents sociodemographic data of individuals diagnosed with drug addiction. Methods In this study, data of patients seen between the years 2009 and 2014 were retrospectively analyzed. The patients were assessed at psychiatry polyclinics under the probation practices for drug abuse. The study involved 513 patients in whom drug positivity was detected in urine analysis at least once and whose diagnoses were confirmed with a clinical interview. Results The majority of the addicts were identified in 2013. Males made up 98.2% of the sample, their average age was 32.12±10.21 years, and the minimum and maximum ages at first drug use were 7 years and 45 years, respectively. Marijuana use was found in 90.8% of the patients, 90% of them were living with their families, and 59.6% of them held a regular job. Treatment response was related to age at first drug use, duration of drug use, and the patients' prior treatment history. Conclusion The most frequently used drug was marijuana. The risk of drug addiction can affect any individual in society, regardless of their education, occupation, or social support levels. Alternative treatment models, especially for chronic and long-term users, should be researched. PMID:26345237

  4. RAPID ON-SITE METHODS OF CHEMICAL ANALYSIS

    EPA Science Inventory

    The analysis of potentially hazardous air, water and soil samples collected and shipped to service laboratories off-site is time consuming and expensive. This Chapter addresses the practical alternative of performing the requisite analytical services on-site. The most significant...

  5. Design of a practical model-observer-based image quality assessment method for CT imaging systems

    NASA Astrophysics Data System (ADS)

    Tseng, Hsin-Wu; Fan, Jiahua; Cao, Guangzhi; Kupinski, Matthew A.; Sainath, Paavana

    2014-03-01

    The channelized Hotelling observer (CHO) is a powerful method for quantitative image quality evaluations of CT systems and their image reconstruction algorithms. It has recently been used to validate the dose reduction capability of iterative image-reconstruction algorithms implemented on CT imaging systems. The use of the CHO for routine and frequent system evaluations is desirable both for quality assurance evaluations as well as further system optimizations. The use of channels substantially reduces the amount of data required to achieve accurate estimates of observer performance. However, the number of scans required is still large even with the use of channels. This work explores different data reduction schemes and designs a new approach that requires only a few CT scans of a phantom. For this work, the leave-one-out likelihood (LOOL) method developed by Hoffbeck and Landgrebe is studied as an efficient method of estimating the covariance matrices needed to compute CHO performance. Three different kinds of approaches are included in the study: a conventional CHO estimation technique with a large sample size, a conventional technique with fewer samples, and the new LOOL-based approach with fewer samples. The mean value and standard deviation of the area under the ROC curve (AUC) were estimated by a shuffle method. Both simulation and real-data results indicate that an 80% data reduction can be achieved without loss of accuracy. This data reduction makes the proposed approach a practical tool for routine CT system assessment.
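
    A minimal channelized-Hotelling-observer computation on simulated channel outputs looks like the sketch below. The five channels, Gaussian statistics, and sample sizes are invented; the paper's channel set, phantom scans, and the LOOL covariance estimator are not reproduced here.

```python
import math
import numpy as np

# Minimal CHO sketch: estimate the Hotelling template and detectability from
# simulated signal-absent/present channel outputs, then convert d'^2 to a
# Gaussian-model AUC. Channel count, means and covariance are invented.
rng = np.random.default_rng(2)
n_ch, n = 5, 200
absent = rng.normal(0.0, 1.0, (n, n_ch))           # signal-absent samples
present = rng.normal(0.5, 1.0, (n, n_ch))          # signal-present samples

S = 0.5 * (np.cov(absent.T) + np.cov(present.T))   # pooled channel covariance
dmu = present.mean(0) - absent.mean(0)             # channel mean difference
w = np.linalg.solve(S, dmu)                        # Hotelling template
d2 = float(dmu @ w)                                # detectability d'^2
auc = 0.5 * (1.0 + math.erf(math.sqrt(d2) / 2.0))  # AUC under Gaussian model
```

    The LOOL method of the paper replaces the pooled sample covariance S with a leave-one-out likelihood estimate so that far fewer samples suffice.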

  6. Improved permeability prediction using multivariate analysis methods 

    E-print Network

    Xie, Jiang

    2009-05-15

    Predicting rock permeability from well logs in uncored wells is an important task in reservoir characterization. Due to the high costs of coring and laboratory analysis, typically cores are acquired in only a few wells. Since most wells are logged...

  7. Stability Analysis for the Immersed Boundary Method

    NASA Astrophysics Data System (ADS)

    Gong, Z. X.; Huang, H. X.; Lu, C. J.

    In this paper, we analyse the stability of the Immersed Boundary Method applied to a membrane-fluid system with a plasma membrane immersed in an incompressible viscous fluid. For small deformations, the immersed boundary method, using a standard regularization technique for the singular force, is shown to be linearly stable.

  8. PIC (PRODUCTS OF INCOMPLETE COMBUSTION) ANALYSIS METHODS

    EPA Science Inventory

    The report gives results of method evaluations for products of incomplete combustion (PICs): 36 proposed PICs were evaluated by previously developed gas chromatography/flame ionization detection (GC/FID) and gas chromatography/mass spectroscopy (GC/MS) methods. It also gives resu...

  9. METHODS FOR SAMPLING AND ANALYSIS OF BREATH

    EPA Science Inventory

    The research program surveyed and evaluated the methods and procedures used to identify and quantitate chemical constituents in human breath. Methods have been evaluated to determine their ease and rapidity, as well as cost, accuracy, and precision. During the evaluation, a secon...

  10. Generic residue analysis and BV method comparison

    NASA Astrophysics Data System (ADS)

    Dorville, Nicolas; Anekallu, Chandra; Haaland, Stein; Belmont, Gerard

    2015-04-01

    Determining the orientation of the normal direction to the magnetopause layer is a key issue for studying in detail the structure of this boundary. Both conservation-law methods and the new iterative BV method, which performs a fit of the magnetic field and ion normal flow velocity with an elliptic model, have been developed for this purpose. These methods have different model assumptions and validity ranges. Unlike the conservation-law methods, the BV method also provides spatial profiles inside the layer. However, it is compatible only with a subset of magnetopause crossings with a single-layer current sheet. We compare here their results on artificial magnetopause data with noise, to understand their sensitivity to small departures from their physical hypotheses. We then present a statistical comparison of the methods on a list of 149 flank and dayside magnetopause crossings.

  11. Propensity scores: a practical method for assessing treatment effects in pain and symptom management research.

    PubMed

    Garrido, Melissa M

    2014-10-01

    When conducting research on pain and symptom management interventions for seriously ill individuals, randomized controlled trials are not always feasible or ethical to conduct. Secondary analyses of observational data sets that include information on treatments experienced and outcomes for individuals who did and did not receive a given treatment can be conducted, but confounding due to selection bias can obscure the treatment effect of interest. Propensity scores provide a way to adjust for observable characteristics that differ between treatment and comparison groups. This article provides conceptual guidance in addition to an empirical example to illustrate two areas of propensity score analysis that often lead to confusion in practice: covariate selection and interpretation of resultant treatment effects. PMID:24937162
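
    The two-step logic, fitting a propensity model on observed characteristics and then weighting outcomes by the inverse propensity, can be sketched on simulated observational data. The data-generating model and the true effect of 2.0 are invented for illustration; with confounding, the naive difference in means is biased while the weighted estimate is not.

```python
import numpy as np

# Propensity-score sketch on simulated data: treatment assignment depends on a
# confounder x, so comparing raw group means is biased; inverse-probability
# weighting (IPW) using the estimated propensity recovers the true effect.
rng = np.random.default_rng(3)
n = 5000
x = rng.normal(size=n)                         # observed confounder
t = rng.binomial(1, 1 / (1 + np.exp(-x)))      # treatment depends on x
y = 2.0 * t + 1.5 * x + rng.normal(size=n)     # outcome; true effect = 2.0

# Fit the propensity model P(t=1|x) by logistic regression (Newton steps)
X = np.column_stack([np.ones(n), x])
b = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ b))
    b += np.linalg.solve(X.T @ (X * (p * (1 - p))[:, None]), X.T @ (t - p))
ps = 1 / (1 + np.exp(-X @ b))                  # estimated propensity scores

naive = y[t == 1].mean() - y[t == 0].mean()            # confounded estimate
ate = np.mean(t * y / ps - (1 - t) * y / (1 - ps))     # IPW estimate
```

    Matching or stratifying on ps are common alternatives to weighting; all rely on the propensity model including the relevant observed covariates.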

  12. Concepts, tools/methods, and practices of water-energy-food NEXUS

    NASA Astrophysics Data System (ADS)

    Endo, A.; Tsurita, I.; Orencio, P. M.; Taniguchi, M.

    2014-12-01

    The need to consider the NEXUS of food and water was emphasized in international dialogues and publications around the end of the 20th century. In fact, in 1983 the United Nations University had already launched a Food-Energy Nexus Programme to fill the gaps between the issues of food and energy. The term "NEXUS" to link water, food, and trade was also used in the World Bank during the 1990s. The idea of NEXUS is likely to have developed further under the discussion of "virtual water" and "water footprints". Through several international discussions, such as the Kyoto World Water Forum in 2003, scholars and practitioners around the globe acknowledged the need to include energy among the pillars of NEXUS. Finally, the importance of the three NEXUS pillars, "water, energy, and food", was officially announced at the BONN 2011 NEXUS Conference, a turning point for the NEXUS idea in the international community, as a contribution to the United Nations Conference on Sustainable Development (Rio+20) in 2012, which highlighted the concept of the "green economy". The concept of NEXUS is becoming a prerequisite for achieving sustainable development, given the global concerns embedded in society, economy, and environment. The concept stresses cooperation among sectors such as water, energy, food, and climate change, since these complex global issues are interdependent and can no longer be solved by sectoral approaches. The NEXUS practices are currently shared among different stakeholders through various modes, including literature, conferences, workshops, and research projects. However, since the NEXUS practices are not led by a particular organization, the concepts, theories, policies, tools, methods, and applications are diverse and incoherent. In terms of tools/methods, the potential of an integrated modeling approach is introduced to avoid pressures and to promote interactions among water, energy, and food.
This paper explores the concepts, tools/methods, and practices of water-energy-food NEXUS to evaluate human environmental security under the RIHN project on "Human-Environmental Security in the Asia-Pacific Ring of Fire: Water-Energy-Food Nexus".

  13. Shear Lag in Box Beams Methods of Analysis and Experimental Investigations

    NASA Technical Reports Server (NTRS)

    Kuhn, Paul; Chiarito, Patrick T

    1942-01-01

    The bending stresses in the covers of box beams or wide-flange beams differ appreciably from the stresses predicted by the ordinary bending theory on account of shear deformation of the flanges. The problem of predicting these differences has become known as the shear-lag problem. The first part of this paper deals with methods of shear-lag analysis suitable for practical use. The second part of the paper describes strain-gage tests made by the NACA to verify the theory. Three tests published by other investigators are also analyzed by the proposed method. The third part of the paper gives numerical examples illustrating the methods of analysis. An appendix gives comparisons with other methods, particularly with the method of Ebner and Koller.

  14. In-Service Teacher Training in Japan and Turkey: A Comparative Analysis of Institutions and Practices

    ERIC Educational Resources Information Center

    Bayrakci, Mustafa

    2009-01-01

    The purpose of this study is to compare policies and practices relating to teacher in-service training in Japan and Turkey. On the basis of the findings of the study, suggestions are made about in-service training activities in Turkey. The research was carried using qualitative research methods. In-service training activities in the two education…

  15. Assessing performance of conservation-based Best Management Practices: Coarse vs. fine-scale analysis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Background/Questions/Methods Animal agriculture in the Spring Creek watershed of central Pennsylvania contributes sediment to the stream and ultimately to the Chesapeake Bay. Best Management Practices (BMPs) such as streambank buffers are intended to intercept sediment moving from heavy-use areas to...

  16. An Analysis of Agricultural Mechanics Safety Practices in Agricultural Science Laboratories.

    ERIC Educational Resources Information Center

    Swan, Michael K.

    North Dakota secondary agricultural mechanics instructors were surveyed regarding instructional methods and materials, safety practices, and equipment used in the agricultural mechanics laboratory. Usable responses were received from 69 of 89 instructors via self-administered mailed questionnaires. Findings were consistent with results of similar…

  17. Confirmatory Factor Analysis on the Professional Suitability Scale for Social Work Practice

    ERIC Educational Resources Information Center

    Tam, Dora M. Y.; Twigg, Robert C.; Boey, Kam-Wing; Kwok, Siu-Ming

    2013-01-01

    Objective: This article presents a validation study to examine the factor structure of an instrument designed to measure professional suitability for social work practice. Method: Data were collected from registered social workers in a provincial mailed survey. The response rate was 23.2%. After eliminating five cases with multivariate outliers,…

  18. A Qualitative Analysis of an Advanced Practice Nurse-Directed Transitional Care Model Intervention

    ERIC Educational Resources Information Center

    Bradway, Christine; Trotta, Rebecca; Bixby, M. Brian; McPartland, Ellen; Wollman, M. Catherine; Kapustka, Heidi; McCauley, Kathleen; Naylor, Mary D.

    2012-01-01

    Purpose: The purpose of this study was to describe barriers and facilitators to implementing a transitional care intervention for cognitively impaired older adults and their caregivers lead by advanced practice nurses (APNs). Design and Methods: APNs implemented an evidence-based protocol to optimize transitions from hospital to home. An…

  19. Intravaginal Practices, Vaginal Infections and HIV Acquisition: Systematic Review and Meta-Analysis

    PubMed Central

    Chersich, Matthew; Scott, Pippa; Redmond, Shelagh; Bender, Nicole; Miotti, Paolo; Temmerman, Marleen; Low, Nicola

    2010-01-01

    Background Intravaginal practices are commonly used by women to manage their vaginal health and sexual life. These practices could, however, affect intravaginal mucosal integrity. The objectives of this study were to examine evidence for associations between: intravaginal practices and acquisition of HIV infection; intravaginal practices and vaginal infections; and vaginal infections and HIV acquisition. Methodology/Principal Findings We conducted a systematic review of prospective longitudinal studies, searching 15 electronic databases of journals and abstracts from two international conferences up to 31st January 2008. Relevant articles were selected and data extracted in duplicate. Results were examined visually in forest plots and combined using random-effects meta-analysis where appropriate. Of 2120 unique references we included 22 publications from 15 different studies in sub-Saharan Africa and the USA. Seven publications from five studies examined a range of intravaginal practices and HIV infection. No specific vaginal practices showed a protective effect against HIV or vaginal infections. Insertion of products for sex was associated with HIV in unadjusted analyses; only one study gave an adjusted estimate, which showed no association (hazard ratio 1.09, 95% confidence interval, CI 0.71, 1.67). HIV incidence was higher in women reporting intravaginal cleansing but confidence intervals were wide and heterogeneity high (adjusted hazard ratio 1.88, 95%CI 0.53, 6.69, I2 83.2%). HIV incidence was higher in women with bacterial vaginosis (adjusted effect 1.57, 95%CI 1.26, 1.94, I2 19.0%) and Trichomonas vaginalis (adjusted effect 1.64, 95%CI 1.28, 2.09, I2 0.0%). Conclusions/Significance A pathway linking intravaginal cleansing practices with vaginal infections that increase susceptibility to HIV infection is plausible, but conclusive evidence is lacking. Intravaginal practices do not appear to protect women from vaginal infections or HIV, and some might be harmful. PMID:20161749
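
    Pooled effects such as the adjusted hazard ratios and I² statistics above come from random-effects meta-analysis; a DerSimonian-Laird sketch is shown below. The per-study log hazard ratios and standard errors are invented for illustration, not the studies' actual data.

```python
import numpy as np

# DerSimonian-Laird random-effects pooling of per-study log hazard ratios.
# The effect sizes and standard errors below are invented example values.
yi = np.log(np.array([1.5, 1.7, 1.4, 1.9]))   # per-study log hazard ratios
se = np.array([0.15, 0.20, 0.25, 0.30])       # their standard errors

wi = 1 / se**2                                # fixed-effect weights
ybar = np.sum(wi * yi) / np.sum(wi)
Q = np.sum(wi * (yi - ybar) ** 2)             # Cochran's Q
k = yi.size
tau2 = max(0.0, (Q - (k - 1)) / (wi.sum() - (wi**2).sum() / wi.sum()))
I2 = max(0.0, 100 * (Q - (k - 1)) / Q)        # heterogeneity, %

wr = 1 / (se**2 + tau2)                       # random-effects weights
mu = np.sum(wr * yi) / wr.sum()               # pooled log hazard ratio
se_mu = 1 / np.sqrt(wr.sum())
hr = np.exp(mu)                               # pooled hazard ratio
ci = np.exp(mu + np.array([-1.96, 1.96]) * se_mu)  # 95% CI
```

    With these example inputs Q is below k-1, so tau2 = 0 and the random-effects result coincides with the fixed-effect one.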

  20. Methodology for social accountability: multiple methods and feminist, poststructural, psychoanalytic discourse analysis.

    PubMed

    Phillips, D A

    2001-06-01

    Bridging the gap between the individual and social context, methodology that aims to surface and explore the regulatory function of discourse on subjectivity production moves nursing research beyond the individual level in order to theorize social context and its influence on health and well-being. This article describes the feminist, poststructural, psychoanalytic discourse analysis and multiple methods used in a recent study exploring links between cultural discourses of masculinity, performativity of masculinity, and practices of male violence. PMID:11393249

  1. Method for chromium analysis and speciation

    DOEpatents

    Aiken, Abigail M.; Peyton, Brent M.; Apel, William A.; Petersen, James N.

    2004-11-02

    A method of detecting a metal in a sample comprising a plurality of metals is disclosed. The method comprises providing the sample comprising the metal to be detected. The sample is added to a reagent solution comprising an enzyme and a substrate, where the enzyme is inhibited by the metal to be detected. An array of chelating agents is used to eliminate the inhibitory effects of additional metals in the sample. An enzymatic activity in the sample is determined and compared to an enzymatic activity in a control solution to detect the metal to be detected. A method of determining a concentration of the metal in the sample is also disclosed. A method of detecting a valence state of a metal is also disclosed.

  2. Generalized finite element method for multiscale analysis 

    E-print Network

    Zhang, Lin

    2004-11-15

    This dissertation describes a new version of the Generalized Finite Element Method (GFEM), which is well suited for problems set in domains with a large number of internal features (e.g. voids, inclusions, etc.), which are ...

  3. CHAPTER 7. BERYLLIUM ANALYSIS BY NON-PLASMA BASED METHODS

    SciTech Connect

    Ekechukwu, A

    2009-04-20

    The most common method of analysis for beryllium is inductively coupled plasma atomic emission spectrometry (ICP-AES). This method, along with inductively coupled plasma mass spectrometry (ICP-MS), is discussed in Chapter 6. However, other methods exist and have been used for different applications. These methods include spectroscopic, chromatographic, colorimetric, and electrochemical. This chapter provides an overview of beryllium analysis methods other than plasma spectrometry (inductively coupled plasma atomic emission spectrometry or mass spectrometry). The basic methods, detection limits and interferences are described. Specific applications from the literature are also presented.

  4. PRACTICAL METHOD FOR ESTIMATING NEUTRON CROSS SECTION COVARIANCES IN THE RESONANCE REGION

    SciTech Connect

    Cho, Y.S.; Oblozinsky, P.; Mughabghab,S.F.; Mattoon,C.M.; Herman,M.

    2010-04-30

    Recent evaluations of neutron cross section covariances in the resolved resonance region reveal the need for further research in this area. Major issues include declining uncertainties in multigroup representations and proper treatment of scattering radius uncertainty. To address these issues, the present work introduces a practical method based on kernel approximation using resonance parameter uncertainties from the Atlas of Neutron Resonances. Analytical expressions derived for average cross sections in broader energy bins along with their sensitivities provide transparent tool for determining cross section uncertainties. The role of resonance-resonance and bin-bin correlations is specifically studied. As an example we apply this approach to estimate (n,{gamma}) and (n,el) covariances for the structural material {sup 55}Mn.
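The propagation step behind this kind of estimate is a sandwich rule: parameter covariances mapped through sensitivities to bin-averaged cross sections. The matrix sizes and numbers below are invented for the sketch; the kernel-approximation details of the paper are not reproduced:

```python
import numpy as np

# Illustrative sandwich-rule propagation: resonance-parameter uncertainties
# (e.g. from the Atlas of Neutron Resonances) mapped to covariances of
# bin-averaged cross sections via a sensitivity matrix S. Toy numbers.

S = np.array([[0.80, 0.10],   # d(sigma_bin)/d(parameter): 3 bins x 2 parameters
              [0.30, 0.50],
              [0.05, 0.90]])
dp = np.array([0.02, 0.05])   # relative parameter uncertainties
corr_p = np.array([[1.0, 0.2],
                   [0.2, 1.0]])          # resonance-resonance correlation
cov_p = np.outer(dp, dp) * corr_p        # parameter covariance matrix

cov_sigma = S @ cov_p @ S.T              # bin-bin covariance of averaged xs
unc = np.sqrt(np.diag(cov_sigma))        # relative uncertainty per energy bin
corr_bins = cov_sigma / np.outer(unc, unc)  # bin-bin correlation matrix
```

The off-diagonal entries of `corr_bins` show how shared resonance parameters induce the bin-bin correlations the abstract highlights.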

  5. Physical methods for intracellular delivery: practical aspects from laboratory use to industrial-scale processing.

    PubMed

    Meacham, J Mark; Durvasula, Kiranmai; Degertekin, F Levent; Fedorov, Andrei G

    2014-02-01

    Effective intracellular delivery is a significant impediment to research and therapeutic applications at all processing scales. Physical delivery methods have long demonstrated the ability to deliver cargo molecules directly to the cytoplasm or nucleus, and the mechanisms underlying the most common approaches (microinjection, electroporation, and sonoporation) have been extensively investigated. In this review, we discuss established approaches, as well as emerging techniques (magnetofection, optoinjection, and combined modalities). In addition to operating principles and implementation strategies, we address applicability and limitations of various in vitro, ex vivo, and in vivo platforms. Importantly, we perform critical assessments regarding (1) treatment efficacy with diverse cell types and delivered cargo molecules, (2) suitability to different processing scales (from single cell to large populations), (3) suitability for automation/integration with existing workflows, and (4) multiplexing potential and flexibility/adaptability to enable rapid changeover between treatments of varied cell types. Existing techniques typically fall short in one or more of these criteria; however, introduction of micro-/nanotechnology concepts, as well as synergistic coupling of complementary method(s), can improve performance and applicability of a particular approach, overcoming barriers to practical implementation. For this reason, we emphasize these strategies in examining recent advances in development of delivery systems. PMID:23813915

  6. Physical Methods for Intracellular Delivery: Practical Aspects from Laboratory Use to Industrial-Scale Processing

    PubMed Central

    Meacham, J. Mark; Durvasula, Kiranmai; Degertekin, F. Levent; Fedorov, Andrei G.

    2015-01-01

    Effective intracellular delivery is a significant impediment to research and therapeutic applications at all processing scales. Physical delivery methods have long demonstrated the ability to deliver cargo molecules directly to the cytoplasm or nucleus, and the mechanisms underlying the most common approaches (microinjection, electroporation, and sonoporation) have been extensively investigated. In this review, we discuss established approaches, as well as emerging techniques (magnetofection, optoinjection, and combined modalities). In addition to operating principles and implementation strategies, we address applicability and limitations of various in vitro, ex vivo, and in vivo platforms. Importantly, we perform critical assessments regarding (1) treatment efficacy with diverse cell types and delivered cargo molecules, (2) suitability to different processing scales (from single cell to large populations), (3) suitability for automation/integration with existing workflows, and (4) multiplexing potential and flexibility/adaptability to enable rapid changeover between treatments of varied cell types. Existing techniques typically fall short in one or more of these criteria; however, introduction of micro-/nanotechnology concepts, as well as synergistic coupling of complementary method(s), can improve performance and applicability of a particular approach, overcoming barriers to practical implementation. For this reason, we emphasize these strategies in examining recent advances in development of delivery systems. PMID:23813915

  7. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Methods of analysis. 163.5 Section 163.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in cacao products shall be determined by...

  8. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Methods of analysis. 133.5 Section 133.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture, milkfat, and phosphatase levels...

  9. Environmental Impact Analysis: Philosophy and Methods.

    ERIC Educational Resources Information Center

    Ditton, Robert B.; Goodale, Thomas L.

    Proceedings of the Conference on Environmental Impact Analysis held in Green Bay, Wisconsin, January 4-5, 1972, are compiled in this report. The conference served as a forum for exchange of information among State and Federal agencies and educators on experiences with the National Environmental Policy Act of 1970. Hopefully, results of the…

  10. Analysis of Two Methods to Evaluate Antioxidants

    ERIC Educational Resources Information Center

    Tomasina, Florencia; Carabio, Claudio; Celano, Laura; Thomson, Leonor

    2012-01-01

    This exercise is intended to introduce undergraduate biochemistry students to the analysis of antioxidants as a biotechnological tool. In addition, some statistical resources will also be used and discussed. Antioxidants play an important metabolic role, preventing oxidative stress-mediated cell and tissue injury. Knowing the antioxidant content…

  11. Replicating Cluster Analysis: Method, Consistency, and Validity.

    ERIC Educational Resources Information Center

    Breckenridge, James N.

    1989-01-01

    A Monte Carlo study evaluated the effectiveness of three rules of classifying objects into clusters: nearest neighbor classification; nearest centroid assignment; and quadratic discriminant analysis. Results suggest that the nearest neighbor rule is a useful tool for assessing the validity of the clustering procedure of J. H. Ward (1963). (SLD)
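Two of the three classification rules compared in the study can be illustrated directly; the toy data and the agreement check below are our own construction, not the study's Monte Carlo design:

```python
import numpy as np

# Sketch of nearest-neighbor vs. nearest-centroid assignment: objects from a
# replication sample are classified into clusters found in a calibration
# sample, and the two rules' agreement is inspected. Toy two-cluster data.

rng = np.random.default_rng(0)
calib = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
labels = np.array([0] * 20 + [1] * 20)
centroids = np.vstack([calib[labels == k].mean(axis=0) for k in (0, 1)])

def nearest_neighbor(x):
    """Assign x to the cluster of its closest calibration object."""
    return labels[np.argmin(np.linalg.norm(calib - x, axis=1))]

def nearest_centroid(x):
    """Assign x to the cluster with the closest centroid."""
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

new = np.array([[0.1, -0.2], [2.9, 3.1]])
nn = [nearest_neighbor(x) for x in new]
nc = [nearest_centroid(x) for x in new]
agreement = np.mean([a == b for a, b in zip(nn, nc)])
```

On well-separated clusters the rules agree; the study's point is how they diverge, and how that bears on validity, when structure is weaker.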

  12. COAL SAMPLING AND ANALYSIS: METHODS AND MODELS

    EPA Science Inventory

    The report provides information on coal sampling and analysis (CSD) techniques and procedures and presents a statistical model for estimating SO2 emissions. (New Source Performance Standards for large coal-fired boilers and certain State Implementation Plans require operators to ...

  13. Methods for Analysis of Seismic Coda (Michael Fehler)

    E-print Network

    Seismograms vs. distance from an earthquake in Japan: continuous wavetrains appear between the P and S arrivals; early work discussed continuous wave trains in the tail portion of seismograms of local earthquakes (the seismic coda: P coda, S coda, direct P, direct S). Initial analysis approach: coda amplitude decay of the form A_c(t) ~ A_0 t^(-1) e^(-ωt/2Q_c).
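The decay-model tokens in the scraped slides appear to be the standard single-scattering coda form, A(t) = A0 t⁻¹ exp(-πft/Q_c); assuming that form (an assumption, since the slide formula is garbled), Q_c falls out of a straight-line fit of ln(A·t) against lapse time:

```python
import math

# Coda-Q estimation sketch. Model: A(t) = A0 * t**-1 * exp(-pi * f * t / Qc),
# so ln(A * t) = ln(A0) - (pi * f / Qc) * t is linear in lapse time t.
# Synthetic noise-free data; a real analysis fits band-passed envelopes.

f = 6.0            # center frequency [Hz]
Qc_true = 200.0
A0 = 1000.0
ts = [20.0, 30.0, 40.0, 50.0, 60.0]               # lapse times [s]
amps = [A0 / t * math.exp(-math.pi * f * t / Qc_true) for t in ts]

# least-squares slope of ln(A*t) against t
ys = [math.log(a * t) for a, t in zip(amps, ts)]
n = len(ts)
tbar, ybar = sum(ts) / n, sum(ys) / n
slope = sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys)) / \
        sum((t - tbar) ** 2 for t in ts)

Qc_est = -math.pi * f / slope   # recovers Qc_true on exact data
```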

  14. A modified VIKOR method for multiple criteria analysis.

    PubMed

    Chang, Chia-Ling

    2010-09-01

    The VIKOR method was developed to solve multiple criteria decision making (MCDM) problems with conflicting or non-commensurable criteria. This method assumes that compromise is acceptable for conflict resolution. Although the VIKOR method is a popular method applied in multi-criteria analysis (MCA), it has some problems when solving MCDM problems. This study discussed existing problems in the traditional VIKOR method. The objective of this study was to develop a modified VIKOR method to avoid numerical difficulties in solving problems by the traditional VIKOR method. Several synthetic experiments were designed and assessed to verify the improvement of solution efficiency of the modified VIKOR method in MCA. PMID:19672684
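The classical VIKOR computation the paper starts from can be sketched as follows. The epsilon guard against division by zero (one of the numerical difficulties the modified method targets) is our own placeholder, not the paper's modification:

```python
import numpy as np

# Minimal sketch of classical VIKOR: group utility S, individual regret R,
# and compromise index Q (smaller Q ranks higher). The eps guard for tied
# S or R values is an assumed stand-in for the paper's modification.

def vikor(F, weights, benefit, v=0.5, eps=1e-12):
    F = np.asarray(F, dtype=float)
    f_best = np.where(benefit, F.max(axis=0), F.min(axis=0))
    f_worst = np.where(benefit, F.min(axis=0), F.max(axis=0))
    span = np.where(np.abs(f_best - f_worst) < eps, eps, f_best - f_worst)
    D = weights * (f_best - F) / span        # weighted normalized distances
    S = D.sum(axis=1)                        # group utility
    R = D.max(axis=1)                        # individual regret
    Q = (v * (S - S.min()) / max(S.max() - S.min(), eps)
         + (1 - v) * (R - R.min()) / max(R.max() - R.min(), eps))
    return S, R, Q

# Three alternatives scored on two benefit criteria, equal weights
S, R, Q = vikor([[7, 9], [8, 8], [9, 6]], np.array([0.5, 0.5]), [True, True])
best = int(np.argmin(Q))   # the balanced alternative [8, 8] wins here
```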

  15. Participant Interaction in Asynchronous Learning Environments: Evaluating Interaction Analysis Methods

    ERIC Educational Resources Information Center

    Blanchette, Judith

    2012-01-01

    The purpose of this empirical study was to determine the extent to which three different objective analytical methods--sequence analysis, surface cohesion analysis, and lexical cohesion analysis--can most accurately identify specific characteristics of online interaction. Statistically significant differences were found in all points of…

  16. Acceleration of reverse analysis method using hyperbolic activation function

    NASA Astrophysics Data System (ADS)

    Pwasong, Augustine; Sathasivam, Saratha

    2015-10-01

    The hyperbolic activation function is examined for its ability to accelerate the performance of data mining with a technique named the Reverse Analysis method. In this paper, we describe how a Hopfield network performs better with the hyperbolic activation function and is able to induce logical rules from a large database using the Reverse Analysis method: given the values of the connections of a network, we can hope to know what logical rules are entrenched in the database. We limit our analysis to Horn clauses.

  17. Web-Based Systems Development: Analysis and Comparison of Practices in Croatia and Ireland

    NASA Astrophysics Data System (ADS)

    Lang, Michael; Vukovac, Dijana Plantak

    The “dot.com” hysteria which sparked fears of a “Web crisis” a decade ago has long subsided, and firms established in the 1990s now have mature development processes in place. This chapter presents a timely re-assessment of the state of Web development practices, comparing data gathered in Croatia and Ireland. Given the growth in popularity of “agile” methods in the past few years, a secondary objective of this research was to analyse the extent to which Web development practices are guided by or otherwise consistent with the underlying principles of agile development.

  18. Radiation analysis devices, radiation analysis methods, and articles of manufacture

    DOEpatents

    Roybal, Lyle Gene

    2010-06-08

    Radiation analysis devices include circuitry configured to determine respective radiation count data for a plurality of sections of an area of interest and combine the radiation count data of individual of sections to determine whether a selected radioactive material is present in the area of interest. An amount of the radiation count data for an individual section is insufficient to determine whether the selected radioactive material is present in the individual section. An article of manufacture includes media comprising programming configured to cause processing circuitry to perform processing comprising determining one or more correction factors based on a calibration of a radiation analysis device, measuring radiation received by the radiation analysis device using the one or more correction factors, and presenting information relating to an amount of radiation measured by the radiation analysis device having one of a plurality of specified radiation energy levels of a range of interest.
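The combining idea in this patent record (each section's counts alone are inconclusive, the aggregate is not) can be illustrated with ordinary counting statistics. The background-plus-3-sigma threshold and all numbers below are generic assumptions, not the patented circuitry's decision rule:

```python
import math

# Illustrative combination of per-section radiation counts (values invented).
# Each section alone stays under a simple decision threshold; summing the
# sections pushes the aggregate over the corresponding aggregate threshold.

background = 100.0                      # expected background counts per section
sections = [120, 118, 122, 119]         # measured counts per section

def above_threshold(counts, bkg, k=3.0):
    """Generic counting-statistics criterion: counts > bkg + k*sqrt(bkg)."""
    return counts > bkg + k * math.sqrt(bkg)

per_section = [above_threshold(c, background) for c in sections]
combined = above_threshold(sum(sections), background * len(sections))

print(per_section)  # [False, False, False, False] -- each alone inconclusive
print(combined)     # True -- aggregated counts reveal the source
```

The statistical gain comes from the threshold margin growing like sqrt(N) while a real excess grows like N.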

  19. Enriching Careers and Lives: Introducing a Positive, Holistic, and Narrative Career Counseling Method that Bridges Theory and Practice

    ERIC Educational Resources Information Center

    Zikic, Jelena; Franklin, Mark

    2010-01-01

    CareerCycles (CC) career counseling framework and method of practice integrates and builds on aspects of positive psychology. Through its holistic and narrative approach, the CC method seeks to collaboratively identify and understand clients' career and life stories. It focuses on their strengths, desires, preferences, assets, future…

  20. Promoting recovery-oriented practice in mental health services: a quasi-experimental mixed-methods study

    PubMed Central

    2013-01-01

    Background Recovery has become an increasingly prominent concept in mental health policy internationally. However, there is a lack of guidance regarding organisational transformation towards a recovery orientation. This study evaluated the implementation of recovery-orientated practice through training across a system of mental health services. Methods The intervention comprised four full-day workshops and an in-team half-day session on supporting recovery. It was offered to 383 staff in 22 multidisciplinary community and rehabilitation teams providing mental health services across two contiguous regions. A quasi-experimental design was used for evaluation, comparing behavioural intent with staff from a third contiguous region. Behavioural intent was rated by coding points of action on the care plans of a random sample of 700 patients (400 intervention, 300 control), before and three months after the intervention. Action points were coded for (a) focus of action, using predetermined categories of care; and (b) responsibility for action. Qualitative inquiry was used to explore staff understanding of recovery, implementation in services and the wider system, and the perceived impact of the intervention. Semi-structured interviews were conducted with 16 intervention group team leaders post-training and an inductive thematic analysis undertaken. Results A total of 342 (89%) staff received the intervention. Care plans of patients in the intervention group had significantly more changes with evidence of change in the content of patients’ care plans (OR 10.94, 95% CI 7.01-17.07) and the attributed responsibility for the actions detailed (OR 2.95, 95% CI 1.68-5.18). Nine themes emerged from the qualitative analysis split into two superordinate categories. ‘Recovery, individual and practice’, describes the perception and provision of recovery orientated care by individuals and at a team level.
It includes themes on care provision, the role of hope, language of recovery, ownership and multidisciplinarity. ‘Systemic implementation’, describes organizational implementation and includes themes on hierarchy and role definition, training approaches, measures of recovery and resources. Conclusions Training can provide an important mechanism for instigating change in promoting recovery-orientated practice. However, the challenge of systemically implementing recovery approaches requires further consideration of the conceptual elements of recovery, its measurement, and maximising and demonstrating organizational commitment. PMID:23764121

  1. The Vanishing Moment Method for Fully Nonlinear Second Order Partial Differential Equations: Formulation, Theory, and Numerical Analysis

    E-print Network

    Feng, Xiaobing

    2011-01-01

    The vanishing moment method was introduced by the authors in [37] as a reliable methodology for computing viscosity solutions of fully nonlinear second order partial differential equations (PDEs), in particular, using Galerkin-type numerical methods such as finite element methods, spectral methods, and discontinuous Galerkin methods, a task which has not been practicable in the past. The crux of the vanishing moment method is the simple idea of approximating a fully nonlinear second order PDE by a family (parametrized by a small parameter $\varepsilon$) of quasilinear higher order (in particular, fourth order) PDEs. The primary objectives of this book are to present a detailed convergence analysis for the method in the radially symmetric case and to carry out a comprehensive finite element numerical analysis for the vanishing moment equations (i.e., the regularized fourth order PDEs). Abstract methodological and convergence analysis frameworks of conforming finite element methods and mixed finite element methods are ...
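For the model Monge-Ampere equation, the fourth-order regularization family described in the abstract takes the following form (written here from the authors' published formulation as a sketch; the extra boundary condition is one common choice for the added fourth-order term, not the only one):

```latex
% Fully nonlinear problem:  \det(D^2 u) = f in \Omega,  u = g on \partial\Omega.
% Vanishing moment family (quasilinear, fourth order, \varepsilon \to 0^+):
-\varepsilon \Delta^{2} u^{\varepsilon}
  + \det\!\left(D^{2} u^{\varepsilon}\right) = f
  \quad \text{in } \Omega,
\qquad
u^{\varepsilon} = g,
\quad
\Delta u^{\varepsilon} = \varepsilon
  \quad \text{on } \partial\Omega .
```

The biharmonic term $-\varepsilon \Delta^{2}$ makes each member of the family amenable to standard Galerkin methods, and the viscosity solution is recovered in the limit $\varepsilon \to 0^{+}$.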

  2. Adaptive computational methods for aerothermal heating analysis

    NASA Technical Reports Server (NTRS)

    Price, John M.; Oden, J. Tinsley

    1988-01-01

    The development of adaptive gridding techniques for finite-element analysis of fluid dynamics equations is described. The developmental work was done with the Euler equations with concentration on shock and inviscid flow field capturing. Ultimately this methodology is to be applied to a viscous analysis for the purpose of predicting accurate aerothermal loads on complex shapes subjected to high speed flow environments. The development of local error estimate strategies as a basis for refinement strategies is discussed, as well as the refinement strategies themselves. The application of the strategies to triangular elements and a finite-element flux-corrected-transport numerical scheme are presented. The implementation of these strategies in the GIM/PAGE code for 2-D and 3-D applications is documented and demonstrated.

  3. Analysis Resistant Cipher Method and Apparatus

    NASA Technical Reports Server (NTRS)

    Oakley, Ernest C. (Inventor)

    2009-01-01

    A system for encoding and decoding data words including an anti-analysis encoder unit for receiving an original plaintext and producing a recoded data, a data compression unit for receiving the recoded data and producing a compressed recoded data, and an encryption unit for receiving the compressed recoded data and producing an encrypted data. The recoded data has an increased non-correlatable data redundancy compared with the original plaintext in order to mask the statistical distribution of characters in the plaintext data. The system of the present invention further includes a decryption unit for receiving the encrypted data and producing a decrypted data, a data decompression unit for receiving the decrypted data and producing an uncompressed recoded data, and an anti-analysis decoder unit for receiving the uncompressed recoded data and producing a recovered plaintext that corresponds with the original plaintext.
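The three-stage pipeline and its inverse can be sketched end to end. The homophone-style recoder and the XOR keystream below are illustrative stand-ins for the patented anti-analysis encoder and encryption unit, chosen only to show the recode, compress, encrypt order and its reversal:

```python
import random
import zlib

# Sketch of the pipeline in the abstract: recode -> compress -> encrypt,
# then decrypt -> decompress -> decode. The recoder masks byte statistics
# by emitting (random alias, byte XOR alias) pairs: redundant and
# non-correlatable, so plaintext character frequencies are hidden.

def recode(data: bytes, seed: int = 7) -> bytes:
    rng = random.Random(seed)
    out = bytearray()
    for b in data:
        a = rng.randrange(256)
        out += bytes([a, b ^ a])
    return bytes(out)

def unrecode(data: bytes) -> bytes:
    return bytes(data[i] ^ data[i + 1] for i in range(0, len(data), 2))

def xor_cipher(data: bytes, key: bytes = b"k3y") -> bytes:
    """Toy self-inverse cipher; a real system would use a proper algorithm."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"attack at dawn, attack at dawn"
wire = xor_cipher(zlib.compress(recode(plaintext)))
back = unrecode(zlib.decompress(xor_cipher(wire)))
assert back == plaintext
```

Note the ordering: the statistical masking happens before compression, matching the claim that the recoded data, not the plaintext, feeds the compressor.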

  4. Method Analysis of Microbial Resistant Gypsum Products

    EPA Science Inventory

    Abstract: Several commercially available gypsum products are marketed as microbial-resistant. During previous test method research on a microbial resistant gypsum wallboard study, a common theme from both stakeholders and product vendors was the need for a unified and accepted m...

  5. Combining the soilwater balance and water-level fluctuation methods to estimate natural groundwater recharge: Practical aspects

    USGS Publications Warehouse

    Sophocleous, M.A.

    1991-01-01

    A relatively simple and practical approach for calculating groundwater recharge in semiarid plain environments with a relatively shallow water table, such as the Kansas Prairies, is outlined. Major uncertainties in the Darcian, water balance, and groundwater fluctuation analysis approaches are outlined, and a combination methodology for reducing some of the uncertainties is proposed. By combining a storm-based soilwater balance (lasting several days) with the resulting water table rise, effective storativity values of the region near the water table are obtained. This combination method is termed the 'hybrid water-fluctuation method'. Using a simple average of several such estimates results in a site-calibrated effective storativity value that can be used to translate each major water-table rise tied to a specific storm period into a corresponding amount of groundwater recharge. Examples of soilwater balance and water-level fluctuation analyses based on field-measured data from Kansas show that the proposed methodology gives better and more reliable results than either of the two well-established approaches used singly. © 1991.
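The arithmetic of the hybrid method is compact enough to sketch; all values below are invented for illustration, not from the Kansas data:

```python
# Numerical sketch of the 'hybrid water-fluctuation method': a storm-period
# soil-water balance gives the drainage past the root zone, and dividing by
# the observed water-table rise yields an effective storativity. Averaging
# over storms calibrates the site; later rises then convert to recharge.

# (storm drainage from soil-water balance [mm], water-table rise [mm])
calibration_storms = [(12.0, 80.0), (18.0, 110.0), (9.0, 60.0)]

sy_values = [drain / rise for drain, rise in calibration_storms]
sy_eff = sum(sy_values) / len(sy_values)   # site-calibrated storativity

rises = [70.0, 95.0, 40.0]                 # later storm-period rises [mm]
recharge = [sy_eff * r for r in rises]     # recharge per event [mm]
annual = sum(recharge)                     # ~31.7 mm over these events
```

Calibrating storativity from the same storms that produce the rises is what lets the fluctuation record stand in for a full continuous water balance.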

  6. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, T. H. H.

    1985-01-01

    Advanced stress analysis methods applicable to turbine engine structures are investigated. Constructions of special elements which contain traction-free circular boundaries are investigated. New versions of the mixed variational principle and of hybrid stress elements are formulated. A method is established for the suppression of kinematic deformation modes. SemiLoof plate and shell elements are constructed by the assumed stress hybrid method. An elastic-plastic analysis is conducted by viscoplasticity theory using the mechanical subelement model.

  7. Hydrologic analysis for selection and placement of conservation practices at the watershed scale

    NASA Astrophysics Data System (ADS)

    Wilson, C.; Brooks, E. S.; Boll, J.

    2012-12-01

    When a water body is exceeding water quality standards and a Total Maximum Daily Load has been established, conservation practices in the watershed are able to reduce point and non-point source pollution. Hydrological analysis is needed to place conservation practices in the most hydrologically sensitive areas. The selection and placement of conservation practices, however, is challenging in ungauged watersheds with little or no data for the hydrological analysis. The objective of this research is to perform a hydrological analysis for mitigation of erosion and total phosphorus in a mixed land use watershed, and to select and place the conservation practices in the most sensitive areas. The study area is the Hangman Creek watershed in Idaho and Washington State, upstream of Long Lake (WA) reservoir, east of Spokane, WA. While the pollutant of concern is total phosphorus (TP), reductions in TP were translated to total suspended solids or reductions in nonpoint source erosion and sediment delivery to streams. Hydrological characterization was done with a simple web-based tool, which runs the Water Erosion Prediction Project (WEPP) model for representative land types in the watersheds, where a land type is defined as a unique combination of soil type, slope configuration, land use and management, and climate. The web-based tool used site-specific spatial and temporal data on land use, soil physical parameters, slope, and climate derived from readily available data sources and provided information on potential pollutant pathways (i.e. erosion, runoff, lateral flow, and percolation). Multiple land types representative in the watershed were ordered from most effective to least effective, and displayed spatially using GIS. The methodology for the Hangman Creek watershed was validated in the nearby Paradise Creek watershed that has long-term stream discharge and monitoring as well as land use data. 
Output from the web-based tool shows the potential reductions for different tillage practices, buffer strips, streamside management, and conversion to the conservation reserve program in the watershed. The output also includes the relationship between land area where conservation practices are placed and the potential reduction in pollution, showing the diminished returns on investment as less sensitive areas are being treated. This application of a simple web-based tool and the use of a physically-based erosion model (i.e. WEPP) illustrates that quantitative, spatial and temporal analysis of changes in pollutant loading and site-specific recommendations of conservation practices can be made in ungauged watersheds.

  8. Linguistically Diverse Students and Special Education: A Mixed Methods Study of Teachers' Attitudes, Coursework, and Practice

    ERIC Educational Resources Information Center

    Greenfield, Renee A.

    2011-01-01

    While the number of linguistically diverse students (LDS) grows steadily in the U.S., schools, research and practice to support their education lag behind (Lucas & Grinberg, 2008). Research that describes the attitudes and practices of teachers who serve LDS and how those attitudes and practice intersect with language and special education is…

  9. Perceptions and practices of commensality and solo-eating among Korean and Japanese university students: A cross-cultural analysis

    PubMed Central

    Cho, Wookyoun; Oh, Yujin; Aiba, Naomi; Lee, Youngmee

    2015-01-01

    BACKGROUND/OBJECTIVES Commensality, eating together with others, is a major representation of human sociality. In recent times, environments around commensality have changed significantly due to rapid social changes, and the decline of commensality is perceived as a serious concern in many modern societies. This study employs a cross-cultural analysis of university students in two East Asian countries, and examines cross-cultural variations of perceptions and actual practices of commensality and solo-eating. SUBJECTS/METHODS The analysis was drawn from a free-listing survey and a self-administered questionnaire of university students in urban Korea and Japan. The free-listing survey was conducted with a small cohort to explore common images and meanings of commensality and solo-eating. The self-administered questionnaire was developed based on the result of the free-listing survey, and conducted with a larger cohort to examine reasons and problems of practices and associated behaviors and food intake. RESULTS We found that Korean subjects tended to show stronger associations between solo-eating and negative emotions while the Japanese subjects expressed mixed emotions towards the practice of solo-eating. In the questionnaire, more Korean students reported that they prefer commensality and tend to eat larger quantities when they eat commensally. In contrast, more Japanese reported that they have no preference for commensality and that there is no notable difference in food quantities whether they eat commensally or alone. Compared with the general Korean cohort, a larger proportion of the overweight and obese Korean subjects than of the normal-weight and underweight groups reported that they tend to eat more when they are alone. This difference was not found in the overweight Japanese subjects. CONCLUSION Our study revealed cross-cultural variations of perceptions and practices of commensality and solo-eating in a non-western setting. PMID:26425283

  10. METHOD OF CHEMICAL ANALYSIS FOR OIL SHALE WASTES

    EPA Science Inventory

    Several methods of chemical analysis are described for oil shale wastewaters and retort gases. These methods are designed to support the field testing of various pollution control systems. As such, emphasis has been placed on methods which are rapid and sufficiently rugged to per...

  11. Development and analysis of atomistic-to-continuum coupling methods

    E-print Network

    Ciocan-Fontanine, Ionut

    This analysis has clarified the relation between the various methods and the sources of error. The aim is the development of coupling methods for crystalline materials that are reliable and accurate for configurations near …, and to propose more reliable, accurate, and efficient methods.

  12. Combined Finite Element --Finite Volume Method ( Convergence Analysis )

    E-print Network

    Magdeburg, Universität

    Combined Finite Element -- Finite Volume Method (Convergence Analysis). Mária Luk… The idea is to combine finite volume and finite element methods in an appropriate way: nonlinear convective terms are discretized by the finite volume method over a dual grid, while diffusion terms are discretized by the conforming piecewise linear finite element method

  13. ANALYSIS OF SOME MOVING SPACE-TIME FINITE ELEMENT METHODS

    E-print Network

    Bank, Randolph E.

    ANALYSIS OF SOME MOVING SPACE-TIME FINITE ELEMENT METHODS. Randolph E. Bank and Rafael F. Santos. Abstract: Two space-time finite element methods for solving time-dependent partial differential equations are defined and analyzed. The methods are based on the use of isoparametric finite elements

  14. Technology transfer through a network of standard methods and recommended practices - The case of petrochemicals

    NASA Astrophysics Data System (ADS)

    Batzias, Dimitris F.; Karvounis, Sotirios

    2012-12-01

    Technology transfer may take place in parallel with cooperative action between companies participating in the same organizational scheme or using one another as subcontractor (outsourcing). In this case, cooperation should be realized by means of Standard Methods and Recommended Practices (SRPs) to achieve (i) quality of intermediate/final products according to specifications and (ii) industrial process control as required to guarantee such quality with minimum deviation (corresponding to maximum reliability) from preset mean values of representative quality parameters. This work deals with the design of the network of SRPs needed in each case for successful cooperation, implying also the corresponding technology transfer, effectuated through a methodological framework developed in the form of an algorithmic procedure with 20 activity stages and 8 decision nodes. The functionality of this methodology is proved by presenting the path leading from (and relating) a standard test method for toluene, as petrochemical feedstock in the toluene diisocyanate production, to the (6 generations distance upstream) performance evaluation of industrial process control systems (i.e., from ASTM D5606 to BS EN 61003-1:2004 in the SRPs network).

  15. Vitamin D Status of Clinical Practice Populations at Higher Latitudes: Analysis and Applications

    PubMed Central

    Genuis, Stephen J.; Schwalfenberg, Gerry K.; Hiltz, Michelle N.; Vaselenak, Sharon A.

    2009-01-01

    Background: Inadequate levels of vitamin D (VTD) throughout the life cycle from the fetal stage to adulthood have been correlated with elevated risk for assorted health afflictions. The purpose of this study was to ascertain VTD status and associated determinants in three clinical practice populations living in Edmonton, Alberta, Canada - a locale with latitude of 53°30’N, where sun exposure from October through March is often inadequate to generate sufficient vitamin D. Methods: To determine VTD status, 1,433 patients from three independent medical offices in Edmonton had levels drawn for 25(OH)D as part of their medical assessment between Jun 2001 and Mar 2007. The relationship between demographic data and lifestyle parameters with VTD status was explored. 25(OH)D levels were categorized as follows: (1) Deficient: <40 nmol/L; (2) Insufficient (moderate to mild): 40 to <80 nmol/L; and (3) Adequate: 80–250 nmol/L. Any cases <25 nmol/L were subcategorized as severely deficient for purposes of further analysis. Results: 240 (16.75% of the total sample) of 1,433 patients were found to be VTD ‘deficient’ of which 48 (3.35% of the overall sample) had levels consistent with severe deficiency. 738 (51.5% of the overall sample) had ‘insufficiency’ (moderate to mild) while only 31.75% had ‘adequate’ 25(OH)D levels. The overall mean for 25(OH)D was 68.3 nmol/L with SD=28.95. VTD status was significantly linked with demographic and lifestyle parameters including skin tone, fish consumption, milk intake, sun exposure, tanning bed use and nutritional supplementation. Conclusion: A high prevalence of hypovitaminosis-D was found in three clinical practice populations living in Edmonton. In view of the potential health sequelae associated with widespread VTD inadequacy, strategies to facilitate translation of emerging epidemiological information into clinical intervention need to be considered in order to address this public health issue.
A suggested VTD supplemental intake level is presented for consideration. PMID:19440275

  16. The Microboudin Method: a New Paleostress Analysis

    NASA Astrophysics Data System (ADS)

    Kimura, N.; Okamoto, A.; Masuda, T.

    2007-12-01

    The microboudin method has recently been established as a new paleopiezometer. The method is successfully applied to monomineralic metamorphic tectonites such as metacherts and marbles that contain microboudinage structures of columnar minerals (tourmaline, epidote, etc.). The absolute magnitude of the far-field paleodifferential stress σ0 is estimated from two parameters: a dimensionless stress parameter, and the extensional fracture strength S0* of micrometer-scale columnar mineral grains. The dimensionless stress parameter is derived from the mechanical model for microboudinage based on the shear-lag model and the Weibull theory. The values of S0* for tourmaline and epidote were obtained from directly measured flexural strength by considering the size and shape of these minerals and the influence of time on fracturing. The values of σ0 estimated by this method for samples from Aksu (China), Eskisehir (Turkey), Wadi Tayin (Sultanate of Oman), Asemi (Japan), Greenbushes (Australia), and Sausar (India) are in the range of 1-15 MPa. These σ0-values correspond to the differential stress at mid-crustal levels (> 10 km).
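
    The fracture strength of micrometer-scale grains cannot be measured directly, which is why grain size must be accounted for when converting measured flexural strengths. A hedged sketch of the weakest-link (Weibull) size scaling commonly used for such corrections; the reference values and modulus below are illustrative, not the paper's:

```python
def weibull_scaled_strength(s_ref, v_ref, v, m):
    """Weakest-link scaling: strength at volume v, given strength s_ref at
    reference volume v_ref, for Weibull modulus m.
    Smaller grains are statistically stronger."""
    return s_ref * (v_ref / v) ** (1.0 / m)

# Illustrative: a grain one millionth of the reference volume, modulus m = 10
s = weibull_scaled_strength(s_ref=10.0, v_ref=1.0, v=1e-6, m=10)
```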

  17. Nurses’ self-efficacy and practices relating to weight management of adult patients: a path analysis

    PubMed Central

    2013-01-01

    Background Health professionals play a key role in the prevention and treatment of excess weight and obesity, but many have expressed a lack of confidence in their ability to manage obese patients, and their delivery of weight-management care remains limited. The specific mechanism underlying inadequate practices in professional weight management remains unclear. The primary purpose of this study was to examine a self-efficacy theory-based model in understanding Registered Nurses’ (RNs) professional performance relating to weight management. Methods A self-report questionnaire was developed based upon the hypothesized model and administered to a convenience sample of 588 RNs. Data were collected regarding socio-demographic variables, psychosocial variables (attitudes towards obese people, professional role identity, teamwork beliefs, perceived skills, perceived barriers and self-efficacy) and professional weight management practices. Structural equation modeling was conducted to identify correlations between the above variables and to test the goodness of fit of the proposed model. Results The survey response rate was 71.4% (n = 420). The respondents reported a moderate level of weight management practices. Self-efficacy directly and positively predicted the weight management practices of the RNs (β = 0.36, p < 0.001), and the other psychosocial variables influenced practices indirectly through self-efficacy. The final model constructed in this study demonstrated a good fit to the data [χ2(14) = 13.90, p = 0.46; GFI = 0.99; AGFI = 0.98; NNFI = 1.00; CFI = 1.00; RMSEA = 0.00; AIC = 57.90], accounting for 38.4% and 43.2% of the variance in weight management practices and self-efficacy, respectively. Conclusions Self-efficacy theory appears to be useful in understanding the weight management practices of RNs. Interventions targeting the enhancement of self-efficacy may be effective in promoting RNs’ professional performance in managing overweight and obese patients. PMID:24304903
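
    The paper's structural equation model is beyond a short example, but the core idea of a path analysis, decomposing a total effect into direct and indirect (mediated) components, can be sketched with simple least-squares slopes. This is an illustration on made-up numbers, not the authors' model:

```python
def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Hypothetical causal chain: skills -> self-efficacy -> practices
skills        = [1, 2, 3, 4]
self_efficacy = [2, 4, 6, 8]     # slope a = 2 per unit of skills
practices     = [6, 12, 18, 24]  # slope b = 3 per unit of self-efficacy

a = slope(skills, self_efficacy)
b = slope(self_efficacy, practices)
total = slope(skills, practices)
indirect = a * b                 # mediated (indirect) effect
direct = total - indirect        # effect remaining after mediation
```

In this toy chain the entire effect of skills on practices is carried through self-efficacy, so the direct component vanishes.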

  18. Practical experience with the input/loss method as applied to a CFB power plant

    SciTech Connect

    Deihl, B.; Lang, F.D.

    1999-07-01

    In late 1995 the Input/Loss Method was installed for on-line monitoring of an independent power producer located in Colver, PA. Colver is a 115 MWe Circulating Fluidized Bed (CFB) steam generator burning poor quality coal, typically having ±20% variation in As-Fired heating value, which presents considerable difficulties to operating staff. The Input/Loss Method provides a complete thermal understanding of a power plant through explicit determinations of fuel flow, emission flows, fuel chemistry, fuel heating value and thermal efficiency. Direct measurements of fuel or emission flows are not made. In addition, the Method employs a Fuel Consumption Index (FCI) technology to alert the operator as to which components/processes within the system have higher irreversible losses--in terms of higher fuel consumption for a given power level--and thus where improved heat rate can be found. The Method also uses a Sulfur Function Optimizer (SFO) parameter, which assists the operator in minimizing the use of limestone while meeting regulatory SO2 effluent limits. This paper discusses a number of actual operational situations, taken over the past four years, which were resolved with the help of the Input/Loss Method: for example, the usefulness of the SFO parameter, tracking FCIs for key processes, plant-evaluated economics, etc. Problems with required effluent (CEMS) instrumentation, experience with a CEMS error analysis procedure, stoichiometric assumptions, air leakage assumptions, etc. are also discussed.
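
    Since fuel flow is not measured directly, it must be implied from power output, thermal efficiency and heating value. A hedged sketch of that back-calculation, which also shows why a ±20% swing in heating value is so troublesome; all numbers are illustrative, not Colver plant data:

```python
def implied_fuel_flow(power_mw, efficiency, hhv_mj_per_kg):
    """Fuel mass flow (kg/s) implied by net power and thermal efficiency:
    power (MJ/s) = efficiency * fuel_flow (kg/s) * HHV (MJ/kg)."""
    return power_mw / (efficiency * hhv_mj_per_kg)

nominal = implied_fuel_flow(power_mw=115.0, efficiency=0.33, hhv_mj_per_kg=20.0)
low_hv  = implied_fuel_flow(power_mw=115.0, efficiency=0.33, hhv_mj_per_kg=16.0)  # -20% HHV
```

A 20% drop in heating value raises the implied fuel flow by 25% at the same load, which is why explicit heating-value determination matters for this kind of monitoring.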

  19. A Case Study of Architecting Security Requirements in Practice: Initial Analysis

    E-print Network

    Perry, Dewayne E.

    Understanding how architects manage security requirements in practice is a necessary first step in providing repeatable results. This initial analysis draws on multiple cases of practicing security architects and identifies key aspects in security requirements.

  20. Learning in the Permaculture Community of Practice in England: An Analysis of the Relationship between Core Practices and Boundary Processes

    ERIC Educational Resources Information Center

    Ingram, Julie; Maye, Damian; Kirwan, James; Curry, Nigel; Kubinakova, Katarina

    2014-01-01

    Purpose: This article utilizes the Communities of Practice (CoP) framework to examine learning processes among a group of permaculture practitioners in England, specifically examining the balance between core practices and boundary processes. Design/methodology/approach: The empirical basis of the article derives from three participatory workshops…

  1. Spectral analysis method for detecting an element

    DOEpatents

    Blackwood, Larry G [Idaho Falls, ID; Edwards, Andrew J [Idaho Falls, ID; Jewell, James K [Idaho Falls, ID; Reber, Edward L [Idaho Falls, ID; Seabury, Edward H [Idaho Falls, ID

    2008-02-12

    A method for detecting an element is described and which includes the steps of providing a gamma-ray spectrum which has a region of interest which corresponds with a small amount of an element to be detected; providing nonparametric assumptions about a shape of the gamma-ray spectrum in the region of interest, and which would indicate the presence of the element to be detected; and applying a statistical test to the shape of the gamma-ray spectrum based upon the nonparametric assumptions to detect the small amount of the element to be detected.
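
    The patent does not spell out the statistical test, but the flavor of a nonparametric shape test on a region of interest can be sketched with a one-sided sign test: under the no-element null hypothesis, each ROI channel is equally likely to fall above or below the local baseline. Illustrative only, not the patented procedure:

```python
from math import comb

def sign_test_p(n_above, n_total):
    """One-sided binomial tail P(X >= n_above) when each channel
    independently exceeds the baseline with probability 1/2."""
    return sum(comb(n_total, k) for k in range(n_above, n_total + 1)) / 2 ** n_total

# Toy case: 7 of 8 ROI channels sit above the local baseline median
p_value = sign_test_p(7, 8)   # 9/256, small enough to flag a possible peak
```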

  2. A Government/Industry Summary of the Design Analysis Methods for Vibrations (DAMVIBS) Program

    NASA Technical Reports Server (NTRS)

    Kvaternik, Raymond G. (compiler)

    1993-01-01

    The NASA Langley Research Center in 1984 initiated a rotorcraft structural dynamics program, designated DAMVIBS (Design Analysis Methods for VIBrationS), with the objective of establishing the technology base needed by the rotorcraft industry for developing an advanced finite-element-based dynamics design analysis capability for vibrations. An assessment of the program showed that the DAMVIBS Program has resulted in notable technical achievements and major changes in industrial design practice, all of which have significantly advanced the industry's capability to use and rely on finite-element-based dynamics analyses during the design process.

  3. Women's Access and Provider Practices for the Case Management of Malaria during Pregnancy: A Systematic Review and Meta-Analysis

    PubMed Central

    Hill, Jenny; D'Mello-Guyett, Lauren; Hoyt, Jenna; van Eijk, Anna M.; ter Kuile, Feiko O.; Webster, Jayne

    2014-01-01

    Background WHO recommends prompt diagnosis and quinine plus clindamycin for treatment of uncomplicated malaria in the first trimester and artemisinin-based combination therapies in subsequent trimesters. We undertook a systematic review of women's access to and healthcare provider adherence to WHO case management policy for malaria in pregnant women. Methods and Findings We searched the Malaria in Pregnancy Library, the Global Health Database, and the International Network for the Rational Use of Drugs Bibliography from 1 January 2006 to 3 April 2014, without language restriction. Data were appraised for quality and content. Frequencies of women's and healthcare providers' practices were explored using narrative synthesis and random effect meta-analysis. Barriers to women's access and providers' adherence to policy were explored by content analysis using NVivo. Determinants of women's access and providers' case management practices were extracted and compared across studies. We did not perform a meta-ethnography. Thirty-seven studies were included, conducted in Africa (30), Asia (4), Yemen (1), and Brazil (2). One- to three-quarters of women reported malaria episodes during pregnancy, of whom treatment was sought by >85%. Barriers to access among women included poor knowledge of drug safety, prohibitive costs, and self-treatment practices, used by 5%–40% of women. Determinants of women's treatment-seeking behaviour were education and previous experience of miscarriage and antenatal care. Healthcare provider reliance on clinical diagnosis and poor adherence to treatment policy, especially in first versus other trimesters (28%, 95% CI 14%–47%, versus 72%, 95% CI 39%–91%, p = 0.02), was consistently reported. Prescribing practices were driven by concerns over side effects and drug safety, patient preference, drug availability, and cost. Determinants of provider practices were access to training and facility type (public versus private). 
Findings were limited by the availability, quality, scope, and methodological inconsistencies of the included studies. Conclusions A systematic assessment of the extent of substandard case management practices of malaria in pregnancy is required, as well as quality improvement interventions that reach all providers administering antimalarial drugs in the community. Pregnant women need access to information on which anti-malarial drugs are safe to use at different stages of pregnancy. Please see later in the article for the Editors' Summary PMID:25093720

  4. Analysis of hemoglobin electrophoresis results and physicians investigative practices in Saudi Arabia

    PubMed Central

    Mehdi, Syed Riaz; Al Dahmash, Badr Abdullah

    2013-01-01

    BACKGROUND AND OBJECTIVES: Riyadh and the central province fall in a zone of moderate prevalence of hemoglobinopathies in Saudi Arabia. However, it has been observed that physicians working in Saudi Arabia invariably refer all cases of anemia for hemoglobin electrophoresis (HE). The present work was carried out to study the yield of HE in Riyadh and the investigative practices of the physicians advising HE. SETTINGS AND DESIGN: The study was carried out in the hospitals of King Saud University from 2009 to 2011 in order to assess the yield of HE in referred cases of clinical anemia. MATERIALS AND METHODS: A total of 1073 cases, divided into two groups of males and females, underwent complete blood count and red blood cell morphology. Cellulose acetate HE was performed and all positive results were reconfirmed by high performance liquid chromatography (HPLC). The results were analyzed for the type of hemoglobinopathy. For statistical analysis, Statistical Package for Social Sciences version 15 (SPSS Inc., Chicago, IL, USA) was used. RESULTS: Blood samples from 405 males and 668 females were included in the present study. 116 (28.5%) males and 167 (25%) females showed an abnormal pattern on HE. The incidence of beta thalassemia trait was higher in females, while sickle cell trait was seen predominantly in males. Red cell indices were reduced considerably in thalassemias but were unaffected in sickle cell disorders, except those with a concurrent alpha trait. The total yield of HE was 26.6%, which was much less than expected. CONCLUSION: Physicians are advised to rule out iron deficiency and other common causes of anemia before investigating cases for hemoglobinopathies, which employs the time-consuming and expensive tests of HE and HPLC. PMID:24339548

  5. Bayes Method Plant Aging Risk Analysis

    Energy Science and Technology Software Center (ESTSC)

    1992-03-13

    DORIAN is an integrated package for performing Bayesian aging analysis of reliability data, e.g. for identifying trends in component failure rates and/or outage durations as a function of time. The user must specify several alternative hypothesized aging models (i.e. possible trends) along with prior probabilities indicating the subjective probability that each trend is actually the correct one. DORIAN then uses component failure and/or repair data over time to update these prior probabilities and develop a posterior probability for each aging model, representing the probability that each model is the correct one in light of the observed data rather than a priori. Mean, median, and 5th and 95th percentile trends are also compiled from the posterior probabilities.
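
    The prior-to-posterior update described above is ordinary Bayes' rule over a discrete set of candidate models. A minimal sketch with assumed numbers; the likelihood values are placeholders, not DORIAN output:

```python
def posterior(priors, likelihoods):
    """Posterior model probabilities: prior * likelihood, renormalized."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

# Two hypothesized trends: constant failure rate vs. increasing (aging) rate
priors = [0.7, 0.3]          # subjective prior probabilities
likelihoods = [0.02, 0.10]   # P(observed failure data | model), assumed
post = posterior(priors, likelihoods)
```

Although the aging model started with less prior weight, the assumed data shift the posterior in its favor, which is exactly the kind of trend identification the package automates.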

  6. Improved dynamic analysis method using load-dependent Ritz vectors

    NASA Technical Reports Server (NTRS)

    Escobedo-Torres, J.; Ricles, J. M.

    1993-01-01

    The dynamic analysis of large space structures is important in order to predict their behavior under operating conditions. Computer models of large space structures are characterized by a large number of degrees of freedom, and the computational effort required to carry out the analysis is very large. Conventional methods of solution utilize a subset of the eigenvectors of the system, but for systems with many degrees of freedom, the solution of the eigenproblem is in many cases the most costly phase of the analysis. For this reason, alternate solution methods need to be considered. It is important that the method chosen for the analysis be efficient and that accurate results be obtainable. The load-dependent Ritz vector method is presented as an alternative to the classical normal mode methods for obtaining dynamic responses of large space structures. A simplified model of a space station is used to compare results. Results show that the load-dependent Ritz vector method predicts the dynamic response better than the classical normal mode method. Even though this alternate method is very promising, further studies are necessary to fully understand its attributes and limitations.
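
    The recurrence behind load-dependent Ritz vectors is short: solve a static problem for the applied load, then repeatedly solve with the mass-weighted previous vector, M-orthogonalizing and M-normalizing as you go. A 2-DOF sketch of that recurrence (illustrative, not the paper's implementation):

```python
def solve2(A, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - b[0] * A[1][0]) / det]

def m_dot(u, v, M):
    """Inner product weighted by the mass matrix M."""
    return sum(u[i] * M[i][j] * v[j] for i in range(2) for j in range(2))

def ritz_vectors(K, M, f, count):
    """Generate `count` load-dependent Ritz vectors for stiffness K,
    mass M and spatial load distribution f."""
    vecs = []
    x = solve2(K, f)                      # static response to the load
    for _ in range(count):
        for v in vecs:                    # M-orthogonalize against earlier vectors
            c = m_dot(x, v, M)
            x = [x[i] - c * v[i] for i in range(2)]
        norm = m_dot(x, x, M) ** 0.5
        x = [xi / norm for xi in x]       # M-normalize
        vecs.append(x)
        # next candidate: solve K x_next = M x
        x = solve2(K, [sum(M[i][j] * x[j] for j in range(2)) for i in range(2)])
    return vecs

K = [[2.0, -1.0], [-1.0, 2.0]]   # stiffness matrix (assumed)
M = [[1.0, 0.0], [0.0, 1.0]]     # mass matrix (assumed)
v1, v2 = ritz_vectors(K, M, [1.0, 0.0], 2)
```

Because the first vector is the static response to the actual load, a handful of these vectors often captures the response better than the same number of exact eigenvectors.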

  7. Nonlinear analysis of structures. [within framework of finite element method

    NASA Technical Reports Server (NTRS)

    Armen, H., Jr.; Levine, H.; Pifko, A.; Levy, A.

    1974-01-01

    The development of nonlinear analysis techniques within the framework of the finite-element method is reported. Although the emphasis is concerned with those nonlinearities associated with material behavior, a general treatment of geometric nonlinearity, alone or in combination with plasticity is included, and applications presented for a class of problems categorized as axisymmetric shells of revolution. The scope of the nonlinear analysis capabilities includes: (1) a membrane stress analysis, (2) bending and membrane stress analysis, (3) analysis of thick and thin axisymmetric bodies of revolution, (4) a general three dimensional analysis, and (5) analysis of laminated composites. Applications of the methods are made to a number of sample structures. Correlation with available analytic or experimental data range from good to excellent.

  8. Extraction of brewer's yeasts using different methods of cell disruption for practical biodiesel production.

    PubMed

    ?ezanka, Tomáš; Matoulková, Dagmar; Kolouchová, Irena; Masák, Jan; Viden, Ivan; Sigler, Karel

    2015-05-01

    The methods of preparation of fatty acids from brewer's yeast and their use in the production of biofuels and in different branches of industry are described. Isolation of fatty acids from cell lipids includes cell disintegration (e.g., with liquid nitrogen, KOH, NaOH, petroleum ether, nitrogenous basic compounds, etc.) and subsequent processing of the extracted lipids, including fatty acid analysis and computation of biodiesel properties such as viscosity, density, cloud point, and cetane number. Methyl esters obtained from brewer's waste yeast are well suited for the production of biodiesel. All 49 samples (7 breweries and 7 methods) meet the requirements for biodiesel quality in both the composition of fatty acids and the properties of the biofuel required by the US and EU standards. PMID:25394535
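
    Several of the quoted fuel properties can be estimated from the fatty acid profile. One widely used empirical correlation estimates the cetane number from the saponification value (SV) and iodine value (IV) of the methyl esters; sketched here with plausible, made-up inputs:

```python
def estimate_cetane_number(sv, iv):
    """Empirical cetane number of a methyl-ester fuel from its
    saponification value (SV) and iodine value (IV)."""
    return 46.3 + 5458.0 / sv - 0.225 * iv

cn = estimate_cetane_number(sv=200.0, iv=60.0)  # illustrative analytical values
```

Profile-based estimates like this give a quick screen against standard limits (the EU biodiesel standard EN 14214 requires a cetane number of at least 51) before full engine testing.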

  9. Method for photon activation positron annihilation analysis

    DOEpatents

    Akers, Douglas W.

    2006-06-06

    A non-destructive testing method comprises providing a specimen having at least one positron emitter therein; determining a threshold energy for activating the positron emitter; and determining whether a half-life of the positron emitter is less than a selected half-life. If the half-life of the positron emitter is greater than or equal to the selected half-life, then activating the positron emitter by bombarding the specimen with photons having energies greater than the threshold energy and detecting gamma rays produced by annihilation of positrons in the specimen. If the half-life of the positron emitter is less than the selected half-life, then alternately activating the positron emitter by bombarding the specimen with photons having energies greater than the threshold energy and detecting gamma rays produced by positron annihilation within the specimen.

  10. Analysis of the transient response of shell structures by numerical methods.

    NASA Technical Reports Server (NTRS)

    Geers, T. L.; Sobel, L. H.

    1972-01-01

    This paper examines some basic considerations underlying dynamic shell response analysis and the impact of these considerations upon the practical aspects of solution by numerical methods. Emphasis is placed on the solution of linear problems. The present states of development of the finite difference and finite element methods are reviewed, and techniques for the treatment of temporal variation are discussed. An examination is made of the frequency parameters characteristic of thin shell theory, applied excitations, and spatial mesh geometries, and the significance of these parameters with respect to computational convergence is illustrated.
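
    The interplay between temporal discretization and the frequency content discussed above is easiest to see in an explicit integrator, whose stable time step is bounded by the highest frequency present. A minimal central-difference sketch for one undamped degree of freedom (illustrative, not the paper's formulation):

```python
def central_difference(m, k, x0, v0, dt, steps):
    """Integrate m*x'' + k*x = 0 with the explicit central-difference method.
    Stable only for dt < 2/omega, where omega = sqrt(k/m)."""
    a0 = -k * x0 / m
    x_prev = x0 - dt * v0 + 0.5 * dt * dt * a0   # fictitious step before t = 0
    x = x0
    history = [x0]
    for _ in range(steps):
        a = -k * x / m
        x_next = 2.0 * x - x_prev + dt * dt * a
        x_prev, x = x, x_next
        history.append(x)
    return history

# Oscillator with omega = 1 rad/s: one full period lasts 2*pi seconds
xs = central_difference(m=1.0, k=1.0, x0=1.0, v0=0.0, dt=0.01, steps=628)
```

With dt well below the stability limit the response returns close to its initial amplitude after one period; pushing dt toward 2/omega degrades accuracy and finally blows up, which is the convergence behavior the paper examines.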

  11. Implementation of infection control best practice in intensive care units throughout Europe: a mixed-method evaluation study

    PubMed Central

    2013-01-01

    Background The implementation of evidence-based infection control practices is essential, yet challenging for healthcare institutions worldwide. Although it is acknowledged that implementation success varies with contextual factors, little is known regarding the most critical specific conditions within the complex cultural milieu of varying economic, political, and healthcare systems. Given the increasing reliance on unified global schemes to improve patient safety and healthcare effectiveness, research on this topic is needed and timely. The ‘InDepth’ work package of the European FP7 Prevention of Hospital Infections by Intervention and Training (PROHIBIT) consortium aims to assess barriers and facilitators to the successful implementation of catheter-related bloodstream infection (CRBSI) prevention in intensive care units (ICU) across several European countries. Methods We use a qualitative case study approach in the ICUs of six purposefully selected acute care hospitals among the 15 participants in the PROHIBIT CRBSI intervention study. As sensitizing schemes we apply the theory of diffusion of innovation, published implementation frameworks, sensemaking, and new institutionalism. We conduct interviews with hospital health providers/agents at different organizational levels, together with ethnographic observations, rich artifact collection, and photography, during two rounds of on-site visits, once before and once one year into the intervention. Data analysis is based on grounded theory. Given the challenge of different languages and cultures, we enlist the help of local interpreters, allot two days for site visits, and perform triangulation across multiple data sources. Qualitative measures of implementation success will consider the longitudinal interaction between the initiative and the institutional context. 
Quantitative outcomes on catheter-related bloodstream infections and performance indicators from another work package of the consortium will produce a final mixed-methods report. Conclusion A mixed-methods study of this scale with longitudinal follow-up is unique in the field of infection control. It highlights the ‘Why’ and ‘How’ of best practice implementation, revealing key factors that determine success of a uniform intervention in the context of several varying cultural, economic, political, and medical systems across Europe. These new insights will guide future implementation of more tailored and hence more successful infection control programs. Trial registration Trial number: PROHIBIT-241928 (FP7 reference number) PMID:23421909

  12. PRACTICAL EXPERIENCE IN ANALYSIS OF ORGANIC COMPOUNDS IN AMBIENTAIR USING CANISTERS AND SORBENTS

    EPA Science Inventory

    Generation of accurate ambient air VOC pollutant measurement data as a base for regulatory decisions is critical. Numerous methods and procedures for sampling and analysis are available from a variety of sources. Air methods available through the Environmental Protection Agency are con...

  13. Structural Analysis Using Computer Based Methods

    NASA Technical Reports Server (NTRS)

    Dietz, Matthew R.

    2013-01-01

    The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. This data is also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under the given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allowing for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located, design changes were recommended to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety, the designs were released for fabrication.

  14. Dietary Diversity and Meal Frequency Practices among Infant and Young Children Aged 6–23 Months in Ethiopia: A Secondary Analysis of Ethiopian Demographic and Health Survey 2011

    PubMed Central

    Aemro, Melkam; Mesele, Molla; Birhanu, Zelalem; Atenafu, Azeb

    2013-01-01

    Background. Appropriate complementary feeding practice is essential for the growth and development of children. This study aimed to assess the dietary diversity and meal frequency practices of infants and young children in Ethiopia. Methods. Data collected in the Ethiopian Demographic and Health Survey (EDHS) from December 2010 to June 2011 were used for this study. The data were extracted, arranged, recoded, and analyzed using SPSS version 17. A total of 2836 children aged 6–23 months were used for the final analysis. Both bivariate and multivariate analyses were done to identify predictors of feeding practices. Result. The proportions of children with an adequate dietary diversity score and an adequate meal frequency were 10.8% and 44.7%, respectively. Children born into the richest households showed better dietary diversity scores (OR = 0.256). The number of children under five years of age in the household was an important predictor of dietary diversity (OR = 0.690). Mothers who had exposure to media were more likely to give adequate meal frequency to their children (OR = 0.707). Conclusion. Dietary diversity and meal frequency practices were inadequate in Ethiopia. Wealth quintile, exposure to media, and number of children affected feeding practices. Improving economic status, a habit of eating together, and exposure to media are important to improving infant feeding practices in Ethiopia. PMID:24455218
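
    The dietary diversity indicator behind these figures is typically computed by counting food groups consumed in the previous day; the WHO infant and young child feeding guidance in use at the time counted seven groups, with four or more considered adequate. A sketch, with group names paraphrased rather than quoted from the survey:

```python
FOOD_GROUPS = (
    "grains_roots_tubers", "legumes_nuts", "dairy", "flesh_foods",
    "eggs", "vitamin_a_rich_fruits_veg", "other_fruits_veg",
)

def dietary_diversity(consumed):
    """Return (score, adequate): groups eaten out of 7, adequate if >= 4."""
    score = sum(1 for g in FOOD_GROUPS if g in consumed)
    return score, score >= 4

score, adequate = dietary_diversity({"grains_roots_tubers", "dairy", "eggs"})
```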

  15. Methods for the survey and genetic analysis of populations

    DOEpatents

    Ashby, Matthew

    2003-09-02

    The present invention relates to methods for performing surveys of the genetic diversity of a population. The invention also relates to methods for performing genetic analyses of a population. The invention further relates to methods for the creation of databases comprising the survey information and the databases created by these methods. The invention also relates to methods for analyzing the information to correlate the presence of nucleic acid markers with desired parameters in a sample. These methods have application in the fields of geochemical exploration, agriculture, bioremediation, environmental analysis, clinical microbiology, forensic science and medicine.

  16. Improved modelling method of Pogo analysis and simulation for liquid rockets

    NASA Astrophysics Data System (ADS)

    Wang, Qingwei; Tan, Shujun; Wu, Zhigang; Yang, Yunfei; Yu, Ziwen

    2015-02-01

    An improved modelling method for Pogo analysis and simulation is presented, which derives differential equations instead of the differential-algebraic equations produced by Rubin's method. First, a classification mechanism based on the description of independent weight-displacements is proposed, and nine types of independent elements are derived by division and combination of the basic physical elements. The differential equations of the propulsion system are then established using assembly conditions under which the perturbational pressures on the nodes are equal to each other. The differential equations of the Pogo analysis model are then obtained by coupling the structural vibration equations with the propulsion system equations. Since the improved Pogo analysis model does not contain algebraic equations, it can be used directly for time-domain simulation and frequency-domain analysis. Moreover, the dimensions of the improved Pogo analysis model are reduced to almost half of those of Rubin's method due to the elimination of the perturbational pressure variables, which significantly improves numerical efficiency and stability. Finally, analysis and simulation results for a certain type of CZ rocket of China confirm the correctness and practicability of the proposed modelling method.

  17. Implementing a Virtual Community of Practice for Family Physician Training: A Mixed-Methods Case Study

    PubMed Central

    Jones, Sandra C; Caton, Tim; Iverson, Don; Bennett, Sue; Robinson, Laura

    2014-01-01

    Background General practitioner (GP) training in Australia can be professionally isolating, with trainees spread across large geographic areas, leading to problems with rural workforce retention. Virtual communities of practice (VCoPs) may provide a way of improving knowledge sharing and thus reducing professional isolation. Objective The goal of our study was to review the usefulness of a 7-step framework for implementing a VCoP for GP training and then to evaluate the usefulness of the resulting VCoP in facilitating knowledge sharing and reducing professional isolation. Methods The case was set in an Australian general practice training region involving 55 first-term trainees (GPT1s), from January to July 2012. ConnectGPR was a secure, online community site that included standard community options such as discussion forums, blogs, newsletter broadcasts, webchats, and photo sharing. A mixed-methods case study methodology was used. Results are presented and interpreted for each step of the VCoP 7-step framework and then in terms of the outcomes of knowledge sharing and overcoming isolation. Results Step 1, Facilitation: Regular, personal facilitation by a group of GP trainers with a co-ordinating facilitator was an important factor in the success of ConnectGPR. Step 2, Champion and Support: Leadership and stakeholder engagement were vital. Further benefits are possible if the site is recognized as contributing to training time. Step 3, Clear Goals: Clear goals of facilitating knowledge sharing and improving connectedness helped to keep the site discussions focused. Step 4, A Broad Church: The ConnectGPR community was too narrow, focusing only on first-term trainees (GPT1s). Ideally there should be more involvement of senior trainees, trainers, and specialists. Step 5, A Supportive Environment: Facilitators maintained community standards and encouraged participation. 
Step 6, Measurement Benchmarking and Feedback: Site activity was primarily driven by centrally generated newsletter feedback. Viewing comments by other participants helped users benchmark their own knowledge, particularly around applying guidelines. Step 7, Technology and Community: All the community tools were useful, but chat was limited and users suggested webinars in future. A larger user base and more training may also be helpful. Time is a common barrier. Trust can be built online, which may have benefit for trainees that cannot attend face-to-face workshops. Knowledge sharing and isolation outcomes: 28/34 (82%) of the eligible GPT1s enrolled on ConnectGPR. Trainees shared knowledge through online chat, forums, and shared photos. In terms of knowledge needs, GPT1s rated their need for cardiovascular knowledge more highly than supervisors. Isolation was a common theme among interview respondents, and ConnectGPR users felt more supported in their general practice (13/14, 92.9%). Conclusions The 7-step framework for implementation of an online community was useful. Overcoming isolation and improving connectedness through an online knowledge sharing community shows promise in GP training. Time and technology are barriers that may be overcome by training, technology, and valuable content. In a VCoP, trust can be built online. This has implications for course delivery, particularly in regional areas. VCoPs may also have a specific role assisting overseas trained doctors to interpret their medical knowledge in a new context. PMID:24622292

  18. Apparatus and method for spectroscopic analysis of scattering media

    DOEpatents

    Strobl, Karlheinz (Los Angeles, CA); Bigio, Irving J. (Los Alamos, NM); Loree, Thomas R. (Santa Fe, NM)

    1994-01-01

    Apparatus and method for spectroscopic analysis of scattering media. Subtle differences in materials have been found to be detectable from plots of the intensity of collected emitted and scattered light, as a function of its wavelength, versus the wavelength of the excitation light.

  19. Analysis methods for meso- and macroporous silicon etching baths

    NASA Astrophysics Data System (ADS)

    Nehmann, Julia B.; Kajari-Schröder, Sarah; Bahnemann, Detlef W.

    2012-07-01

    Analysis methods for electrochemical etching baths consisting of various concentrations of hydrofluoric acid (HF) and an additional organic surface wetting agent are presented. These electrolytes are used for the formation of meso- and macroporous silicon. Monitoring the etching bath composition requires at least one method each for the determination of the HF concentration and the organic content of the bath. However, it is a precondition that the analysis equipment withstands the aggressive HF. Titration and a fluoride ion-selective electrode are used for the determination of the HF and a cuvette test method for the analysis of the organic content, respectively. The most suitable analysis method is identified depending on the components in the electrolyte with the focus on capability of resistance against the aggressive HF.

  20. Comparison of spectral analysis methods for characterizing brain oscillations

    PubMed Central

    van Vugt, Marieke K.; Sederberg, Per B.; Kahana, Michael J.

    2007-01-01

Spectral analysis methods are now routinely used in electrophysiological studies of human and animal cognition. Although a wide variety of spectral methods has been used, the ways in which these methods differ are not generally understood. Here we use simulation methods to characterize the similarities and differences between three spectral analysis methods: wavelets, multitapers and Pepisode. Pepisode is a novel method that quantifies the fraction of time that oscillations exceed amplitude and duration thresholds. We show that wavelets and Pepisode used side-by-side help to disentangle the length and amplitude of a signal. Pepisode is especially sensitive to fluctuations around its thresholds, puts frequencies on a more equal footing, and is sensitive to long but low-amplitude signals. In contrast, multitaper methods are less sensitive to weak signals, but are very frequency-specific. If frequency-specificity is not essential, then wavelets and Pepisode are recommended. PMID:17292478
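The Pepisode measure described in this abstract is straightforward to prototype: estimate an amplitude envelope at the frequency of interest, then count only supra-threshold runs that also satisfy the duration criterion. A minimal pure-Python sketch follows; the kernel parameters, thresholds, and toy signal are illustrative assumptions, not the paper's settings.

```python
import math
import cmath

def morlet_amplitude(signal, fs, freq, n_cycles=6):
    """Amplitude envelope at `freq` via correlation with a complex
    Gaussian-windowed (Morlet-like) kernel. Pure-Python sketch."""
    sigma = n_cycles / (2 * math.pi * freq)        # Gaussian width in seconds
    half = int(3 * sigma * fs)                     # kernel half-length in samples
    kernel = [cmath.exp(2j * math.pi * freq * t) * math.exp(-t * t / (2 * sigma ** 2))
              for t in (k / fs for k in range(-half, half + 1))]
    norm = sum(abs(k) for k in kernel)             # unit-gain normalization
    kernel = [k / norm for k in kernel]
    env = []
    for i in range(len(signal)):
        acc = 0j
        for j, k in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < len(signal):
                acc += signal[idx] * k
        env.append(abs(acc))
    return env

def pepisode(env, amp_thresh, min_samples):
    """Fraction of samples inside supra-threshold runs that are at
    least `min_samples` long (the duration criterion)."""
    counted = run = 0
    for a in env + [0.0]:                          # sentinel flushes the last run
        if a > amp_thresh:
            run += 1
        else:
            if run >= min_samples:
                counted += run
            run = 0
    return counted / len(env)

# Toy trace: a 2 s burst of 8 Hz oscillation in an otherwise flat signal.
fs = 200
sig = [math.sin(2 * math.pi * 8 * t / fs) if 200 <= t < 600 else 0.0
       for t in range(1000)]
env = morlet_amplitude(sig, fs, 8.0)
# Duration criterion of one second (fs samples); only the burst counts.
print(pepisode(env, amp_thresh=0.25, min_samples=fs))
```

Because the duration criterion discards short supra-threshold blips, this measure behaves differently from a plain wavelet power average, which is the side-by-side contrast the abstract describes.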

  1. Strategies and Practices in Off-Label Marketing of Pharmaceuticals: A Retrospective Analysis of Whistleblower Complaints

    PubMed Central

    Kesselheim, Aaron S.; Mello, Michelle M.; Studdert, David M.

    2011-01-01

    Background Despite regulatory restrictions, off-label marketing of pharmaceutical products has been common in the US. However, the scope of off-label marketing remains poorly characterized. We developed a typology for the strategies and practices that constitute off-label marketing. Methods and Findings We obtained unsealed whistleblower complaints against pharmaceutical companies filed in US federal fraud cases that contained allegations of off-label marketing (January 1996–October 2010) and conducted structured reviews of them. We coded and analyzed the strategic goals of each off-label marketing scheme and the practices used to achieve those goals, as reported by the whistleblowers. We identified 41 complaints arising from 18 unique cases for our analytic sample (leading to US$7.9 billion in recoveries). The off-label marketing schemes described in the complaints had three non–mutually exclusive goals: expansions to unapproved diseases (35/41, 85%), unapproved disease subtypes (22/41, 54%), and unapproved drug doses (14/41, 34%). Manufacturers were alleged to have pursued these goals using four non–mutually exclusive types of marketing practices: prescriber-related (41/41, 100%), business-related (37/41, 90%), payer-related (23/41, 56%), and consumer-related (18/41, 44%). Prescriber-related practices, the centerpiece of company strategies, included self-serving presentations of the literature (31/41, 76%), free samples (8/41, 20%), direct financial incentives to physicians (35/41, 85%), and teaching (22/41, 54%) and research activities (8/41, 20%). Conclusions Off-label marketing practices appear to extend to many areas of the health care system. Unfortunately, the most common alleged off-label marketing practices also appear to be the most difficult to control through external regulatory approaches. Please see later in the article for the Editors' Summary PMID:21483716

  2. Passive sampling methods for contaminated sediments: Practical guidance for selection, calibration, and implementation

    PubMed Central

    Ghosh, Upal; Driscoll, Susan Kane; Burgess, Robert M; Jonker, Michiel To; Reible, Danny; Gobas, Frank; Choi, Yongju; Apitz, Sabine E; Maruya, Keith A; Gala, William R; Mortimer, Munro; Beegan, Chris

    2014-01-01

    This article provides practical guidance on the use of passive sampling methods (PSMs) that target the freely dissolved concentration (Cfree) for improved exposure assessment of hydrophobic organic chemicals in sediments. Primary considerations for selecting a PSM for a specific application include clear delineation of measurement goals for Cfree, whether laboratory-based “ex situ” and/or field-based “in situ” application is desired, and ultimately which PSM is best-suited to fulfill the measurement objectives. Guidelines for proper calibration and validation of PSMs, including use of provisional values for polymer–water partition coefficients, determination of equilibrium status, and confirmation of nondepletive measurement conditions are defined. A hypothetical example is described to illustrate how the measurement of Cfree afforded by PSMs reduces uncertainty in assessing narcotic toxicity for sediments contaminated with polycyclic aromatic hydrocarbons. The article concludes with a discussion of future research that will improve the quality and robustness of Cfree measurements using PSMs, providing a sound scientific basis to support risk assessment and contaminated sediment management decisions. Integr Environ Assess Manag 2014;10:210–223. © 2014 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of SETAC. PMID:24288273
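At equilibrium, Cfree follows directly from the measured concentration in the polymer and the polymer-water partition coefficient, Cfree = Cp / Kpw. A minimal sketch is below; the sampler material, analyte, and log Kpw value are hypothetical illustrations, not provisional values from the guidance.

```python
def cfree_from_sampler(c_polymer_ug_per_kg, log_kpw):
    """Freely dissolved concentration (ug/L) from the equilibrium
    polymer-phase concentration (ug/kg) and the polymer-water
    partition coefficient Kpw (L/kg): Cfree = Cp / Kpw."""
    return c_polymer_ug_per_kg / 10 ** log_kpw

# Hypothetical numbers: a PDMS sampler at equilibrium holding
# 5000 ug/kg of pyrene, with an assumed log Kpw of 4.0.
print(cfree_from_sampler(5000.0, 4.0))  # 0.5 ug/L
```

In practice the guidance also requires confirming equilibrium status and nondepletive conditions before applying this relation; the arithmetic itself is this simple.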

  3. Reflections on Practical Approaches to Involving Children and Young People in the Data Analysis Process

    ERIC Educational Resources Information Center

    Coad, Jane; Evans, Ruth

    2008-01-01

    This article reflects on key methodological issues emerging from children and young people's involvement in data analysis processes. We outline a pragmatic framework illustrating different approaches to engaging children, using two case studies of children's experiences of participating in data analysis. The article highlights methods of…

  4. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.

  5. Job load and hazard analysis: a method for the analysis of workplace conditions for occupational health care.

    PubMed Central

    Mattila, M K

    1985-01-01

    One requirement for successful occupational health care is reliable information on occupational hazards. The aim of this study was to develop a simple, standardised method for workplace investigations for use in occupational health care. The theoretical framework of the method comprises the stress-strain model, the hazard-danger model, and risk behaviour theory. The new method, termed job load and hazard analysis, includes four stages: identification of hazards, their evaluation, conclusions and proposals, and follow up. Different methods are available for hazard identification. The identification starts with a rough analysis of five factors, chemical hazards, physical hazards, physical load, mental stress, and accident risk. Hazards and stress factors are assessed with an ordinal scale. Specialised methods are used if all hazards cannot otherwise be identified. The analytical procedure comprises: detection of hazards through observations and interviews at the workplace and with a questionnaire; assessment of findings as teamwork; and evaluation of the results of these assessments to yield conclusions and proposals made by occupational health care personnel. A data processing system has been developed for data storage and future use. The method has functioned in practice, improving the contents of the occupational health care programme and generating preventive measures. The method offers many new possibilities for controlling occupational hazards and studying relations between working conditions and workers' health. PMID:4041383

  6. Analysis of Biomass Sugars Using a Novel HPLC Method

    SciTech Connect

    Agblevor, F. A.; Hames, B. R.; Schell, D.; Chum, H. L.

    2007-01-01

    The precise quantitative analysis of biomass sugars is a very important step in the conversion of biomass feedstocks to fuels and chemicals. However, the most accurate method of biomass sugar analysis is based on the gas chromatography analysis of derivatized sugars either as alditol acetates or trimethylsilanes. The derivatization method is time consuming but the alternative high-performance liquid chromatography (HPLC) method cannot resolve most sugars found in biomass hydrolysates. We have demonstrated for the first time that by careful manipulation of the HPLC mobile phase, biomass monomeric sugars (arabinose, xylose, fructose, glucose, mannose, and galactose) can be analyzed quantitatively and there is excellent baseline resolution of all the sugars. This method was demonstrated for standard sugars, pretreated corn stover liquid and solid fractions. Our method can also be used to analyze dimeric sugars (cellobiose and sucrose).

  7. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimated microbial count, while the effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was highly significant. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods. PMID:26456251
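Combining independent relative uncertainty components mathematically typically means a root-sum-of-squares, as in the GUM approach. A minimal sketch, with hypothetical component values rather than the study's estimates:

```python
import math

def combined_relative_uncertainty(components):
    """Root-sum-of-squares combination of independent relative
    uncertainty components (GUM-style; correlations neglected)."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical component values (%) for a plate-count assay:
# microorganism type, product matrix, reading/interpretation error.
components = [20.0, 15.0, 12.0]
print(combined_relative_uncertainty(components))  # below the 35% benchmark
```

With these illustrative inputs the combined value stays under the 35% figure the abstract cites for traditional plate-count methods.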

  8. Independent and complementary methods for large-scale structural analysis

    E-print Network

    Yuan, Guo-Cheng "GC"

Independent and complementary methods for large-scale structural analysis of mammalian chromatin. Richter, Daniel G. Peterson, Oliver J. Rando, William S. Noble, and Robert E. Kingston. … large-scale analysis. We validated these assays using the known positions of nucleosomes on the mouse

  9. Scalable Kernel Methods and Algorithms for General Sequence Analysis

    ERIC Educational Resources Information Center

    Kuksa, Pavel

    2011-01-01

    Analysis of large-scale sequential data has become an important task in machine learning and pattern recognition, inspired in part by numerous scientific and technological applications such as the document and text classification or the analysis of biological sequences. However, current computational methods for sequence comparison still lack…

  10. Analysis Methods Supplement for: Classification and Diagnostic Prediction of Cancers

    E-print Network

    Ringnér, Markus

Analysis Methods Supplement for: Classification and Diagnostic Prediction of Cancers using Gene … the image analysis failed. In Fig. 1 the number of genes each sample removes is shown. We used the natural … using PCA on relatively few samples is that components might be singled out due to strong noise

  11. EVALUATION OF MODAL COMBINATION METHODS FOR SEISMIC RESPONSE SPECTRUM ANALYSIS.

    SciTech Connect

    MORANTE,R.

    1999-08-15

    Regulatory Guide 1.92 ''Combining Modal Responses and Spatial Components in Seismic Response Analysis'' was last revised in 1976. The objective of this project was to re-evaluate the current regulatory guidance for combining modal responses in response spectrum analysis, evaluate recent technical developments, and recommend revisions to the regulatory guidance. This paper describes the qualitative evaluation of modal response combination methods.

  12. Evaluation of Modal Combination Methods for Seismic Response Spectrum Analysis

    SciTech Connect

    Chokshi, N.; Kenneally, R.; Morante, R.; Norris, W.; Wang, Y.

    1999-03-23

    Regulatory Guide 1.92 ''Combining Modal Responses and Spatial Components in Seismic Response Analysis'' was last revised in 1976. The objective of this project was to re-evaluate the current regulatory guidance for combining modal responses in response spectrum analysis, evaluate recent technical developments, and recommend revisions to the regulatory guidance. This paper describes the qualitative evaluation of modal response combination methods.

  13. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... of Analysis of the Association of Official Analytical Chemists,” which are incorporated by reference in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Copies may be obtained from the AOAC... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Methods of analysis. 163.5 Section 163.5 Food...

  14. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... of Analysis of the Association of Official Analytical Chemists,” which are incorporated by reference in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Copies may be obtained from the AOAC... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Methods of analysis. 163.5 Section 163.5 Food...

  15. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... of Analysis of the Association of Official Analytical Chemists,” which are incorporated by reference in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Copies may be obtained from the AOAC... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Methods of analysis. 163.5 Section 163.5 Food...

  16. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... of Analysis of the Association of Official Analytical Chemists,” which are incorporated by reference in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Copies may be obtained from the AOAC... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Methods of analysis. 163.5 Section 163.5 Food...

  17. Newton-Krylov Methods in Power Flow and Contingency Analysis

    E-print Network

    Vuik, Kees

A power system is a system that provides … in both operational control and planning of power systems. Essential tools are power flow (or load flow) studies and contingency analysis. In power flow studies, the bus voltages in the power system
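In power flow studies the bus voltages are obtained by solving nonlinear mismatch equations with Newton-type iterations; Newton-Krylov variants replace the direct linear solve in each Newton step with a Krylov method such as GMRES. A minimal sketch of the plain Newton step on a toy two-bus problem follows; all parameter values are illustrative, and a real solver works with the full multi-bus Jacobian rather than a scalar derivative.

```python
import math

def newton(f, df, x0, tol=1e-10, max_iter=50):
    """Scalar Newton iteration: x <- x - f(x)/df(x) until the step is small.
    In Newton-Krylov methods, the division by df(x) generalizes to a
    Krylov (e.g. GMRES) solve of the Jacobian system."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

# Toy two-bus power flow: find the voltage angle theta at which a lossless
# line delivers P = 0.8 p.u., given V1 = V2 = 1.0 p.u. and reactance
# X = 0.5 p.u., using the mismatch  f(theta) = (V1*V2/X)*sin(theta) - P.
P, X = 0.8, 0.5
f = lambda th: math.sin(th) / X - P
df = lambda th: math.cos(th) / X
theta = newton(f, df, 0.1)
print(theta)
```

The flat start (a small initial angle) mirrors common power flow practice; the quadratic convergence of Newton's method is what makes these studies fast enough for contingency screening.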

  18. AEROACOUSTIC ANALYSIS USING A HYBRID FINITE ELEMENT METHOD

    E-print Network

    Giles, Mike

… generated by the high-bypass turbofan engines in widespread use in modern civil aircraft is an increasing … a novel 3D analysis of tone noise radiated from turbofan inlets. This analysis combines a standard finite … considerable attention from research. The majority of methods dedicated to high-bypass engine acoustics follow

  19. IMPROVEMENT AND EVALUATION OF METHODS FOR SULFATE ANALYSIS

    EPA Science Inventory

    A simpler and faster procedure for the manual turbidimetric analysis of sulfate has been developed and evaluated. This method as well as a turbidimetric procedure using SulfaVer(R), automated methylthymol blue (MTB) procedures for analysis in the 0-100 micrograms/ml and 0-10 micr...

  20. Latent Class Analysis: A Method for Capturing Heterogeneity

    ERIC Educational Resources Information Center

    Scotto Rosato, Nancy; Baer, Judith C.

    2012-01-01

    Social work researchers often use variable-centered approaches such as regression and factor analysis. However, these methods do not capture important aspects of relationships that are often imbedded in the heterogeneity of samples. Latent class analysis (LCA) is one of several person-centered approaches that can capture heterogeneity within and…