
A Practical Method of Policy Analysis by Estimating Effect Size  

ERIC Educational Resources Information Center

The previous articles on class size and other productivity research paint a complex and confusing picture of the relationship between policy variables and student achievement. Missing is a conceptual scheme capable of combining the seemingly unrelated research and dissimilar estimates of effect size into a unified structure for policy analysis and…

Phelps, James L.



A Topography Analysis Incorporated Optimization Method for the Selection and Placement of Best Management Practices  

PubMed Central

Best Management Practices (BMPs) are one of the most effective methods to control nonpoint source (NPS) pollution at a watershed scale. In this paper, a topography analysis incorporated optimization method (TAIOM) is proposed, which integrates topography analysis with cost-effective optimization. The surface status, slope and the type of land use were evaluated as inputs for the optimization engine. A genetic algorithm program was coded to perform the final optimization. The TAIOM was validated in conjunction with the Soil and Water Assessment Tool (SWAT) in the Yulin watershed in Southwestern China. The results showed that the TAIOM was more cost-effective than traditional optimization methods. The distribution of selected BMPs throughout landscapes comprising relatively flat plains and gentle slopes suggests the need for a more operationally effective scheme, such as the TAIOM, to determine the practicability of BMPs before widespread adoption. The TAIOM developed in this study can easily be extended to other watersheds to help decision makers control NPS pollution. PMID:23349917
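The abstract does not reproduce the TAIOM implementation; as a rough, hypothetical illustration of the genetic-algorithm step it describes, the sketch below evolves a BMP assignment over a handful of sites under a budget, with each BMP's effectiveness scaled by a per-site slope factor as a stand-in for the topography analysis. All BMP names, costs, scores, and site factors are invented, not from the paper.

```python
import random

# Hypothetical BMP candidates: (name, cost, pollutant-reduction score).
BMPS = [("buffer_strip", 4, 9), ("grass_waterway", 3, 6),
        ("no_till", 2, 4), ("detention_pond", 6, 12), ("none", 0, 0)]
SITES = [0.6, 1.0, 0.8, 1.0, 0.5, 0.9]   # slope-based effectiveness factors
BUDGET = 15

def fitness(genome):
    """Total reduction, heavily penalized if over budget."""
    cost = sum(BMPS[g][1] for g in genome)
    reduction = sum(BMPS[g][2] * f for g, f in zip(genome, SITES))
    return reduction - 1000 * max(0, cost - BUDGET)

def evolve(pop_size=60, gens=80, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randrange(len(BMPS)) for _ in SITES] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(SITES))  # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:              # mutation
                child[rng.randrange(len(SITES))] = rng.randrange(len(BMPS))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
cost = sum(BMPS[g][1] for g in best)
```

The budget penalty makes any feasible assignment outrank any infeasible one, so the search settles on a within-budget BMP layout.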

Shen, Zhenyao; Chen, Lei; Xu, Liang



A practical method for incorporating Real Options analysis into US federal benefit-cost analysis procedures  

E-print Network

This research identifies how Real Options (RO) thinking might acceptably and effectively complement the current mandates for Benefit-Cost Analysis (BCA) defined by the Office of Management and Budget (OMB) in Circular A-94. ...

Rivey, Darren



Practical dust control methods  

SciTech Connect

At a remediation project in Granite City, Illinois, the presence of lead was detected in and below the surface soil in the form of construction debris and contaminated soil. Contamination was also present in the form of airborne dust as well as surface contamination. The contaminated dust was present on equipment, tools, and the clothing of laborers working inside of the exclusion zone. OHM established an exclusion zone to limit access to the area and required that protective equipment be worn inside of the exclusion zone to allow for more efficient decontamination. Wetting methods were used as well as a foam material which was used to cover larger piles. Formal decontamination methods were implemented to limit the spread of contamination from the exclusion zone. These methods included specific procedures to remove protective equipment, water washing for equipment, and an inspection before leaving the zone. To the extent practical, transportation equipment was staged at the edge of the exclusion zone rather than entering the zone. Plastic tarpaulins were used to collect contaminated debris near the edge of the zone.

Thomas, B.R.; Blassingame, S.R. [OHM Remediation Services Corp., Findlay, OH (United States)]



A Practical Guide to Practice Analysis for Credentialing Examinations.  

ERIC Educational Resources Information Center

Offers recommendations for the conduct of practice analysis (i.e., job analysis) concerning these issues: (1) selecting a method of practice analysis; (2) developing rating scales; (3) determining the content of test plans; (4) using multi-variate procedures for structuring test plans; and (5) determining topic weights for test plans. (SLD)

Raymond, Mark R.



Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory: Determination of Trihalomethane Formation Potential, Method Validation, and Quality-Control Practices  

USGS Publications Warehouse

An analytical method for the determination of the trihalomethane formation potential of water samples has been developed. The trihalomethane formation potential is measured by dosing samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and free-chlorine residual, and then analyzing the resulting trihalomethanes by purge and trap/gas chromatography with an electron capture detector. Detailed explanations of the method and quality-control practices are provided. Method validation experiments showed that the trihalomethane formation potential varies as a function of time between sample collection and analysis, free-chlorine residual concentration, method of sample dilution, and the concentration of bromide in the sample.
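As a minimal illustration of what a formation-potential number represents (not the USGS procedure itself), the snippet below sums the increase in the four trihalomethane species between the un-dosed sample and the chlorinated, incubated one. All species concentrations are hypothetical.

```python
# Hypothetical concentrations (ug/L) of the four trihalomethane species
# after the specified chlorination/incubation and in the un-dosed sample.
species = ["CHCl3", "CHBrCl2", "CHBr2Cl", "CHBr3"]
after_dose = {"CHCl3": 210.0, "CHBrCl2": 45.0, "CHBr2Cl": 12.0, "CHBr3": 1.5}
initial = {"CHCl3": 4.0, "CHBrCl2": 0.5, "CHBr2Cl": 0.0, "CHBr3": 0.0}

def thm_formation_potential(after, before):
    """Total THMs formed = sum over species of (final - initial)."""
    return sum(after[s] - before[s] for s in species)

thmfp = thm_formation_potential(after_dose, initial)  # ug/L formed
```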

Crepeau, Kathryn L.; Fram, Miranda S.; Bush, Noel



Animal Disease Import Risk Analysis - a Review of Current Methods and Practice.  


The application of risk analysis to the spread of disease with international trade in animals and their products, that is, import risk analysis (IRA), has been largely driven by the Sanitary and Phytosanitary (SPS) agreement of the World Trade Organization (WTO). The degree to which the IRA standard established by the World Organization for Animal Health (OIE), and associated guidance, meets the needs of the SPS agreement is discussed. The use of scenario trees is the core modelling approach used to represent the steps necessary for the hazard to occur. There is scope to elaborate scenario trees for commodity IRA so that the quantity of hazard at each step is assessed, which is crucial to the likelihood of establishment. The dependence between exposure and establishment suggests that they should fall within the same subcomponent. IRA undertaken for trade reasons must include an assessment of consequences to meet SPS criteria, but guidance is sparse. The integration of epidemiological and economic modelling may open a path for better methods. Matrices have been used in qualitative IRA to combine estimates of entry and exposure, and consequences with likelihood, but this approach has flaws and better methods are needed. OIE IRA standards and guidance indicate that the volume of trade should be taken into account, but offer no detail. Some published qualitative IRAs have assumed current levels and patterns of trade without specifying the volume of trade, which constrains the use of IRA to determine mitigation measures (to reduce risk to an acceptable level) and whether the principle of equivalence, fundamental to the SPS agreement, has been observed. It is questionable whether qualitative IRA can meet all the criteria set out in the SPS agreement. Nevertheless, scope exists to elaborate the current standards and guidance, so they better serve the principle of science-based decision-making. PMID:24237667

Peeler, E J; Reese, R A; Thrush, M A



APA's Learning Objectives for Research Methods and Statistics in Practice: A Multimethod Analysis  

ERIC Educational Resources Information Center

Research methods and statistics courses constitute a core undergraduate psychology requirement. We analyzed course syllabi and faculty self-reported coverage of both research methods and statistics course learning objectives to assess the concordance with APA's learning objectives (American Psychological Association, 2007). We obtained a sample of…

Tomcho, Thomas J.; Rice, Diana; Foels, Rob; Folmsbee, Leah; Vladescu, Jason; Lissman, Rachel; Matulewicz, Ryan; Bopp, Kara



Alternative method for wave propagation analysis within bounded linear media: conceptual and practical implications  

E-print Network

This paper uses an alternative approach to study monochromatic plane wave propagation within dielectric and conductor linear media with plane-parallel faces. This approach introduces the time-averaged Poynting vector modulus as the field variable. The conceptual implication of this formalism is that the nonequivalence between the time-averaged Poynting vector and the squared-field amplitude modulus is naturally manifested as a consequence of interface effects. Two practical implications are also considered: first, the exact transmittance is compared with that given by Beer's law, commonly employed in experiments; the difference between them can be significant for certain material parameter values. Second, when the exact reflectance is studied for negative-permittivity slabs, it is shown that the high reflectance can be diminished if a small amount of absorption is present.

Alberto Lencina; Beatriz Ruiz; Pablo Vaveliuk



Development and application to clinical practice of a validated HPLC method for the analysis of β-glucocerebrosidase in Gaucher disease.  


The main objective of our study is to develop a simple, fast and reliable method for measuring β-glucocerebrosidase activity in the leukocytes of Gaucher patients in clinical practice. This measurement may be a useful marker to drive dose selection and early clinical decision making for enzyme replacement therapy. We measure the enzyme activity by high-performance liquid chromatography with ultraviolet detection and 4-nitrophenyl-β-d-glucopyranoside as substrate. A cohort of eight Gaucher patients treated with enzyme replacement therapy and ten healthy controls were tested; the median enzyme activity was 20.57 mU/ml (interquartile range 19.92-21.53 mU/ml) in patients and the mean was 24.73 mU/ml (24.12-25.34 mU/ml) in the reference group, which allowed the establishment of the normal range of β-glucocerebrosidase activity. The proposed method for measuring leukocyte glucocerebrosidase activity is fast, easy to use, inexpensive and reliable. Furthermore, significant differences between the two populations were observed (p=0.008). This suggests that discerning between patients and healthy individuals and providing an approach to enzyme dosage optimization is feasible. This method could be considered a decision support tool for clinical monitoring. Our study is a first approach to in-depth analysis of enzyme replacement therapy and optimization of dosing therapies. PMID:24447963
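The reported p = 0.008 for eight patients versus ten controls is consistent with a rank-based comparison; as a stdlib-only sketch (not the authors' analysis), a Mann-Whitney U statistic on hypothetical activity values near the reported medians:

```python
# Hypothetical enzyme-activity values (mU/ml), invented to lie near the
# medians quoted in the abstract; not the study's data.
patients = [19.9, 20.1, 20.3, 20.6, 20.7, 20.8, 21.1, 21.4]
controls = [24.1, 24.3, 24.4, 24.6, 24.7, 24.8, 24.9, 25.0, 25.1, 25.3]

def mann_whitney_u(a, b):
    """U statistic for sample a versus b (ties counted as half; no
    continuity or tie correction)."""
    return (sum(1 for x in a for y in b if x > y)
            + 0.5 * sum(1 for x in a for y in b if x == y))

u = mann_whitney_u(patients, controls)
```

With every patient value below every control value, U takes its extreme value of 0, the situation in which the exact two-sided p-value for these sample sizes is smallest.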

Colomer, E Gras; Gómez, M A Martínez; Alvarez, A González; Martí, M Climente; Moreno, P León; Zarzoso, M Fernández; Jiménez-Torres, N V



A practical method for assessing cadmium levels in soil using the DTPA extraction technique with graphite furnace analysis  

SciTech Connect

Using the DTPA extraction procedure and a graphite furnace atomic absorption spectrophotometer, a practical method for determining soil cadmium levels was developed. Furnace parameters, instrument parameters, solvent dilution factor, and solvent characteristics were determined using experimental field samples and standardized control samples. The DTPA extraction method gave reproducible results and removed approximately 20 to 60% of total soil cadmium. 14 refs., 5 tabs.
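The paper's actual furnace and instrument parameters are not given in this abstract; as generic background, a determination like this typically runs the extract through an external calibration line and applies the dilution factor. A sketch with invented standards and readings:

```python
# Illustrative external calibration for graphite furnace AAS: fit
# absorbance vs. concentration for standards, then back-calculate the
# cadmium concentration of a diluted DTPA extract. All numbers are
# hypothetical, not from the paper.
stds_conc = [0.0, 0.5, 1.0, 2.0, 4.0]            # ug/L Cd standards
stds_abs = [0.002, 0.051, 0.101, 0.199, 0.402]   # measured absorbance

def linfit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

slope, intercept = linfit(stds_conc, stds_abs)

def extract_conc(absorbance, dilution_factor):
    """Concentration in the undiluted extract (ug/L)."""
    return (absorbance - intercept) / slope * dilution_factor

cd = extract_conc(0.150, dilution_factor=5.0)
```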

Bailey, V.L.; Grant, C.A.; Bailey, L.D. [Agriculture and Agri-Food Canada, Manitoba (Canada)], and others



Insight into Evaluation Practice: A Content Analysis of Designs and Methods Used in Evaluation Studies Published in North American Evaluation-Focused Journals  

ERIC Educational Resources Information Center

To describe the recent practice of evaluation, specifically method and design choices, the authors performed a content analysis on 117 evaluation studies published in eight North American evaluation-focused journals for a 3-year period (2004-2006). The authors chose this time span because it follows the scientifically based research (SBR)…

Christie, Christina A.; Fleischer, Dreolin Nesbitt



Good practices for quantitative bias analysis.  


Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? 
We hope that our guide to good practices for conducting and presenting bias analyses will encourage more widespread use of bias analysis to estimate the potential magnitude and direction of biases, as well as the uncertainty in estimates potentially influenced by the biases. PMID:25080530
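As a concrete instance of the simple bias analyses this genre advocates, the sketch below adjusts an observed 2x2 table for nondifferential exposure misclassification using assumed sensitivity and specificity as the bias parameters. The counts and parameter values are hypothetical, not taken from the paper.

```python
def corrected_cells(exposed, unexposed, se, sp):
    """Back-calculate true exposed/unexposed counts from observed counts,
    given assumed classification sensitivity (se) and specificity (sp)."""
    total = exposed + unexposed
    true_exposed = (exposed - total * (1 - sp)) / (se - (1 - sp))
    return true_exposed, total - true_exposed

def odds_ratio(case_exp, ctrl_exp, case_unexp, ctrl_unexp):
    return (case_exp * ctrl_unexp) / (ctrl_exp * case_unexp)

# Observed (misclassified) table: cases 45 exposed / 55 unexposed;
# controls 25 exposed / 75 unexposed. se and sp are assumed bias parameters.
obs_or = odds_ratio(45, 25, 55, 75)
case_exp, case_unexp = corrected_cells(45, 55, se=0.85, sp=0.95)
ctrl_exp, ctrl_unexp = corrected_cells(25, 75, se=0.85, sp=0.95)
adj_or = odds_ratio(case_exp, ctrl_exp, case_unexp, ctrl_unexp)
```

Here the bias-adjusted odds ratio moves away from the null, the expected direction when nondifferential misclassification has attenuated the observed estimate.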

Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander



Doing Conversation Analysis: A Practical Guide.  

ERIC Educational Resources Information Center

Noting that conversation analysis (CA) has developed into one of the major methods of analyzing speech in the disciplines of communications, linguistics, anthropology and sociology, this book demonstrates in a practical way how to become a conversation analyst. As well as providing an overall introduction to the approach, it focuses on the…

ten Have, Paul


Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: practical guidelines for the crystalline sponge method  

PubMed Central

A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. 
The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high-quality data, and will allow construction of chemically and physically sensible models for guest structural determination. PMID:25537388

Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng; Clardy, Jon






Analysis of release kinetics of ocular therapeutics from drug releasing contact lenses: Best methods and practices to advance the field.  


Several methods have been proposed to achieve an extended and controlled release of ocular therapeutics via contact lenses; however, the experimental conditions used to study the drug release vary greatly and significantly influence the release kinetics. In this paper, we examine variations in the release conditions and their effect on the release of both hydrophilic and hydrophobic drugs (ketotifen fumarate, diclofenac sodium, timolol maleate and dexamethasone) from conventional hydrogel and silicone hydrogel lenses. Drug release was studied under different conditions, varying volume, mixing rates, and temperature. Volume had the biggest effect on the release profile, which ironically is the least consistent variable throughout the literature. When a small volume (2-30 mL) was used with no forced mixing and solvent exchange every 24 h, equilibrium was reached well before each solvent exchange, significantly damping the drug release rate and artificially extending the release duration, leading to false conclusions. Using a large volume (200-400 mL) with a 30 rpm mixing rate and no solvent exchange, the release rate and total mass released were significantly increased. In general, the release performed in small volumes with no forced mixing exhibited cumulative mass release amounts 3-12 times less than the cumulative release amounts in large volumes with mixing. Increases in mixing rate and temperature resulted in relatively small increases of 1.4 and 1.2 times, respectively, in fractional mass released. These results strongly demonstrate the necessity of proper and thorough analysis of release data to assure that equilibrium is not affecting release kinetics. This is paramount for comparison of various controlled drug release methods of therapeutic contact lenses, validation of the potential of lenses as an efficient and effective means of drug delivery, as well as increasing the likelihood of only the most promising methods reaching in vivo studies. 
PMID:24894544
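A toy model (not the authors') of the volume effect reported above: first-order drug transfer from a lens toward partition equilibrium with a well-mixed release medium. In a small volume the system equilibrates at a lower released fraction, truncating the apparent release; in a large sink most of the load comes out. All parameter values are invented.

```python
def released_fraction(volume_ml, k_partition=50.0, lens_ml=0.05,
                      rate=0.5, steps=200, dt=0.5):
    """Euler integration of first-order transfer toward partition
    equilibrium; returns the fraction of a unit drug load released."""
    m_lens, m_sol = 1.0, 0.0              # drug mass, arbitrary units
    for _ in range(steps):
        c_lens = m_lens / lens_ml
        c_sol = m_sol / volume_ml
        flux = rate * (c_lens / k_partition - c_sol)  # toward equilibrium
        dm = flux * dt
        m_lens -= dm
        m_sol += dm
    return m_sol

small = released_fraction(5.0)     # small vial, no solvent exchange
large = released_fraction(300.0)   # large well-mixed sink
```

With these numbers the small-volume run plateaus near two-thirds released while the large-volume run releases nearly everything, mirroring the several-fold differences the study reports.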

Tieppo, Arianna; Boggs, Aarika C; Pourjavad, Payam; Byrne, Mark E



Assessment of management in general practice: validation of a practice visit method.  

PubMed Central

BACKGROUND: Practice management (PM) in general practice is as yet ill-defined; a systematic description of its domain, as well as a valid method to assess it, are necessary for research and assessment. AIM: To develop and validate a method to assess PM of general practitioners (GPs) and practices. METHOD: Relevant and potentially discriminating indicators were selected from a systematic framework of 2410 elements of PM to be used in an assessment method (VIP = visit instrument PM). The method was first tested in a pilot study and, after revision, was evaluated in order to select discriminating indicators and to determine validity of dimensions (factor and reliability analysis, linear regression). RESULTS: One hundred and ten GPs were assessed with the practice visit method using 249 indicators; 208 of these discriminated sufficiently at practice level or at GP level. Factor analysis resulted in 34 dimensions and in a taxonomy of PM. Dimensions and indicators showed marked variation between GPs and practices. Training practices scored higher on five dimensions; single-handed and dispensing practices scored lower on delegated tasks, but higher on accessibility and availability. CONCLUSION: A visit method to assess PM has been developed and its validity studied systematically. The taxonomy and dimensions of PM were in line with other classifications. Selection of a balanced number of useful and relevant indicators was nevertheless difficult. The dimensions could discriminate between groups of GPs and practices, establishing the value of the method for assessment. The VIP method could be an important contribution to the introduction of continuous quality improvement in the profession. PMID:10198481

van den Hombergh, P; Grol, R; van den Hoogen, H J; van den Bosch, W J



Breastfeeding practices: does method of delivery matter?  


The objective of this study was to assess the relationship between method of delivery and breastfeeding. Using data (2005-2006) from the longitudinal Infant Feeding Practices Study II (n = 3,026), we assessed the relationship between delivery method (spontaneous vaginal, induced vaginal, emergency cesarean, and planned cesarean) and breastfeeding: initiation, any breastfeeding at 4 weeks, any breastfeeding at 6 months, and overall duration. We used SAS software to analyze data using multivariable analyses adjusting for several confounders, including selected demographic characteristics and participants' pre-delivery breastfeeding intentions and attitude, and used event-history analysis to estimate breastfeeding duration by delivery method. We found no significant association between delivery method and breastfeeding initiation. In the fully adjusted models examining breastfeeding duration to 4 weeks with the spontaneous vaginal delivery group as the reference, those with induced vaginal deliveries were significantly less likely to breastfeed [adjusted odds ratio (AOR) = 0.53; 95 % CI = 0.38-0.71], and no significant relationship was observed for those who had planned or emergency cesarean deliveries. Again, compared with the spontaneous vaginal delivery group, those with induced vaginal [AOR = 0.60; 95 % CI = 0.47-0.78] and emergency cesarean [AOR = 0.68; 95 % CI = 0.48-0.95] deliveries were significantly less likely to breastfeed at 6 months. Median breastfeeding duration was 45.2 weeks among women with spontaneous vaginal, 38.7 weeks among planned cesarean, 25.8 weeks among induced vaginal and 21.5 weeks among emergency cesarean deliveries. While no significant association was observed between delivery method and breastfeeding initiation, breastfeeding duration varied substantially with method of delivery, perhaps indicating a need for additional support for women with assisted deliveries. PMID:22926268
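The median durations quoted above come from event-history analysis; as an illustration of the underlying idea (not the authors' SAS model), a toy Kaplan-Meier estimate of median duration from right-censored data:

```python
# Hypothetical durations in weeks; True = event observed (weaning),
# False = right-censored. Not the study's data.
data = [(10, True), (18, True), (22, False), (30, True), (38, True),
        (45, True), (52, False), (60, True)]

def km_survival(data):
    """Kaplan-Meier product-limit estimate: [(event time, S(t))]."""
    s, out = 1.0, []
    at_risk = len(data)
    for t, event in sorted(data):
        if event:
            s *= (at_risk - 1) / at_risk
            out.append((t, s))
        at_risk -= 1            # censored subjects leave the risk set too
    return out

def km_median(data):
    """First event time at which estimated survival drops to 0.5 or below."""
    for t, s in km_survival(data):
        if s <= 0.5:
            return t
    return None

median = km_median(data)
```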

Ahluwalia, Indu B; Li, Ruowei; Morrow, Brian




E-print Network

This paper is about model selection for clinical trials data. We present a modest case study to illustrate the methods: we consider survival times (e.g., time to recurrence of depression) from a controlled clinical trial, but the methods we present are applicable to many model selection problems.


Evaluation of agricultural best-management practices in the Conestoga River headwaters, Pennsylvania; methods of data collection and analysis and description of study areas  

USGS Publications Warehouse

The U.S. Geological Survey is conducting a water quality study as part of the nationally implemented Rural Clean Water Program in the headwaters of the Conestoga River, Pennsylvania. The study, which began in 1982, was designed to determine the effect of agricultural best management practices on surface- and groundwater quality. The study was concentrated in four areas within the intensively farmed, carbonate rock terrane located predominantly in Lancaster County, Pennsylvania. These areas were divided into three monitoring components: (1) a Regional study area (188 sq mi); (2) a Small Watershed study area (5.82 sq mi); and (3) two field site study areas, Field-Site 1 (22.1 acres) and Field-Site 2 (47.5 acres). The type of water quality data and the methods of data collection and analysis are presented. The monitoring strategy and description of the study areas are discussed. The locations and descriptions for all data collection locations at the four study areas are provided. (USGS)

Chichester, Douglas C.



The Sherlock Holmes method in clinical practice.  


This article lists the integral elements of the Sherlock Holmes method, which is based on the intelligent collection of information through detailed observation, careful listening and thorough examination. The information thus obtained is analyzed to develop the main and alternative hypotheses, which are shaped during the deductive process until the key leading to the solution is revealed. The Holmes investigative method applied to clinical practice highlights the advisability of having physicians reason through and seek out the causes of the disease with the data obtained from acute observation, a detailed review of the medical history and careful physical examination. PMID:24457141

Sopeña, B



Visionlearning: Research Methods: The Practice of Science  

NSDL National Science Digital Library

This instructional module introduces four types of research methods: experimentation, description, comparison, and modeling. It was developed to help learners understand that the classic definition of the "scientific method" does not capture the dynamic nature of science investigation. As learners explore each methodology, they develop an understanding of why scientists use multiple methods to gather data and develop hypotheses. It is appropriate for introductory physics courses and for teachers seeking content support in research practices. Editor's Note: Secondary students often cling to the notion that scientific research follows a stock, standard "scientific method". They may be unaware of the differences between experimental research, correlative studies, observation, and computer-based modeling research. In this resource, they can glimpse each methodology in the context of a real study done by respected scientists. This resource is part of Visionlearning, an award-winning set of classroom-tested modules for science education.

Carpi, Anthony; Egger, Anne


Practical method for balancing airplane moments  

NASA Technical Reports Server (NTRS)

The present contribution is the sequel to a paper written by Messrs. R. Fuchs, L. Hopf, and H. Hamburger, and proposes to show that the methods therein contained can be practically utilized in computations. Furthermore, the calculations leading up to the diagram of moments for three airplanes, whose performance in war service gave reason for complaint, are analyzed. Finally, it is shown what conclusions can be drawn from the diagram of moments with regard to the defects in these planes and what steps may be taken to remedy them.

Hamburger, H



Systemic accident analysis: examining the gap between research and practice.  


The systems approach is arguably the dominant concept within accident analysis research. Viewing accidents as a result of uncontrolled system interactions, it forms the theoretical basis of various systemic accident analysis (SAA) models and methods. Despite the proposed benefits of SAA, such as an improved description of accident causation, evidence within the scientific literature suggests that these techniques are not being used in practice and that a research-practice gap exists. The aim of this study was to explore the issues stemming from research and practice which could hinder the awareness, adoption and usage of SAA. To achieve this, semi-structured interviews were conducted with 42 safety experts from ten countries and a variety of industries, including rail, aviation and maritime. This study suggests that the research-practice gap should be closed and efforts to bridge the gap should focus on ensuring that systemic methods meet the needs of practitioners and improving the communication of SAA research. PMID:23542136

Underwood, Peter; Waterson, Patrick



An Online Forum As a Qualitative Research Method: Practical Issues  

PubMed Central

Background Despite positive aspects of online forums as a qualitative research method, very little is known about practical issues involved in using online forums for data collection, especially for a qualitative research project. Objectives The purpose of this paper is to describe the practical issues that the researchers encountered in implementing an online forum as a qualitative component of a larger study on cancer pain experience. Method Throughout the study process, the research staff recorded issues, ranging from minor technical problems to serious ethical dilemmas, as they arose and wrote memos about them. The memos and written records of discussions were reviewed and analyzed using the content analysis suggested by Weber. Results Two practical issues related to credibility were identified: a high response and retention rate and automatic transcripts. An issue related to dependability was the participants’ easy forgetfulness. The issues related to confirmability were difficulties in theoretical saturation and unstandardized computer and Internet jargon. A security issue related to hacking attempts was noted as well. Discussion The analysis of these issues suggests several implications for future researchers who want to use online forums as a qualitative data collection method. PMID:16849979

Im, Eun-Ok; Chee, Wonshik



Achieving integration in mixed methods designs: principles and practices.  


Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed methods designs (exploratory sequential, explanatory sequential, and convergent) and through four advanced frameworks (multistage, intervention, case study, and participatory). Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent to which the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

Fetters, Michael D; Curry, Leslie A; Creswell, John W



A Practical Method for Calibrating a Coaxial Noise Source with a Waveguide Standard  

Microsoft Academic Search

A practical method for calibrating a coaxial noise source with a waveguide standard has been developed by extending the previously reported adaptor-changing method. A practical equation for the noise temperature of the source, the measurement procedure, and the error analysis are described.

Y. Kato; I. Yokoshima



A collection of research reporting, theoretical analysis, and practical applications in science education: Examining qualitative research methods, action research, educator-researcher partnerships, and constructivist learning theory  

NASA Astrophysics Data System (ADS)

Educator-researcher partnerships are increasingly being used to improve the teaching of science. Chapter 1 provides a summary of the literature concerning partnerships, and examines the justification of qualitative methods in studying these relationships. It also justifies the use of Participatory Action Research (PAR). Empirically-based studies of educator-researcher partnership relationships are rare despite investments in their implementation by the National Science Foundation (NSF) and others. Chapter 2 describes a qualitative research project in which participants in an NSF GK-12 fellowship program were studied using informal observations, focus groups, personal interviews, and journals to identify and characterize the cultural factors that influenced the relationships between the educators and researchers. These factors were organized into ten critical axes encompassing a range of attitudes, behaviors, or values defined by two stereotypical extremes. These axes were: (1) Task Dictates Context vs. Context Dictates Task; (2) Introspection vs. Extroversion; (3) Internal vs. External Source of Success; (4) Prior Planning vs. Implementation Flexibility; (5) Flexible vs. Rigid Time Sense; (6) Focused Time vs. Multi-tasking; (7) Specific Details vs. General Ideas; (8) Critical Feedback vs. Encouragement; (9) Short Procedural vs. Long Content Repetition; and (10) Methods vs. Outcomes are Well Defined. Another ten important stereotypical characteristics, which did not fit the structure of an axis, were identified and characterized. The educator stereotypes were: (1) Rapport/Empathy; (2) Like Kids; (3) People Management; (4) Communication Skills; and (5) Entertaining. The researcher stereotypes were: (1) Community Collaboration; (2) Focus Intensity; (3) Persistent; (4) Pattern Seekers; and (5) Curiosity/Skeptical. Chapter 3 summarizes the research presented in chapter 2 into a practical guide for participants and administrators of educator-researcher partnerships. 
Understanding how to identify and evaluate constructivist lessons is the first step in promoting and improving constructivism in teaching. Chapter 4 summarizes a theoretically-generated series of practical criteria that define constructivism: (1) Eliciting Prior Knowledge, (2) Creating Cognitive Dissonance, (3) Application of New Knowledge with Feedback, and (4) Reflection on Learning, or Metacognition. These criteria can be used by any practitioner to evaluate the level of constructivism used in a given lesson or activity.

Hartle, R. Todd



Practical Methods for Studying Collisional Breakup  

NASA Astrophysics Data System (ADS)

The quantum theory of three-body breakup in Coulomb systems, formulated in the early sixties, has formed the basis of a considerable body of theoretical analysis of low energy electron impact ionization. Although aspects of this theory have been incorporated into various perturbative and distorted-wave treatments, the formal theory has not provided a viable computational approach to a first-principles treatment of ionization, due to the complicated nature of the boundary conditions for three-body breakup in Coulomb systems and the fact that they are only known in the far asymptotic region. Exterior complex scaling allows one to solve the Schrödinger equation without explicit imposition of asymptotic boundary conditions. This approach has produced the first triple differential cross sections for e-H ionization that are in complete agreement with absolute measurements[1]. In this talk, I will review the essential aspects of this approach and present new results on double differential cross sections for e-H ionization. I will also discuss some new methods for extracting dynamical information from numerically obtained wave functions that are more efficient than the flux operator approach we previously employed. These methods allow us to explore ionization in the threshold region and open the way to calculations on systems with more than two electrons, where new physical effects can be studied. [1] T. N. Rescigno, M. Baertschy, W. A. Isaacs and C. W. McCurdy, Science 286, 2474 (1999)

Rescigno, T. N.



Method of analysis and quality-assurance practices by the U.S. Geological Survey Organic Geochemistry Research Group; determination of four selected mosquito insecticides and a synergist in water using liquid-liquid extraction and gas chromatography/mass spectrometry

USGS Publications Warehouse

A method of analysis and quality-assurance practices were developed for the determination of four mosquito insecticides (malathion, methoprene, phenothrin, and resmethrin) and one synergist (piperonyl butoxide) in water. The analytical method uses liquid-liquid extraction (LLE) and gas chromatography/mass spectrometry (GC/MS). Good precision and accuracy were demonstrated in reagent water, urban surface water, and ground water. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 50 nanograms per liter ranged from 68 to 171 percent, with standard deviations in concentrations of 27 nanograms per liter or less. The method detection limit for all compounds was 5.9 nanograms per liter or less for 247-milliliter samples. This method is valuable for acquiring information about the fate and transport of these mosquito insecticides and one synergist in water.

Zimmerman, L.R.; Strahan, A.P.; Thurman, E.M.



Airphoto analysis of erosion control practices  

NASA Technical Reports Server (NTRS)

The Universal Soil Loss Equation (USLE) is a widely accepted tool for erosion prediction and conservation planning. In this study, airphoto analysis of color and color infrared 70 mm photography at a scale of 1:60,000 was used to determine the erosion control practice factor in the USLE. Information about contour tillage, contour strip cropping, and grass waterways was obtained from aerial photography for Pheasant Branch Creek watershed in Dane County, Wisconsin.

Morgan, K. M.; Morris-Jones, D. R.; Lee, G. B.; Kiefer, R. W.



Towards Practical User Experience Evaluation Methods  

Microsoft Academic Search

In the last decade, User eXperience (UX) research in the academic community has produced a multitude of UX models and frameworks. These models address the key issues of UX: its subjective, highly situated and dynamic nature, as well as the pragmatic and hedonic factors leading to UX. At the same time, industry is adopting the UX term but the practices

Kaisa Väänänen-Vainio-Mattila; Virpi Roto; Marc Hassenzahl




PubMed Central

Stereological principles provide efficient and reliable tools for the determination of quantitative parameters of tissue structure on sections. Some principles which allow the estimation of volumetric ratios, surface areas, surface-to-volume ratios, thicknesses of tissue or cell sheets, and the number of structures are reviewed and presented in general form; means for their practical application in electron microscopy are outlined. The systematic and statistical errors involved in such measurements are discussed. PMID:5338131
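The point-counting principle underlying these volumetric-ratio estimates can be sketched in a few lines: the fraction of randomly placed test points that land inside a structure estimates the structure's volume (or area) fraction. The sketch below is illustrative only and not from the paper; the synthetic section (a disk in a unit square), the function names, and the sample size are all assumptions made for the example.

```python
import random

def inside_disk(x, y, cx=0.5, cy=0.5, r=0.3):
    """True if point (x, y) lies inside the disk modelling the structure."""
    return (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2

def point_count_fraction(n_points, seed=42):
    """Estimate the area fraction of the disk by random point counting."""
    rng = random.Random(seed)  # fixed seed for a reproducible sketch
    hits = sum(inside_disk(rng.random(), rng.random()) for _ in range(n_points))
    return hits / n_points

estimate = point_count_fraction(100_000)
true_fraction = 3.141592653589793 * 0.3 ** 2  # pi * r^2 over a unit square, ~0.2827
print(f"estimated {estimate:.4f}, true {true_fraction:.4f}")
```

The statistical error of such a count shrinks as the square root of the number of test points, which is the kind of systematic/statistical error trade-off the abstract discusses.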

Weibel, Ewald R.; Kistler, Gonzague S.; Scherle, Walter F.



Method of analysis at the U.S. Geological Survey California Water Science Center, Sacramento Laboratory - determination of haloacetic acid formation potential, method validation, and quality-control practices  

USGS Publications Warehouse

An analytical method for the determination of haloacetic acid formation potential of water samples has been developed by the U.S. Geological Survey California Water Science Center Sacramento Laboratory. The haloacetic acid formation potential is measured by dosing water samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine. The haloacetic acids formed are bromochloroacetic acid, bromodichloroacetic acid, dibromochloroacetic acid, dibromoacetic acid, dichloroacetic acid, monobromoacetic acid, monochloroacetic acid, tribromoacetic acid, and trichloroacetic acid. They are extracted, methylated, and then analyzed using a gas chromatograph equipped with an electron capture detector. Method validation experiments were performed to determine the method accuracy, precision, and detection limit for each of the compounds. Method detection limits for these nine haloacetic acids ranged from 0.11 to 0.45 microgram per liter. Quality-control practices include the use of blanks, quality-control samples, calibration verification standards, surrogate recovery, internal standard, matrix spikes, and duplicates.

Zazzi, Barbara C.; Crepeau, Kathryn L.; Fram, Miranda S.; Bergamaschi, Brian A.



The 5-Step Method: Principles and Practice  

ERIC Educational Resources Information Center

This article includes a description of the 5-Step Method. First, the origins and theoretical basis of the method are briefly described. This is followed by a discussion of the general principles that guide the delivery of the method. Each step is then described in more detail, including the content and focus of each of the five steps that include:…

Copello, Alex; Templeton, Lorna; Orford, Jim; Velleman, Richard




E-print Network

Krzysztof M. Ostaszewski and Grzegorz A. Rempala. Actuarial analysis can be viewed as the process of studying profitability and solvency of an insurance firm under a realistic and integrated model of key input random… In modern analysis of the financial models of property-casualty companies the input variables can…

Ostaszewski, Krzysztof M.


Standard practice for digital imaging and communication nondestructive evaluation (DICONDE) for computed radiography (CR) test methods  

E-print Network

1.1 This practice facilitates the interoperability of computed radiography (CR) imaging and data acquisition equipment by specifying image data transfer and archival storage methods in commonly accepted terms. This practice is intended to be used in conjunction with Practice E2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM), an international standard for image data acquisition, review, storage and archival storage. The goal of Practice E2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE results on any system conforming to the DICONDE standard. Toward that end, Practice E2339 provides a data dictionary and a set of information modules that are applicable to all NDE modalities. This practice supplements Practice E2339 by providing information objec...

American Society for Testing and Materials. Philadelphia



Pragmatism in practice: mixed methods research for physiotherapy.  


The purpose of this paper is to provide an argument for the place of mixed methods research across practice settings as an effective means of supporting evidence-based practice in physiotherapy. Physiotherapy practitioners use both qualitative and quantitative methods throughout the process of patient care, from history taking, assessment, and intervention to evaluation of outcomes. Research on practice paradigms demonstrates the importance of mixing qualitative and quantitative methods to achieve 'expert practice' that is concerned with optimizing outcomes and incorporating patient beliefs and values. Research paradigms that relate to this model of practice would integrate qualitative and quantitative types of knowledge and inquiry, while maintaining a prioritized focus on patient outcomes. Pragmatism is an emerging research paradigm where practical consequences and the effects of concepts and behaviors are vital components of meaning and truth. This research paradigm supports the simultaneous use of qualitative and quantitative methods of inquiry to generate evidence to support best practice. This paper demonstrates that mixed methods research with a pragmatist view provides evidence that embraces and addresses the multiple practice concerns of practitioners better than either qualitative or quantitative research approaches in isolation. PMID:20649500

Shaw, James A; Connelly, Denise M; Zecevic, Aleksandra A



A practical method for sensor absolute calibration.  


This paper describes a method of performing sensor calibrations using an NBS standard of spectral irradiance. The method shown, among others, was used for calibration of the Mariner IV Canopus sensor. Agreement of inflight response to preflight calibrations performed by this technique has been found to be well within 10%. PMID:20048890

Meisenholder, G W



The "Anchor" Method: Principle and Practice.  

ERIC Educational Resources Information Center

This report discusses the "anchor" language learning method that is based upon derivation rather than construction, using Italian as an example of a language to be learned. This method borrows from the natural process of language learning as it asks the student to remember whole expressions that serve as vehicles for learning both words and rules,…

Selgin, Paul


Critical practice in nursing care: analysis, action and reflexivity.  


This article examines critical practice and its underlying principles: analysis, action and reflexivity. Critical analysis involves the examination of knowledge that underpins practice. Critical action requires nurses to assess their skills and identify potential gaps in need of professional development. Critical reflexivity is personal analysis that involves challenging personal beliefs and assumptions to improve professional and personal practice. Incorporating these aspects into nursing can benefit nursing practice. PMID:16786927

Timmins, F


METHODS PAPER: Addressing Practical Challenges of Low Friction Coefficient Measurements

E-print Network

D. L. Burris, © Springer Science+Business Media, LLC 2009. A friction coefficient is defined as the ratio of friction force to normal force; although its calculation is straightforward, there are practical challenges that make low values of friction coefficient difficult to measure.

Sawyer, Wallace


Optimizing Distributed Practice: Theoretical Analysis and Practical Implications  

ERIC Educational Resources Information Center

More than a century of research shows that increasing the gap between study episodes using the same material can enhance retention, yet little is known about how this so-called distributed practice effect unfolds over nontrivial periods. In two three-session laboratory studies, we examined the effects of gap on retention of foreign vocabulary,…

Cepeda, Nicholas J.; Coburn, Noriko; Rohrer, Doug; Wixted, John T.; Mozer, Michael C.; Pashler, Harold



Practical Application of Second Law Costing Methods  

E-print Network

or availability. The methods for composing exergy cost flow diagrams will be explained. The results will be shown for several plants - electric-power, co-generation, coal-gasification, and others. The application of such results will be shown for cost...

Wepfer, W. J.; Gaggioli, R. A.



Deepen the GIS spatial analysis theory studying through the gradual process of practice  

NASA Astrophysics Data System (ADS)

Spatial analysis is the key content of a GIS basic theory course. In this paper, the importance of practice teaching for the study of GIS spatial analysis theory, and a method for implementing it, are discussed in the context of the practice-teaching arrangement for spatial analysis in the course "GIS theory and practice", based on the basic principles of procedural teaching theory and its teaching model. In addition, the concrete gradual practice process is described in four aspects. In this way, the study of GIS spatial analysis theory can be deepened and the cultivation of students' comprehensive ability in Geography Science can be strengthened.

Yi, Y. G.; Liu, H. P.; Liu, X. P.



Standard practice for digital imaging and communication in nondestructive evaluation (DICONDE) for ultrasonic test methods  

E-print Network

1.1 This practice facilitates the interoperability of ultrasonic imaging equipment by specifying image data transfer and archival storage methods in commonly accepted terms. This document is intended to be used in conjunction with Practice E 2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E 2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM), an international standard for image data acquisition, review, transfer and archival storage. The goal of Practice E 2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE test results on any system conforming to the DICONDE standard. Toward that end, Practice E 2339 provides a data dictionary and set of information modules that are applicable to all NDE modalities. This practice supplements Practice E 2339 by providing information object definitions, information ...

American Society for Testing and Materials. Philadelphia



Introducing Formal Specification Methods in Industrial Practice  

E-print Network

Luciano Baresi, Alessandro Orso. Formal specification methods are not often applied in industrial projects, despite their advantages; the difficulties of their use are among the main causes of the limited success of formal specification methods. Approaches…

Orso, Alessandro "Alex"


Multi-criteria decision analysis: Limitations, pitfalls, and practical difficulties  

SciTech Connect

The 2002 Winter Olympics women's figure skating competition is used as a case study to illustrate some of the limitations, pitfalls, and practical difficulties of Multi-Criteria Decision Analysis (MCDA). The paper compares several widely used models for synthesizing the multiple attributes into a single aggregate value. The various MCDA models can provide conflicting rankings of the alternatives for a common set of information even under states of certainty. Analysts involved in MCDA need to deal with the following challenging tasks: (1) selecting an appropriate analysis method, and (2) properly interpreting the results. An additional trap is the availability of software tools that implement specific MCDA models that can beguile the user with quantitative scores. These conclusions are independent of the decision domain and they should help foster better MCDA practices in many fields including systems engineering trade studies.
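The paper's point that widely used MCDA models can rank the same alternatives differently, even under certainty, is easy to demonstrate. The sketch below is illustrative and not drawn from the paper: the two alternatives, their criterion scores, and the equal weights are invented, and it compares only two common aggregation models, the additive weighted sum and the multiplicative weighted product.

```python
# Scores on a common 0-1 scale with equal criterion weights.
# Alternative A is strong on one criterion and weak on the other; B is balanced.
scores = {"A": [0.90, 0.20], "B": [0.50, 0.55]}
weights = [0.5, 0.5]

def weighted_sum(vals):
    """Additive aggregation: sum of weight * score."""
    return sum(w * v for w, v in zip(weights, vals))

def weighted_product(vals):
    """Multiplicative aggregation: product of score ** weight."""
    p = 1.0
    for w, v in zip(weights, vals):
        p *= v ** w
    return p

sum_rank = max(scores, key=lambda k: weighted_sum(scores[k]))
prod_rank = max(scores, key=lambda k: weighted_product(scores[k]))
print(sum_rank, prod_rank)  # additive favours A, multiplicative favours B
```

The multiplicative model penalizes the weak criterion of A far more heavily than the additive model does, which is exactly the kind of model-choice sensitivity the abstract warns analysts to interpret carefully.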

Kujawski, Edouard



Practical aspects of spatially high accurate methods  

NASA Technical Reports Server (NTRS)

The computational qualities of high order spatially accurate methods for the finite volume solution of the Euler equations are presented. Two dimensional essentially non-oscillatory (ENO), k-exact, and 'dimension by dimension' ENO reconstruction operators are discussed and compared in terms of reconstruction and solution accuracy, computational cost and oscillatory behavior in supersonic flows with shocks. Inherent steady state convergence difficulties are demonstrated for adaptive stencil algorithms. An exact solution to the heat equation is used to determine reconstruction error, and the computational intensity is reflected in operation counts. Standard MUSCL differencing is included for comparison. Numerical experiments presented include the Ringleb flow for numerical accuracy and a shock reflection problem. A vortex-shock interaction demonstrates the ability of the ENO scheme to excel in simulating unsteady high-frequency flow physics.

Godfrey, Andrew G.; Mitchell, Curtis R.; Walters, Robert W.



Aircraft accidents : method of analysis  

NASA Technical Reports Server (NTRS)

This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)



Methods of analysis and quality-assurance practices of the U.S. Geological Survey organic laboratory, Sacramento, California; determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry  

USGS Publications Warehouse

An analytical method and quality-assurance practices were developed for a study of the fate and transport of pesticides in the Sacramento-San Joaquin Delta and the Sacramento and San Joaquin Rivers. Water samples were filtered to remove suspended particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide, and the pesticides were eluted with three 2-milliliter aliquots of hexane:diethyl ether (1:1). The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for analytes determined in 1,500-milliliter samples ranged from 0.006 to 0.047 microgram per liter. Recoveries ranged from 47 to 89 percent for 12 pesticides in organic-free, Sacramento River and San Joaquin River water samples fortified at 0.05 and 0.26 microgram per liter. The method was modified to improve the pesticide recovery by reducing the sample volume to 1,000 milliliters. Internal standards were added to improve quantitative precision and accuracy. The analysis also was expanded to include a total of 21 pesticides. The method detection limits for 1,000-milliliter samples ranged from 0.022 to 0.129 microgram per liter. Recoveries ranged from 38 to 128 percent for 21 pesticides in organic-free, Sacramento River and San Joaquin River water samples fortified at 0.10 and 0.75 microgram per liter.

Crepeau, Kathryn L.; Domagalski, Joseph L.; Kuivila, Kathryn M.



Applying community-oriented primary care methods in British general practice: a case study.  

PubMed Central

BACKGROUND: The '75 and over' assessments built into the 1990 contract for general practice have failed to enthuse primary care teams or make a significant impact on the health of older people. Alternative methods for improving the health of older people living at home are being sought. AIM: To test the feasibility of applying community-oriented primary care methodology to a relatively deprived sub-population of older people in a relatively deprived area. DESIGN OF STUDY: A combination of developmental and triangulation approaches to data analysis. SETTING: Four general practices in an inner London borough. METHOD: A community-oriented primary care approach was used to initiate innovative care for older people, supported financially by the health authority and practically by primary care academics. RESULTS: All four practices identified problems needing attention in the older population, developed different projects focused on particular needs among older people, and tested them in practice. Patient and public involvement were central to the design and implementation processes in only one practice. Innovations were sustained in only one practice, but some were adopted by a primary care group and others extended to a wider group of practices by the health authority. CONCLUSION: A modified community-oriented primary care approach can be used in British general practice, and changes can be promoted that are perceived as valuable by planning bodies. However, this methodology may have more impact at primary care trust level than at practice level. PMID:12171223

Iliffe, Steve; Lenihan, Penny; Wallace, Paul; Drennan, Vari; Blanchard, Martin; Harris, Andrew



A Practical Method for Measuring Macular Pigment Optical Density  

Microsoft Academic Search

PURPOSE. Increasing evidence indicates that the macular pigments (MP) protect the central retina and may retard macular disease. For that reason, a practical method for measuring MP that does not require elaborate optics and can be applied to diverse populations by operators with a modest amount of experience was developed and validated. METHODS. A small tabletop device based on light-emitting

Billy R. Wooten; Billy R. Hammond; Richard I. Land; D. Max Snodderly



A Practice-Based Analysis of an Online Strategy Game  

NASA Astrophysics Data System (ADS)

In this paper, we will analyze a massively multiplayer online game in an attempt to identify the elements of practice that enable social interaction and cooperation within the game’s virtual world. Communities of Practice and Activity Theory offer the theoretical lens for identifying and understanding what constitutes practice within the community and how such practice is manifest and transmitted during game play. Our analysis suggests that in contrast to prevalent perceptions of practice as being textually mediated, in virtual settings it is framed as much in social interactions as in processes, artifacts and the tools constituting the ‘linguistic’ domain of the game or the practice the gaming community is about.

Milolidakis, Giannis; Kimble, Chris; Akoumianakis, Demosthenes


Between practice and theory: Melanie Klein, Anna Freud and the development of child analysis.  


An examination of the early history of child analysis in the writings of Melanie Klein and Anna Freud reveals how two different and opposing approaches to child analysis arose at the same time. The two methods of child analysis are rooted in a differential emphasis on psychoanalytic theory and practice. The Kleinian method derives from the application of technique while the Anna Freudian method is driven by theory. Furthermore, by holding to the Freudian theory of child development Anna Freud was forced to limit the scope of child analysis, while Klein's application of Freudian practice has led to new discoveries about the development of the infant psyche. PMID:8642183

Donaldson, G



Patients’ experiences of the choice of GP practice pilot, 2012/2013: a mixed methods evaluation  

PubMed Central

Objectives To investigate patients’ experiences of the choice of general practitioner (GP) practice pilot. Design Mixed-method, cross-sectional study. Setting Patients in the UK National Health Service (NHS) register with a general practice responsible for their primary medical care, and practices set geographic boundaries. In 2012/2013, 43 volunteer general practices in four English NHS primary care trusts (PCTs) piloted a scheme allowing patients living outside practice boundaries to register as an out of area patient or be seen as a day patient. Participants Analysis of routine data for 1108 out of area registered patients and 250 day patients; postal survey of out of area registered (315/886, 36%) and day (64/188, 34%) patients over 18 years of age, with a UK mailing address; comparison with General Practice Patient Survey (GPPS); semistructured interviews with 24 pilot patients. Results Pilot patients were younger and more likely to be working than non-pilot patients at the same practices and reported generally more or at least as positive experiences than patients registered at the same practices, practices in the same PCT and nationally, despite belonging to subgroups of the population who typically report poorer than average experiences. Out of area patients who joined a pilot practice did so: after moving house and not wanting to change practice (26.2%); for convenience (32.6%); as newcomers to an area who selected a practice although they lived outside its boundary (23.6%); because of dissatisfaction with their previous practice (13.9%). Day patients attended primarily on grounds of convenience (68.8%); 51.6% of the day patient visits were for acute infections, most commonly upper respiratory infections (20.4%). Sixty-six per cent of day patients received a prescription during their visit. 
Conclusions Though the 12-month pilot was too brief to identify all costs and benefits, the scheme provided a positive experience for participating patients and practices. PMID:25667149

Tan, Stefanie; Erens, Bob; Wright, Michael; Mays, Nicholas



A Practical Method for Cable Failure Rate Modeling  

Microsoft Academic Search

As underground cables continue to increase in age, most utilities are experiencing an increase in underground cable failures. Since cable replacement programs are expensive, it is important to understand the impact that age and other cable characteristics have on cable failure rates. This paper presents a practical method to model cable failure rates categorized by cable features. It addresses the
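A categorized failure-rate model of the kind described can be sketched as observed failures divided by exposure within each category of a cable feature. The records, feature names, and numbers below are hypothetical illustrations, not data or categories from the paper.

```python
from collections import defaultdict

# Hypothetical records: (insulation_type, install_decade, failures, cable_mile_years).
records = [
    ("XLPE", "1970s", 42, 1200.0),
    ("XLPE", "1990s", 10, 1500.0),
    ("EPR",  "1970s", 18, 800.0),
    ("EPR",  "1990s", 4,  900.0),
]

def failure_rates_by(records, key_index):
    """Failure rate (failures per cable-mile-year) grouped by one feature."""
    fails = defaultdict(float)
    exposure = defaultdict(float)
    for rec in records:
        key = rec[key_index]
        fails[key] += rec[2]      # failure count
        exposure[key] += rec[3]   # cable-mile-years in service
    return {k: fails[k] / exposure[k] for k in fails}

print(failure_rates_by(records, 0))  # rate by insulation type
print(failure_rates_by(records, 1))  # rate by install decade
```

Grouping by different features on the same records shows how a utility might compare the effect of age versus cable construction on failure rates before prioritizing replacements.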

Yujia Zhou; Richard E. Brown



[Embryo vitrification: French clinical practice analysis for BLEFCO].  


Frozen thawed embryo transfer is currently an important part of present-day assisted reproductive technology (ART), aiming to increase the clinical pregnancy rate per oocyte retrieval. Although the slow-freezing method was the reference for two decades, recent years have witnessed the expansion of an ultrarapid cryopreservation method named vitrification. Recently in France, vitrification has been authorized for cryopreserving human embryos. The BLEFCO consortium therefore decided to perform a descriptive study through questionnaires to evaluate the state of vitrification in French clinical practice. Questionnaires were addressed to the 105 French centres of reproductive biology and 60 were fully completed. Data analysis revealed that the embryo survival rate, as well as the clinical pregnancy rate, was increased with vitrification technology when compared to the slow-freezing procedure. Overall, these preliminary data suggest that vitrification may improve ART outcomes through an increase in the cumulative pregnancy rate per oocyte retrieval. PMID:23962680

Hesters, L; Achour-Frydman, N; Mandelbaum, J; Levy, R



Practical Nursing. Ohio's Competency Analysis Profile.  

ERIC Educational Resources Information Center

Developed through a modified DACUM (Developing a Curriculum) process involving business, industry, labor, and community agency representatives in Ohio, this document is a comprehensive and verified employer competency profile for practical nursing. The list contains units (with and without subunits), competencies, and competency builders that…

Ohio State Univ., Columbus. Vocational Instructional Materials Lab.


A practical method of estimating energy expenditure during tennis play.  


This study aimed to develop a practical method of estimating energy expenditure (EE) during tennis. Twenty-four elite female tennis players first completed a tennis-specific graded test in which five different intensity levels were applied randomly. Each intensity level was intended to simulate a "game" of singles tennis and comprised six 14 s periods of activity alternated with 20 s of active rest. Oxygen consumption (VO2) and heart rate (HR) were measured continuously, and each player's rating of perceived exertion (RPE) was recorded at the end of each intensity level. The rate of energy expenditure (EE(VO2)) during the test was calculated as the sum of VO2 during play and the 'O2 debt' during recovery, divided by the duration of the activity. There were significant individual linear relationships between EE(VO2) and RPE, and between EE(VO2) and HR (r ≥ 0.89 and r ≥ 0.93, respectively; p < 0.05). On a second occasion, six players completed a 60-min singles tennis match during which VO2, HR and RPE were recorded; EE(VO2) was compared with EE predicted from the previously derived RPE and HR regression equations. Analysis found that EE(VO2) was overestimated by EE(RPE) (92 ± 76 kJ·h⁻¹) and by EE(HR) (435 ± 678 kJ·h⁻¹), but the error of estimation for EE(RPE) (t = -3.01; p = 0.03) was less than 5%, whereas for EE(HR) it was 20.7%. The results of the study show that RPE can be used to estimate the energetic cost of playing tennis. PMID:12801209
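The two-step approach described above, calibrating an individual linear relationship between RPE and measured energy expenditure during a graded test and then predicting match EE from it, can be sketched as follows. The numbers are hypothetical illustrations, not data from the study:

```python
import numpy as np

# Hypothetical graded-test data for one player: RPE and measured rate of
# energy expenditure (kJ/h) at the five intensity levels (invented values).
rpe = np.array([8.0, 10.0, 12.0, 14.0, 16.0])
ee_vo2 = np.array([900.0, 1150.0, 1420.0, 1680.0, 1950.0])

# Step 1: least-squares fit of the individual relationship EE = a*RPE + b.
a, b = np.polyfit(rpe, ee_vo2, 1)

# Step 2: predict energy expenditure from an RPE of 13 reported in a match.
ee_pred = a * 13.0 + b
print(round(ee_pred, 1))  # predicted EE in kJ/h for this player
```

Because the study found the RPE relationship to be individual, one such fit would be needed per player.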

Novas, A M P; Rowbottom, D G; Jenkins, D G



A philosophical analysis of evidence-based practice in mental health nursing.  


Mental health nurses need to be aware that their knowledge base does not exist in isolation from other cultural practices. They/I/we must become more willing to engage in theoretical problem solving that directly affects clinical practice issues such as the introduction of evidence-based practice. Critical discussion of evidence-based practice should be informed by the complex issues that permeate all our socio-cultural and linguistic practices. This paper examines some of the major philosophical problems in the debate over the use of evidence-based practice in mental health nursing using both Foucault's formulation of discourse analysis and Derrida's construal of deconstruction. The conclusion reached is that postmodern philosophy offers a way to rid nursing of incessant naive attacks on either quantitative or qualitative research methods which underpin the debate over evidence-based practice in mental health nursing. PMID:11493288

Lines, K



A Practical Method of Constructing Quantum Combinational Logic Circuits  

E-print Network

We describe a practical method of constructing quantum combinational logic circuits with basic quantum logic gates such as NOT and general $n$-bit Toffoli gates. This method is useful to find the quantum circuits for evaluating logic functions in the form most appropriate for implementation on a given quantum computer. The rules to get the most efficient circuit are utilized best with the aid of a Karnaugh map. It is explained which rules of using a Karnaugh map are changed due to the difference between the quantum and classical logic circuits.
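Since NOT and n-bit Toffoli gates permute classical computational basis states, a circuit built from them can be checked directly on bitstrings. A minimal sketch (the function name and example are mine, not the paper's) evaluating f(a, b) = a AND b into an ancilla, the way a single 3-bit Toffoli acts on the basis state |a, b, 0>:

```python
def toffoli(bits, controls, target):
    """Apply an n-bit Toffoli gate to a classical basis state: flip
    bits[target] iff every control bit is 1.  With no controls this
    reduces to a plain NOT gate."""
    out = list(bits)
    if all(out[c] for c in controls):
        out[target] ^= 1
    return tuple(out)

# Evaluate f(a, b) = a AND b into an ancilla initialised to 0.
table = {}
for a in (0, 1):
    for b in (0, 1):
        state = toffoli((a, b, 0), controls=[0, 1], target=2)
        table[(a, b)] = state[2]
print(table)
```

The Karnaugh-map rules the paper describes would then be used to minimise the number of such gates for a given logic function.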

Jae-Seung Lee; Yongwook Chung; Jaehyun Kim; Soonchil Lee



Landscape analysis: Theoretical considerations and practical needs  

USGS Publications Warehouse

Numerous systems of land classification have been proposed. Most have led directly to or have been driven by an author's philosophy of earth-forming processes. However, the practical need of classifying land for planning and management purposes requires that a system lead to predictions of the results of management activities. We propose a landscape classification system composed of 11 units, from realm (a continental mass) to feature (a splash impression). The classification concerns physical aspects rather than economic or social factors, and aims to merge land inventory with dynamic processes. Landscape units are organized using a hierarchical system so that information may be assembled and communicated at different levels of scale and abstraction. Our classification uses a geomorphic systems approach that emphasizes the geologic-geomorphic attributes of the units. Realm, major division, province, and section are formulated by subdividing large units into smaller ones. For the larger units we have followed Fenneman's delineations, which are well established in the North American literature. Areas and districts are aggregated into regions and regions into sections. Units smaller than areas have, in practice, been subdivided into zones and smaller units if required. We developed the theoretical framework embodied in this classification from practical applications aimed at land use planning and land management in Maryland (eastern Piedmont Province near Baltimore) and Utah (eastern Uinta Mountains). © 1991 Springer-Verlag New York Inc.

Godfrey, A.E.; Cleaves, E.T.



Practical analysis of welding processes using finite element analysis.  

SciTech Connect

With advances in commercially available finite element software and computational capability, engineers can now model large-scale problems in mechanics, heat transfer, fluid flow, and electromagnetics as never before. With these enhancements in capability, it is increasingly tempting to include the fundamental process physics to help achieve greater accuracy (Refs. 1-7). While this goal is laudable, it adds complication and drives up cost and computational requirements. Practical analysis of welding relies on simplified user inputs to derive important relative trends in desired outputs such as residual stress or distortion due to changes in inputs like voltage, current, and travel speed. Welding is a complex three-dimensional phenomenon. The question becomes: how much modeling detail is needed to accurately predict relative trends in distortion, residual stress, or weld cracking? In this work, a HAZ (Heat Affected Zone) weld-cracking problem was analyzed to rank two different welding cycles (weld speed varied) in terms of crack susceptibility. Figure 1 shows an aerospace casting GTA welded to a wrought skirt. The essentials of part geometry, welding process, and tooling were suitably captured to model the strain excursion in the HAZ over a crack-susceptible temperature range, and the weld cycles were suitably ranked. The main contribution of this work is the demonstration of a practical methodology by which engineering solutions to engineering problems may be obtained through weld modeling when time and resources are extremely limited. Typically, welding analysis suffers from the following unknowns: material properties over the entire temperature range, the heat-input source term, and environmental effects. Material properties of interest are conductivity, specific heat, latent heat, modulus, Poisson's ratio, yield strength, ultimate strength, and possible rate dependencies.
Boundary conditions are conduction into fixturing, radiation and convection to the environment, and any mechanical constraint. If conductivity, for example, is only known at a few temperatures it can be linearly extrapolated from the highest known temperature to the liquidus temperature. Over the liquidus to solidus temperature the conductivity is linearly increased by a factor of three to account for the enhanced heat transfer due to convection in the weld pool. Above the liquidus it is kept constant. Figure 2 shows an example of this type of approximation. Other thermal and mechanical properties and boundary conditions can be similarly approximated, using known physical material characteristics when possible. Sensitivity analysis can show that many assumptions have a small effect on the final outcome of the analysis. In the example presented in this work, simplified analysis procedures were used to model this process to understand why one set of parameters is superior to the other. From Lin (Ref. 8), mechanical strain is expected to drive HAZ cracking. Figure 3 shows a plot of principal tensile mechanical strain versus temperature during the welding process. By looking at the magnitudes of the tensile mechanical strain in the material's Brittle Temperature Region (BTR), it can be seen that on a relative basis the faster travel speed process that causes cracking results in about three times the strain in the temperature range of the BTR. In this work, a series of simplifying assumptions were used in order to quickly and accurately model a real welding process to respond to an immediate manufacturing need. The analysis showed that the driver for HAZ cracking, the mechanical strain in the BTR, was significantly higher in the process that caused cracking versus the process that did not. 
The main emphasis of the analysis was to determine whether there was a mechanical reason why the improved weld parameters would consistently produce an acceptable weld. The prediction of the mechanical strain magnitudes confirms the better process.

Cowles, J. H. (John H.); Dave, V. R. (Vivek R.); Hartman, D. A. (Daniel A.)



Regulating forest practices in Texas: a problem analysis  

E-print Network

REGULATING FOREST PRACTICES IN TEXAS: A PROBLEM ANALYSIS. A Thesis by ALAN DALE DREESEN, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirement for the degree of MASTER OF SCIENCE, August 1977. Major Subject: Forestry. ABSTRACT: Regulating Forest...

Dreesen, Alan D



Clinical simulation using deliberate practice in nursing education: a Wilsonian concept analysis.  


Effective use of simulation is dependent on a complete understanding of simulation's central conceptual elements. Deliberate practice, a constituent of Ericsson's theory of expertise, has been identified as a central concept in effective simulation learning. Deliberate practice is compatible with simulation frameworks already being suggested for use in nursing education. This paper uses Wilson's Method of concept analysis for the purpose of exploring the concept of deliberate practice in the context of clinical simulation in nursing education. Nursing education should move forward in a manner that reflects best practice in nursing education. PMID:24120521

Chee, Jennifer



[Towards understanding human ecology in nursing practice: a concept analysis].  


Human ecology is an umbrella concept encompassing several social, physical, and cultural elements existing in the individual's external environment. The pragmatic utility method was used to analyze the "human ecology" concept in order to ascertain the conceptual fit with nursing epistemology and to promote its use by nurses in clinical practice. Relevant articles for the review were retrieved from the MEDLINE, CINAHL, PsycINFO, and CSA databases using the terms "human ecology," "environment," "nursing," and "ecology." Data analysis revealed that human ecology is perceived as a theoretical perspective designating a complex, multilayered, and multidimensional system, one that comprises individuals and their reciprocal interactions with their global environments and the subsequent impact of these interactions upon their health. Human ecology preconditions include the individuals, their environments, and their transactions. Attributes of this concept encompass the characteristics of an open system (e.g., interdependence, reciprocal). PMID:20608260

Huynh, Truc; Alderson, Marie



Practice and methods of contraception among Saudi women in Riyadh.  


The use of contraceptives can have an impact on better spacing between children, better child care, improvement of children's health and preservation of the mother's health. In this study 2675 Saudi women attending a gynaecology out-patient clinic were interviewed about their contraceptive practices. The majority of the women (56.0%) were using or had used some form of contraceptive. Oral contraceptives were the most common method; 94.8% of the 1497 women who practised contraception were using or had used this form of contraception. Sterilization accounted for 0.9% of contraceptive practices, while the intrauterine device was a more common form of contraceptive among the more educated women. PMID:3391353

Jabbar, F A; Wong, S S; Al-Meshari, A A



A practical gait analysis system using gyroscopes  

Microsoft Academic Search

This study investigated the possibility of using uni-axial gyroscopes to develop a simple portable gait analysis system. Gyroscopes were attached on the skin surface of the shank and thigh segments and the angular velocity of each segment was recorded. Segment inclinations and knee angle were derived from segment angular velocities. The angular signals from a motion analysis

Kaiyu Tong; Malcolm H Granat



Practical Inexact Proximal Quasi-Newton Method with Global ...  

E-print Network

Mar 14, 2014 ... gradient methods, which includes analysis of method based on ... The work of this author is partially supported by NSF Grants DMS ... feature selection is desirable, such as sparse logistic regression [29, 30, 24], .... new iteration is based on sufficient decrease condition (much like in trust .... Find Hk = Gk + 1.



Tetrad Analysis: A Practical Demonstration Using Simple Models.  

ERIC Educational Resources Information Center

Uses simple models to illustrate the principles of this genetic method of mapping gene loci. Stresses that this system enables a practical approach to be used with students who experience difficulty in understanding the concepts involved. (CW)

Gow, Mary M.; Nicholl, Desmond S. T.



A practical method to evaluate radiofrequency exposure of mast workers.  


Assessment of occupational exposure to radiofrequency (RF) fields in telecommunication transmitter masts is a challenging task. For conventional field strength measurements using manually operated instruments, it is difficult to document the locations of measurements while climbing up a mast. Logging RF dosemeters worn by the workers, on the other hand, do not give any information about the location of the exposure. In this study, a practical method was developed and applied to assess mast workers' exposure to RF fields and the corresponding location. This method uses a logging dosemeter for personal RF exposure evaluation and two logging barometers to determine the corresponding height of the worker's position on the mast. The procedure is not intended to be used for compliance assessments, but to indicate locations where stricter assessments are needed. The applicability of the method is demonstrated by making measurements in a TV and radio transmitting mast. PMID:19054796
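The height reconstruction the method relies on can be sketched with the standard barometric formula, taking the mast-base barometer reading as the reference. The constants are the usual international-standard-atmosphere values, and the sample pressures below are illustrative assumptions, not values from the study:

```python
def height_from_pressure(p_hpa, p_ground_hpa):
    """Approximate height above ground (m) from the international
    barometric formula, using the mast-base barometer reading as the
    reference pressure."""
    return 44330.0 * (1.0 - (p_hpa / p_ground_hpa) ** (1.0 / 5.255))

# Hypothetical synchronised log samples: (worker barometer, base barometer),
# both in hPa; the pressures are invented for illustration.
samples = [(1013.25, 1013.25), (1007.0, 1013.25), (1001.0, 1013.25)]
heights = [height_from_pressure(p, p0) for p, p0 in samples]
print([round(h, 1) for h in heights])
```

Pairing each height estimate with the simultaneous dosemeter reading then localises the exposure along the mast, which is what the two-barometer arrangement is for.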

Alanko, Tommi; Hietanen, Maila



Practical aspects of genome-wide association interaction analysis.  


Large-scale epistasis studies can give new clues to system-level genetic mechanisms and a better understanding of the underlying biology of human complex disease traits. Though many novel methods have been proposed to carry out such studies, so far only a few of them have demonstrated replicable results. Here, we propose a minimal protocol for genome-wide association interaction (GWAI) analysis to identify gene-gene interactions from large-scale genomic data. The different steps of the developed protocol are discussed and motivated, and encompass interaction screening in a hypothesis-free and hypothesis-driven manner. In particular, we examine a wide range of aspects related to epistasis discovery in the context of complex traits in humans, hereby giving practical recommendations for data quality control, variant selection or prioritization strategies and analytic tools, replication and meta-analysis, biological validation of statistical findings and other related aspects. The minimal protocol provides guidelines and attention points for anyone involved in GWAI analysis and aims to enhance the biological relevance of GWAI findings. At the same time, the protocol improves a better assessment of strengths and weaknesses of published GWAI methodologies. PMID:25164382

Gusareva, Elena S; Van Steen, Kristel



Analysis methods for airborne radioactivity.  

E-print Network

High-resolution gamma-ray spectrometry is an analysis method well suited for monitoring airborne radioactivity. Many of the natural radionuclides and a majority of anthropogenic nuclides are…

Ala-Heikkilä, Jarmo J



New directions on agile methods: a comparative analysis  

Microsoft Academic Search

Agile software development methods have caught the attention of software engineers and researchers worldwide. Scientific research is yet scarce. This paper reports results from a study, which aims to organize, analyze and make sense out of the dispersed field of agile software development methods. The comparative analysis is performed using the method's life-cycle coverage, project management support, type of practical

Pekka Abrahamsson; Juhani Warsta; Mikko T. Siponen; Jussi Ronkainen



MAD Skills: New Analysis Practices for Big Data  

Microsoft Academic Search

As massive data acquisition and storage becomes increasingly affordable, a wide variety of enterprises are employing statisticians to engage in sophisticated data analysis. In this paper we highlight the emerging practice of Magnetic, Agile, Deep (MAD) data analysis as a radical departure from traditional Enterprise Data Warehouses and Business Intelligence. We present our design philosophy, techniques and

Jeffrey Cohen; Brian Dolan; Mark Dunlap; Joseph M. Hellerstein; Caleb Welton



Skill analysis part 2: evaluating a practice skill.  


This is the second of three articles exploring skill analysis, assisting readers to evaluate a practice skill of their choice. Sometimes evaluations are made against external reference points, the competencies of the registered nurse or a job description for a post eagerly sought after; sometimes they are made with reference to aspirations--an ideal of the skill in use that the nurse and colleagues admire. Nurses may be understandably anxious about the evaluation of practice skills, as they work in a performance-orientated world where they are judged on whether their practice is competent, safe, ethical, cost effective and efficient. Nonetheless, understanding the strengths and weaknesses of a chosen practice skill is central to practice development. If the skill is to be affirmed, improved or adjusted, it is necessary to evaluate the skill in use. PMID:22272540

Price, Bob


Comparison of four teaching methods on Evidence-based Practice skills of postgraduate nursing students.  


The aim of this study was to compare four teaching methods on the evidence-based practice knowledge and skills of postgraduate nursing students. Students enrolled in the Evidence-based Nursing (EBN) unit in Australia and Hong Kong in 2010 and 2011 received education via either the standard distance teaching method, computer laboratory teaching method, Evidence-based Practice-Digital Video Disc (EBP-DVD) teaching method or the didactic classroom teaching method. Evidence-based Practice (EBP) knowledge and skills were evaluated using student assignments that comprised validated instruments. One-way analysis of covariance was implemented to assess group differences on outcomes after controlling for the effects of age and grade point average (GPA). Data were obtained from 187 students. The crude mean score among students receiving the standard+DVD method of instruction was higher for developing a precise clinical question (8.1±0.8) and identifying the level of evidence (4.6±0.7) compared to those receiving other teaching methods. These differences were statistically significant after controlling for age and grade point average. Significant improvement in cognitive and technical EBP skills can be achieved for postgraduate nursing students by integrating a DVD as part of the EBP teaching resources. The EBP-DVD is an easy teaching method to improve student learning outcomes and ensure that external students receive equivalent and quality learning experiences. PMID:23107585

Fernandez, Ritin S; Tran, Duong Thuy; Ramjan, Lucie; Ho, Carey; Gill, Betty



International child care practices study: methods and study population.  


The study set out to document child care practices in as many different countries and cultures as possible with the aim of providing baseline child care data and stimulating new hypotheses to explain persisting differences in sudden infant death syndrome (SIDS) rates between countries. The protocol, piloted in four countries in 1992, was distributed to 80 potential centres in 1995. Data from 19 centres were received. This paper describes the demographic characteristics of the data from the different centres. Comparison showed significant differences for a number of variables including mean age of completion of the study, response rate, mean gestation, mean birth weight, method of delivery and incidence of admission to neonatal intensive care units. High caesarean section rates identified in the Chinese samples (44 and 40%) were unexpected and have important public health implications. This finding warrants further study but may be related to China's one child policy. We consider that international comparison of child care practice is possible using standardised data collection methods that also allow some individual variation according to local circumstances. However, in view of the heterogeneity of the samples, it will be important to avoid over-interpreting differences identified and to view any differences within the qualitative context of each individual sample. Provided there is acknowledgement of limitations, such ecological studies have potential to produce useful information especially for hypothesis generation. PMID:10390090

Nelson, E A; Taylor, B J



Standard practice for digital imaging and communication in nondestructive evaluation (DICONDE) for digital radiographic (DR) test methods  

E-print Network

1.1 This practice facilitates the interoperability of digital X-ray imaging equipment by specifying image data transfer and archival methods in commonly accepted terms. This document is intended to be used in conjunction with Practice E2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see, an international standard for image data acquisition, review, storage and archival storage. The goal of Practice E2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE results on any system conforming to the DICONDE standard. Toward that end, Practice E2339 provides a data dictionary and a set of information modules that are applicable to all NDE modalities. This practice supplements Practice E2339 by providing information object definitions, information modules and a ...

American Society for Testing and Materials. Philadelphia



Standard practice for digital imaging and communication in nondestructive evaluation (DICONDE) for X-ray computed tomography (CT) test methods  

E-print Network

1.1 This practice facilitates the interoperability of X-ray computed tomography (CT) imaging equipment by specifying image data transfer and archival storage methods in commonly accepted terms. This document is intended to be used in conjunction with Practice E2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see, an international standard for image data acquisition, review, storage and archival storage. The goal of Practice E2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE test results on any system conforming to the DICONDE standard. Toward that end, Practice E2339 provides a data dictionary and a set of information modules that are applicable to all NDE modalities. This practice supplements Practice E2339 by providing information object definitio...

American Society for Testing and Materials. Philadelphia



Qualitative multi-attribute decision method Theory and practice  

E-print Network

and exploitation of DEX models · Software · Applications. What is DEX? Multi-Attribute Model evaluation. Decision and utility functions · problem decomposition and structuring · evaluation and analysis of decision. Machine Learning, Fuzzy sets · verbal measures · fuzzy operators. DEX Method for qualitative multi

Bohanec, Marko


Practical implementation of nonlinear time series methods: The TISEAN package  

Microsoft Academic Search

We describe the implementation of methods of nonlinear time series analysis which are based on the paradigm of deterministic chaos. A variety of algorithms for data representation, prediction, noise reduction, dimension and Lyapunov estimation, and nonlinearity testing are discussed with particular emphasis on issues of implementation and choice of parameters. Computer programs that implement the resulting strategies are publicly available

Rainer Hegger; Holger Kantz; Thomas Schreiber



An analysis of remanufacturing practices in Japan  

Microsoft Academic Search

Purpose This study presents case studies of selected remanufacturing operations in Japan. It investigates Japanese companies' motives and incentives for remanufacturing, clarifies the requirements and obstacles facing remanufacturers, itemizes what measures companies take to address them, and discusses the influence of Japanese laws related to remanufacturing. Methods This study involves case studies of four product areas: photocopiers, single-use cameras, auto parts, and

Mitsutaka Matsumoto; Yasushi Umeda



Fourier methods for biosequence analysis.  

PubMed Central

Novel methods are discussed for using fast Fourier transforms for DNA or protein sequence comparison. These methods are also intended as a contribution to the more general computer science problem of text search. These methods extend the capabilities of previous FFT methods and show that these methods are capable of considerable refinement. In particular, novel methods are given which (1) enable the detection of clusters of matching letters, (2) facilitate the insertion of gaps to enhance sequence similarity, and (3) accommodate to varying densities of letters in the input sequences. These methods use Fourier analysis in two distinct ways. (1) Fast Fourier transforms are used to facilitate rapid computation. (2) Fourier expansions are used to form an 'image' of the sequence comparison. PMID:2243777
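The core FFT trick the abstract describes, one indicator vector per alphabet letter, correlated via FFT and summed over the alphabet, can be sketched as follows. This is a minimal illustration of the general technique, not the paper's refined algorithms:

```python
import numpy as np

def match_counts(s, t):
    """Count position-wise letter matches between s and t at every relative
    offset, using one FFT correlation per alphabet letter.  Index k of the
    result corresponds to shifting t by k - (len(t) - 1) along s."""
    n = len(s) + len(t) - 1            # number of distinct offsets
    nfft = 1 << (n - 1).bit_length()   # FFT length: next power of two >= n
    total = np.zeros(n)
    for letter in set(s) | set(t):
        a = np.array([c == letter for c in s], dtype=float)
        b = np.array([c == letter for c in t], dtype=float)
        # Correlation computed as convolution with one sequence reversed.
        fa = np.fft.rfft(a, nfft)
        fb = np.fft.rfft(b[::-1], nfft)
        total += np.fft.irfft(fa * fb, nfft)[:n]
    return np.rint(total).astype(int)

counts = match_counts("ACGT", "CGT")
print(counts)  # peak of 3 where "CGT" aligns with positions 1-3 of "ACGT"
```

For sequences of length n this costs O(|alphabet| · n log n) rather than the O(n²) of naive all-offsets comparison, which is the point of the FFT formulation.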

Benson, D C



SAR/QSAR methods in public health practice  

SciTech Connect

Methods of (Quantitative) Structure-Activity Relationship ((Q)SAR) modeling play an important and active role in ATSDR programs in support of the Agency mission to protect human populations from exposure to environmental contaminants. They are used for cross-chemical extrapolation to complement the traditional toxicological approach when chemical-specific information is unavailable. SAR and QSAR methods are used to investigate adverse health effects and exposure levels, bioavailability, and pharmacokinetic properties of hazardous chemical compounds. They are applied as a part of an integrated systematic approach in the development of Health Guidance Values (HGVs), such as ATSDR Minimal Risk Levels, which are used to protect populations exposed to toxic chemicals at hazardous waste sites. (Q)SAR analyses are incorporated into ATSDR documents (such as the toxicological profiles and chemical-specific health consultations) to support environmental health assessments, prioritization of environmental chemical hazards, and to improve study design, when filling the priority data needs (PDNs) as mandated by Congress, in instances when experimental information is insufficient. These cases are illustrated by several examples, which explain how ATSDR applies (Q)SAR methods in public health practice.

Demchuk, Eugene, E-mail:; Ruiz, Patricia; Chou, Selene; Fowler, Bruce A.



Extended morphological processing: a practical method for automatic spot detection of biological markers from microscopic images  

PubMed Central

Background A reliable extraction technique for resolving multiple spots in light or electron microscopic images is essential in investigations of the spatial distribution and dynamics of specific proteins inside cells and tissues. Currently, automatic spot extraction and characterization in complex microscopic images poses many challenges to conventional image processing methods. Results A new method to extract closely located, small target spots from biological images is proposed. This method starts with a simple but practical operation based on the extended morphological top-hat transformation to subtract an uneven background. The core of our novel approach is the following: first, the original image is rotated in an arbitrary direction and each rotated image is opened with a single straight line-segment structuring element. Second, the opened images are unified and then subtracted from the original image. To evaluate these procedures, model images of simulated spots with closely located targets were created and the efficacy of our method was compared to that of conventional morphological filtering methods. The results showed the better performance of our method. The spots of real microscope images can be quantified to confirm that the method is applicable in a given practice. Conclusions Our method achieved effective spot extraction under various image conditions, including aggregated target spots, poor signal-to-noise ratio, and large variations in the background intensity. Furthermore, it has no restrictions with respect to the shape of the extracted spots. The features of our method allow its broad application in biological and biomedical image information analysis. PMID:20615231



Intelligent Best Practices Analysis Shahab D. Mohaghegh, Ph.D.  

E-print Network

Intelligent Best Practices Analysis. Shahab D. Mohaghegh, Ph.D., West Virginia University. A New... was performed both for gas and oil bearing formations. The Best... algorithms and fuzzy logic to achieve its objective

Mohaghegh, Shahab


A Deliberate Practice Approach to Teaching Phylogenetic Analysis  

ERIC Educational Resources Information Center

One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we…

Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.



A Model of Practice in Special Education: Dynamic Ecological Analysis  

ERIC Educational Resources Information Center

Dynamic Ecological Analysis (DEA) is a model of practice which increases a team's efficacy by enabling the development of more effective interventions through collaboration and collective reflection. This process has proved to be useful in: a) clarifying thinking and problem-solving, b) transferring knowledge and thinking to significant parties,…

Hannant, Barbara; Lim, Eng Leong; McAllum, Ruth



Contemporary Nutritional Attitudes and Practices: A Factor Analysis Approach  

Microsoft Academic Search

The results reported here are based upon a survey of the nutritional attitudes and practices of a sample of adults aged between 18 and 74 years. The scaled responses to two inventories of statements were subjected to a factor analysis in order to assess the extent to which it is possible to identify a set of coherent dimensions which underlie…




Methods for genetic linkage analysis using trisomies  

SciTech Connect

Certain genetic disorders (e.g. congenital cataracts, duodenal atresia) are rare in the general population, but more common in people with Down's syndrome. We present a method for using individuals with trisomy 21 to map genes for such traits. Our methods are analogous to methods for mapping autosomal dominant traits using affected relative pairs by looking for markers with greater than expected identity-by-descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected reduction to homozygosity in the chromosomes inherited from the non-disjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the gene, a confidence interval for that distance, and methods for computing power and sample sizes. The methods are described in the context of a gene-dosage model for the etiology of the disorder, but can be extended to other models. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers, how to test candidate genes, and how to handle the effect of reduced recombination associated with maternal meiosis I non-disjunction.

Feingold, E.; Lamb, N.E.; Sherman, S.L. [Emory Univ., Atlanta, GA (United States)
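The linkage test sketched in this record amounts to asking whether reduction to homozygosity at a marker occurs more often than chance predicts. A minimal illustration using an exact one-sided binomial tail (the null proportion of 0.5 is a placeholder for the example, not the paper's value, which depends on recombination and the non-disjunction mechanism):

```python
from math import comb

def binomial_tail(n_obs, n, p):
    """P(X >= n_obs) for X ~ Binomial(n, p): the chance of seeing at
    least this many reduced-to-homozygosity markers under no linkage."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n_obs, n + 1))

# e.g. 30 of 40 trisomic individuals reduced to homozygosity at the marker
p_value = binomial_tail(30, 40, 0.5)
```

A small tail probability would suggest linkage between marker and trait locus; in practice, partially informative markers and the maternal meiosis I recombination effect noted above require corrections beyond this sketch.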



Practical, fast Monte Carlo statistical static timing analysis: why and how  

Microsoft Academic Search

Statistical static timing analysis (SSTA) has emerged as an essential tool for nanoscale designs. Monte Carlo methods are universally employed to validate the accuracy of the approximations made in all SSTA tools, but Monte Carlo itself is never employed as a strategy for practical SSTA. It is widely believed to be "too slow" - despite an uncomfortable lack of rigorous…

Amith Singhee; Sonia Singhal; Rob A. Rutenbar




A Meta-Analysis of Published School Social Work Practice Studies: 1980-2007  

ERIC Educational Resources Information Center

Objective: This systematic review examined the effectiveness of school social work practices using meta-analytic techniques. Method: Hierarchical linear modeling software was used to calculate overall effect size estimates as well as test for between-study variability. Results: A total of 21 studies were included in the final analysis.…

Franklin, Cynthia; Kim, Johnny S.; Tripodi, Stephen J.



Practical evaluation of Mung bean seed pasteurization method in Japan.  


The majority of the seed sprout-related outbreaks have been associated with Escherichia coli O157:H7 and Salmonella. Therefore, an effective method for inactivating these organisms on the seeds before sprouting is needed. The current pasteurization method for mung beans in Japan (hot water treatment at 85 degrees C for 10 s) was more effective for disinfecting inoculated E. coli O157:H7, Salmonella, and nonpathogenic E. coli on mung bean seeds than was the calcium hypochlorite treatment (20,000 ppm for 20 min) recommended by the U.S. Food and Drug Administration. Hot water treatment at 85 degrees C for 40 s followed by dipping in cold water for 30 s and soaking in chlorine water (2,000 ppm) for 2 h reduced the pathogens to undetectable levels, and no viable pathogens were found in a 25-g enrichment culture and during the sprouting process. Practical tests using a working pasteurization machine with nonpathogenic E. coli as a surrogate produced similar results. The harvest yield of the treated seed was within the acceptable range. These treatments could be a viable alternative to the presently recommended 20,000-ppm chlorine treatment for mung bean seeds. PMID:20377967

Bari, M L; Enomoto, K; Nei, D; Kawamoto, S



Diagnostic Methods for Bile Acid Malabsorption in Clinical Practice  

PubMed Central

Altered bile acid (BA) concentrations in the colon may cause diarrhea or constipation. BA malabsorption (BAM) accounts for >25% of patients with irritable bowel syndrome (IBS) with diarrhea and chronic diarrhea in Western countries. As BAM is increasingly recognized, proper diagnostic methods are desired in clinical practice to help direct the most effective treatment course for the chronic bowel dysfunction. This review appraises the methodology, advantages and disadvantages of 4 tools that directly measure BAM: 14C-glycocholate breath and stool test, 75Selenium HomotauroCholic Acid Test (SeHCAT), 7α-hydroxy-4-cholesten-3-one (C4) and fecal BAs. 14C-glycocholate is a laborious test no longer widely utilized. 75SeHCAT is validated, but not available in the United States. Serum C4 is a simple, accurate method that is applicable to a majority of patients, but requires further clinical validation. Fecal measurements to quantify total and individual fecal BAs are technically cumbersome and not widely available. Regrettably, none of these tests are routinely available in the U.S., and a therapeutic trial with a BA binder is used as a surrogate for diagnosis of BAM. Recent data suggest there is an advantage to studying fecal excretion of the individual BAs and their role in BAM; this may constitute a significant advantage of the fecal BA method over the other tests. Fecal BA test could become a routine addition to fecal fat measurement in patients with unexplained diarrhea. In summary, availability determines the choice of test among C4, SeHCAT and fecal BA; more widespread availability of such tests would enhance clinical management of these patients. PMID:23644387

Vijayvargiya, Priya; Camilleri, Michael; Shin, Andrea; Saenger, Amy



Method of photon spectral analysis  


A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) as well as high-energy γ rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.

Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.
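The linear least-squares spectral fitting mentioned for the L x-ray region can be sketched generically: the measured spectrum is modeled as a linear combination of known component line shapes, and the amplitudes are solved for in one least-squares step (synthetic Gaussian components for illustration; this is not the patent's fitting code):

```python
import numpy as np

def fit_amplitudes(spectrum, components):
    """Solve min ||A x - y||^2, where the columns of A are the
    known component line shapes and x holds their amplitudes."""
    A = np.column_stack(components)
    amps, *_ = np.linalg.lstsq(A, spectrum, rcond=None)
    return amps

channels = np.linspace(0.0, 100.0, 500)
peak = lambda mu, sigma: np.exp(-((channels - mu) ** 2) / (2 * sigma**2))

components = [peak(30.0, 3.0), peak(60.0, 4.0)]       # known line shapes
spectrum = 3.0 * components[0] + 5.0 * components[1]  # synthetic measurement
amps = fit_amplitudes(spectrum, components)
```

With overlapping peaks, the simultaneous fit separates contributions that a channel-by-channel integration would confound.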





Exhaled breath analysis: physical methods, instruments, and medical diagnostics  

NASA Astrophysics Data System (ADS)

This paper reviews the analysis of exhaled breath, a rapidly growing field in noninvasive medical diagnostics that lies at the intersection of physics, chemistry, and medicine. Current data are presented on gas markers in human breath and their relation to human diseases. Various physical methods for breath analysis are described. It is shown how measurement precision and data volume requirements have stimulated technological developments and identified the problems that have to be solved to put this method into clinical practice.

Vaks, V. L.; Domracheva, E. G.; Sobakinskaya, E. A.; Chernyaeva, M. B.



A practical method for determining organ dose during CT examination.  


A practical method, based on depth dose, for determining organ dose during computed tomography (CT) examination is presented. For 4-slice spiral CT scans, performed at radii of 0, 37.5, 75.0, 112.5, and 150.0 mm, measurement of depth dose has been made using thermoluminescent dosimeters (TLDs) inserted into a modified International Electrotechnical Commission (IEC) standard dosimetry phantom and also additional TLDs placed on the surface of the phantom. A regression equation linking dose with distance from the center of the phantom has been formulated, from which dose to a point of interest relative to the surface dose can also be calculated. The approximation reflects the attenuation properties of X-rays in the phantom. Using the equation, an estimate of organ dose can be ascertained for CT examination, assuming water equivalence of human tissue and a known organ position and volume. Using the 4-slice spiral scanner, relative doses to a patient's lung have been calculated, the location and size of the lung in vivo being found from the CT scan image, and the lung being divided into 38 segments to calculate the relative dose. Results from our test case show the dose to the lung to have been 69+/-13% of surface dose. PMID:16979343

Cheung, Tsang; Cheng, Qijun; Feng, Dinghua
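The regression step described above can be illustrated with a log-linear fit: if dose falls off roughly exponentially with depth, fitting log(dose) against radius recovers the attenuation coefficient, and dose anywhere can then be expressed relative to the surface dose (synthetic TLD readings; the 0.004/mm coefficient is an arbitrary assumption for the example, not the paper's result):

```python
import numpy as np

radii = np.array([0.0, 37.5, 75.0, 112.5, 150.0])   # TLD positions, mm (as in the record)
doses = 100.0 * np.exp(0.004 * (radii - 150.0))     # synthetic depth-dose readings

# fit log(dose) = k * r + b, i.e. dose = exp(b) * exp(k * r)
k, b = np.polyfit(radii, np.log(doses), 1)

def relative_dose(r, surface_r=150.0):
    """Dose at radius r as a fraction of the surface (r = 150 mm) dose."""
    return np.exp(k * (r - surface_r))
```

An organ's relative dose then follows by evaluating `relative_dose` over the organ's segments and averaging, as the paper does for 38 lung segments.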



Meta-analysis: Historical Origins and Contemporary Practice.  

ERIC Educational Resources Information Center

The early and recent history of meta-analysis is outlined. After providing a definition of meta-analysis and listing its major characteristics, developments in statistics and research are described that influenced the formulation of modern meta-analytic methods. Major meta-analytic methods currently in use are described. Statistical and other…

Kulik, James A.; Kulik, Chen-Lin C.


Adapting Job Analysis Methodology to Improve Evaluation Practice  

ERIC Educational Resources Information Center

This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

Jenkins, Susan M.; Curtin, Patrick



44 CFR 9.9 - Analysis and reevaluation of practicable alternatives.  

Code of Federal Regulations, 2010 CFR

...FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.9 Analysis and reevaluation...requirements to avoid floodplains and wetlands unless there is no practicable alternative...practicable alternative, or the floodplain or wetland is itself not a practicable...



44 CFR 9.9 - Analysis and reevaluation of practicable alternatives.  

Code of Federal Regulations, 2011 CFR

...FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.9 Analysis and reevaluation...requirements to avoid floodplains and wetlands unless there is no practicable alternative...practicable alternative, or the floodplain or wetland is itself not a practicable...



44 CFR 9.9 - Analysis and reevaluation of practicable alternatives.  

Code of Federal Regulations, 2013 CFR

...FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.9 Analysis and reevaluation...requirements to avoid floodplains and wetlands unless there is no practicable alternative...practicable alternative, or the floodplain or wetland is itself not a practicable...



44 CFR 9.9 - Analysis and reevaluation of practicable alternatives.  

Code of Federal Regulations, 2014 CFR

...FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.9 Analysis and reevaluation...requirements to avoid floodplains and wetlands unless there is no practicable alternative...practicable alternative, or the floodplain or wetland is itself not a practicable...



44 CFR 9.9 - Analysis and reevaluation of practicable alternatives.  

Code of Federal Regulations, 2012 CFR

...FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.9 Analysis and reevaluation...requirements to avoid floodplains and wetlands unless there is no practicable alternative...practicable alternative, or the floodplain or wetland is itself not a practicable...



Flow methods in chiral analysis.  


The methods used for the separation and analytical determination of individual isomers are based on interactions with substances exhibiting optical activity. The currently used methods for the analysis of optically active compounds are primarily high-performance separation methods, such as gas and liquid chromatography using chiral stationary phases or chiral selectors in the mobile phase, and highly efficient electromigration techniques, such as capillary electrophoresis using chiral selectors. Chemical sensors and biosensors may also be designed for the analysis of optically active compounds. As enantiomers of the same compound are characterised by almost identical physico-chemical properties, their differentiation/separation in one-step unit operation in steady-state or dynamic flow systems requires the use of highly effective chiral selectors. Examples of such determinations are reviewed in this paper, based on 105 references. The greatest successes for isomer determination involve immunochemical interactions, enantioselectivity of the enzymatic biocatalytic processes, and interactions with ion-channel receptors or molecularly imprinted polymers. Conducting such processes under dynamic flow conditions may significantly enhance the differences in the kinetics of such processes, leading to greater differences in the signals recorded for enantiomers. Such determinations in flow conditions are effectively performed using surface-plasmon resonance and piezoelectric detections, as well as using common spectroscopic and electrochemical detections. PMID:24139575

Trojanowicz, Marek; Kaniewska, Marzena



Practical Aspects of the Equation-Error Method for Aircraft Parameter Estimation  

NASA Technical Reports Server (NTRS)

Various practical aspects of the equation-error approach to aircraft parameter estimation were examined. The analysis was based on simulated flight data from an F-16 nonlinear simulation, with realistic noise sequences added to the computed aircraft responses. This approach exposes issues related to the parameter estimation techniques and results, because the true parameter values are known for simulation data. The issues studied include differentiating noisy time series, maximum likelihood parameter estimation, biases in equation-error parameter estimates, accurate computation of estimated parameter error bounds, comparisons of equation-error parameter estimates with output-error parameter estimates, analyzing data from multiple maneuvers, data collinearity, and frequency-domain methods.

Morelli, Eugene A.
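The equation-error approach treats parameter estimation as linear regression on the model equations. A toy illustration for a pitching-moment model (the coefficient names, values, and noise level are invented for the example, not taken from the F-16 study):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
alpha = rng.uniform(-0.1, 0.1, n)       # angle of attack, rad
qhat = rng.uniform(-0.05, 0.05, n)      # nondimensional pitch rate
true = np.array([0.02, -0.8, -12.0])    # Cm0, Cm_alpha, Cm_q (invented)

# "measured" pitching-moment coefficient with sensor noise
Cm = true[0] + true[1] * alpha + true[2] * qhat + rng.normal(0.0, 1e-3, n)

# equation-error estimate: ordinary least squares on the model regressors
X = np.column_stack([np.ones(n), alpha, qhat])
theta, *_ = np.linalg.lstsq(X, Cm, rcond=None)
```

The biases and noise-differentiation issues studied in the paper arise because real regressors (e.g. differentiated time series) are themselves noisy, unlike this idealized case.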



Voltammetric analysis apparatus and method  


An apparatus and method are described for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced apart evenly from each other and the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

Almon, A.C.



Neuroscience and the Feldenkrais Method: evidence in research and clinical practice  

E-print Network

Some say evidence-based practice stifles the creative therapies and learning modalities. Neuroscience and the Feldenkrais Method: evidence in research and clinical practice. The Feldenkrais Method draws on principles of exploratory practice rather than prescribed exercises and can work at different…

Hickman, Mark


Practice patterns in FNA technique: A survey analysis  

PubMed Central

AIM: To ascertain fine needle aspiration (FNA) techniques by endosonographers with varying levels of experience and environments. METHODS: A survey study was performed on United States based endosonographers. The subjects completed an anonymous online electronic survey. The main outcome measurements were differences in needle choice, FNA technique, and clinical decision making among endosonographers and how this relates to years in practice, volume of EUS-FNA procedures, and practice environment. RESULTS: A total of 210 (30.8%) endosonographers completed the survey. Just over half (51.4%) identified themselves as academic/university-based practitioners. The vast majority of respondents (77.1%) identified themselves as high-volume endoscopic ultrasound (EUS) (> 150 EUS/year) and high-volume FNA (> 75 FNA/year) performers (73.3%). If final cytology is non-diagnostic, high-volume EUS physicians were more likely than low volume physicians to repeat FNA with a core needle (60.5% vs 31.2%; P = 0.0004), and low volume physicians were more likely to refer patients for either surgical or percutaneous biopsy, (33.4% vs 4.9%, P < 0.0001). Academic physicians were more likely to repeat FNA with a core needle (66.7%) compared to community physicians (40.2%, P < 0.001). CONCLUSION: There is significant variation in EUS-FNA practices among United States endosonographers. Differences appear to be related to EUS volume and practice environment. PMID:25324922

DiMaio, Christopher J; Buscaglia, Jonathan M; Gross, Seth A; Aslanian, Harry R; Goodman, Adam J; Ho, Sammy; Kim, Michelle K; Pais, Shireen; Schnoll-Sussman, Felice; Sethi, Amrita; Siddiqui, Uzma D; Robbins, David H; Adler, Douglas G; Nagula, Satish



Cluster analysis in community research: Epistemology and practice  

Microsoft Academic Search

Cluster analysis refers to a family of methods for identifying cases with distinctive characteristics in heterogeneous samples and combining them into homogeneous groups. This approach provides a great deal of information about the types of cases and the distributions of variables in a sample. This paper considers cluster analysis as a quantitative complement to the traditional linear statistics that often…

Bruce D. Rapkin; Douglas A. Luke



Canonical Correlation Analysis: An Explanation with Comments on Correct Practice.  

ERIC Educational Resources Information Center

This paper briefly explains the logic underlying the basic calculations employed in canonical correlation analysis. A small hypothetical data set is employed to illustrate that canonical correlation analysis subsumes both univariate and multivariate parametric methods. Several real data sets are employed to illustrate other themes. Three common…

Thompson, Bruce


Coal Field Fire Fighting - Practiced methods, strategies and tactics  

NASA Astrophysics Data System (ADS)

Subsurface coal fires destroy millions of tons of coal each year, have an immense impact on the ecological surroundings, and threaten further coal reserves. Because of the enormous dimensions a coal seam fire can develop, high operational expenses are incurred. As part of the Sino-German coal fire research initiative "Innovative technologies for exploration, extinction and monitoring of coal fires in Northern China", the research team of the University of Wuppertal (BUW) focuses on fire extinction strategies and tactics as well as aspects of environmental and health safety. Besides the choice and the correct application of different extinction techniques, further factors are essential for successful extinction: appropriate tactics, well-trained and protected personnel, and the best-fitting extinguishing agents. The strategy chosen for an extinction campaign is generally determined by urgency and importance. It may depend on national objectives and concepts of coal conservation; on environmental protection (e.g. commitments to greenhouse gas (GHG) reductions); on national funding and resources for fire fighting (e.g. personnel, infrastructure, vehicles, water pipelines); and on computer-aided models and simulations of coal fire development from self-ignition to extinction. In order to devise an optimal fire fighting strategy, "aims of protection" have to be defined in a first step. These may be: directly affected coal seams; neighboring seams and coalfields; GHG emissions into the atmosphere; returns on investment (costs of fire fighting compared to the value of saved coal). In a further step, it is imperative to decide whether the budget shall define the results or the results define the budget; i.e.
whether there are fixed objectives for the mission that will dictate the overall budget, or whether the limited resources available shall set the scope within which the best possible results are to be achieved. Effective and efficient fire fighting requires optimal tactics, which can be divided into four fundamental tactics to control fire hazards: defense (digging away the coal so that it cannot begin to burn, or forming a barrier so that the fire cannot reach the unburnt coal); rescue of the coal (mining a seam that is not yet burning); attack (active and direct cooling of the burning seam); and retreat (monitoring only, until self-extinction of the burning seam). The last is used when a fire exceeds the organizational and/or technical scope of a mission. In other words, "to control a coal fire" does not automatically and in all situations mean "to extinguish a coal fire". Best-practice tactics, or a combination of them, can be selected for control of a particular coal fire. For the extinguishing works, different extinguishing agents are available. They can be applied by different application techniques and with varying operating expenses. One application method may be the drilling of boreholes from the surface, or covering the surface with low-permeability soils. The extinction agents mainly used for coal field fires are: water (with or without additives), slurry, foaming mud/slurry, inert gases, dry chemicals and materials, and cryogenic agents. Because of its tremendous dimensions and complexity, the worldwide challenge of coal fires is absolutely unique; it can only be met with functional application methods, best-fitting strategies and tactics, organisation and research, as well as the dedication of the involved fire fighters, who work under extreme individual risk on the burning coal fields.

Wündrich, T.; Korten, A. A.; Barth, U. H.



Overhead analysis in a surgical practice: a brief communication.  


Evaluating overhead is an essential part of any business, including that of the surgeon. By examining each component of overhead, the surgeon will have a better grasp of the profitability of his or her practice. The overhead discussed in this article includes health insurance, overtime, supply costs, rent, advertising and marketing, telephone costs, and malpractice insurance. While the importance of evaluating and controlling overhead in a business is well understood, few know that overhead increases do not always imply increased expenses. National standards have been provided by the Medical Group Management Association. One method of evaluating overhead is to calculate the amount spent in terms of percent of net revenue. Net revenue includes income from patients, from interest, and from insurers less refunds. Another way for surgeons to evaluate their practice is to calculate income and expenses for two years, then calculate the variance between the two years and the percentage of variance to see where they stand. PMID:16968190

Frezza, Eldo E
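The two calculations described in this record are simple arithmetic; a minimal sketch (the dollar figures are invented for illustration):

```python
def overhead_percent(expense, net_revenue):
    """An overhead component expressed as a percent of net revenue
    (net revenue = patient + interest + insurer income, less refunds)."""
    return 100.0 * expense / net_revenue

def year_over_year(prev_year, curr_year):
    """Absolute variance and percent variance between two years."""
    variance = curr_year - prev_year
    return variance, 100.0 * variance / prev_year

rent_share = overhead_percent(60_000, 400_000)   # rent as a share of net revenue
delta, pct = year_over_year(100_000, 112_000)    # expense change year over year
```

Each overhead component (rent, supplies, malpractice insurance, etc.) can be expressed this way and compared against the Medical Group Management Association benchmarks mentioned above.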



Parallel Processable Cryptographic Methods with Unbounded Practical Security.  

ERIC Educational Resources Information Center

Addressing the problem of protecting confidential information and data stored in computer databases from access by unauthorized parties, this paper details coding schemes which present such astronomical work factors to potential code breakers that security breaches are hopeless in any practical sense. Two procedures which can be used to encode for…

Rothstein, Jerome


Comparison of Manual Versus Automated Data Collection Method for an Evidence-Based Nursing Practice Study  

PubMed Central

Objective The objective of this study was to investigate and improve the use of automated data collection procedures for nursing research and quality assurance. Methods A descriptive, correlational study analyzed 44 orthopedic surgical patients who were part of an evidence-based practice (EBP) project examining post-operative oxygen therapy at a Midwestern hospital. The automation work attempted to replicate a manually-collected data set from the EBP project. Results Automation was successful in replicating data collection for study data elements that were available in the clinical data repository. The automation procedures identified 32 “false negative” patients who met the inclusion criteria described in the EBP project but were not selected during the manual data collection. Automating data collection for certain data elements, such as oxygen saturation, proved challenging because of workflow and practice variations and the reliance on disparate sources for data abstraction. Automation also revealed instances of human error including computational and transcription errors as well as incomplete selection of eligible patients. Conclusion Automated data collection for analysis of nursing-specific phenomena is potentially superior to manual data collection methods. Creation of automated reports and analysis may require initial up-front investment with collaboration between clinicians, researchers and information technology specialists who can manage the ambiguities and challenges of research and quality assurance work in healthcare. PMID:23650488

Byrne, M.D.; Jordan, T.R.; Welle, T.



Promoting nurses' knowledge in evidence-based practice: do educational methods matter?  


Evidence-based practice (EBP) is a mandate for nursing practice. Education on EBP has occurred in academic settings, but not all nurses have received this training. The authors describe a randomized controlled pretest/posttest design testing the differences in effectiveness of two educational methods to improve nurses' knowledge, attitudes, and practice of EBP. Results indicated both methods improved self-reported practice. On the basis of the study findings, staff development educators can select the teaching method that best complements their organizational environment. PMID:23877287

Toole, Belinda M; Stichler, Jaynelle F; Ecoff, Laurie; Kath, Lisa



Articulating current service development practices: a qualitative analysis of eleven mental health projects  

PubMed Central

Background The utilisation of good design practices in the development of complex health services is essential to improving quality. Healthcare organisations, however, are often seriously out of step with modern design thinking and practice. As a starting point to encourage the uptake of good design practices, it is important to understand the context of their intended use. This study aims to do that by articulating current health service development practices. Methods Eleven service development projects carried out in a large mental health service were investigated through in-depth interviews with six operation managers. The critical decision method in conjunction with diagrammatic elicitation was used to capture descriptions of these projects. Stage-gate design models were then formed to visually articulate, classify and characterise different service development practices. Results Projects were grouped into three categories according to design process patterns: new service introduction and service integration; service improvement; service closure. Three common design stages: problem exploration, idea generation and solution evaluation - were then compared across the design process patterns. Consistent across projects were a top-down, policy-driven approach to exploration, underexploited idea generation and implementation-based evaluation. Conclusions This study provides insight into where and how good design practices can contribute to the improvement of current service development practices. Specifically, the following suggestions for future service development practices are made: genuine user needs analysis for exploration; divergent thinking and innovative culture for idea generation; and fail-safe evaluation prior to implementation. Better training for managers through partnership working with design experts and researchers could be beneficial. PMID:24438471



Flow analysis system and method  

NASA Technical Reports Server (NTRS)

A non-invasive flow analysis system and method wherein a sensor, such as an acoustic sensor, is coupled to a conduit for transmitting a signal which varies depending on the characteristics of the flow in the conduit. The signal is amplified and there is a filter, responsive to the sensor signal, and tuned to pass a narrow band of frequencies proximate the resonant frequency of the sensor. A demodulator generates an amplitude envelope of the filtered signal and a number of flow indicator quantities are calculated based on variations in amplitude of the amplitude envelope. A neural network, or its equivalent, is then used to determine the flow rate of the flow in the conduit based on the flow indicator quantities.

Hill, Wayne S. (Inventor); Barck, Bruce N. (Inventor)
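The signal chain described in the abstract (band-pass filtering, demodulation to an amplitude envelope, flow-indicator quantities, then a neural network) can be sketched in miniature. This is an illustrative reconstruction, not the patented implementation: the function names, the rectify-and-smooth envelope approximation, and the window size are all assumptions of this sketch, and the neural-network stage is omitted.

```python
import math

def amplitude_envelope(signal, window=16):
    """Approximate the amplitude envelope of a band-passed sensor signal
    by full-wave rectification followed by a moving average."""
    rect = [abs(x) for x in signal]
    half = window // 2
    env = []
    for i in range(len(rect)):
        lo, hi = max(0, i - half), min(len(rect), i + half + 1)
        env.append(sum(rect[lo:hi]) / (hi - lo))
    return env

def flow_indicators(env):
    """Flow-indicator quantities based on variations in envelope amplitude."""
    n = len(env)
    mean = sum(env) / n
    var = sum((x - mean) ** 2 for x in env) / n
    return {"mean": mean,
            "std": math.sqrt(var),
            "cv": math.sqrt(var) / mean if mean else 0.0}

# Synthetic example: a slowly amplitude-modulated tone standing in for
# the filtered sensor signal.
sig = [(1.0 + 0.3 * math.sin(2 * math.pi * 0.5 * t / 400.0))
       * math.sin(2 * math.pi * 5 * t / 400.0) for t in range(400)]
ind = flow_indicators(amplitude_envelope(sig))
```

In the patent, indicator quantities like these would then feed a neural network that maps them to a flow rate.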



A Practical Guide to Interpretation of Large Collections of Incident Narratives Using the QUORUM Method  

NASA Technical Reports Server (NTRS)

Analysis of incident reports plays an important role in aviation safety. Typically, a narrative description, written by a participant, is a central part of an incident report. Because there are so many reports, and the narratives contain so much detail, it can be difficult to efficiently and effectively recognize patterns among them. Recognizing and addressing recurring problems, however, is vital to continuing safety in commercial aviation operations. A practical way to interpret large collections of incident narratives is to apply the QUORUM method of text analysis, modeling, and relevance ranking. In this paper, QUORUM text analysis and modeling are surveyed, and QUORUM relevance ranking is described in detail with many examples. The examples are based on several large collections of reports from the Aviation Safety Reporting System (ASRS) database, and a collection of news stories describing the disaster of TWA Flight 800, the Boeing 747 which exploded in mid-air and crashed near Long Island, New York, on July 17, 1996. Reader familiarity with this disaster should make the relevance-ranking examples more understandable. The ASRS examples illustrate the practical application of QUORUM relevance ranking.

McGreevy, Michael W.
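As a rough illustration of relevance ranking over incident narratives, the sketch below ranks texts against a query by cosine similarity of word-count vectors. This is a deliberately simplified stand-in, not QUORUM: QUORUM builds contextual term co-occurrence models, which this bag-of-words version does not attempt to reproduce, and all names here are this sketch's own.

```python
import math
from collections import Counter

def cosine_rank(query, narratives):
    """Rank narrative texts by cosine similarity between word-count
    vectors of the query and each narrative (bag-of-words model)."""
    def vec(text):
        return Counter(text.lower().split())

    q = vec(query)
    q_norm = math.sqrt(sum(v * v for v in q.values()))

    def score(text):
        d = vec(text)
        dot = sum(q[w] * d[w] for w in q)
        d_norm = math.sqrt(sum(v * v for v in d.values()))
        return dot / (q_norm * d_norm) if q_norm and d_norm else 0.0

    return sorted(narratives, key=score, reverse=True)

reports = ["altitude deviation noticed during cruise",
           "crew reported an engine fire during the takeoff roll"]
ranked = cosine_rank("engine fire on takeoff", reports)
```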



Analysis of flight equipment purchasing practices of representative air carriers  

NASA Technical Reports Server (NTRS)

The process through which representative air carriers decide whether or not to purchase flight equipment was investigated, as well as their practices and policies in retiring surplus aircraft. An analysis of the flight equipment investment decision process in ten airlines shows that for the airline industry as a whole, the flight equipment investment decision is in a state of transition from a wholly informal process in the earliest years to a much more organized and structured process in the future. Individual air carriers are in different stages with respect to the formality and sophistication associated with the flight equipment investment decision.



On Practical Results of the Differential Power Analysis  

NASA Astrophysics Data System (ADS)

This paper describes practical differential power analysis attacks. Successful and unsuccessful attack attempts are presented, together with a description of the attack methodology. It provides relevant information about oscilloscope settings, optimization possibilities and fundamental attack principles, which are important when realizing this type of attack. The attack was conducted on the PIC18F2420 microcontroller, using the AES cryptographic algorithm in ECB mode with a 128-bit key length. We used two implementations of this algorithm: one in the C programming language and one in assembler.

Breier, Jakub; Kleja, Marcel



Assessing Student Perception of Practice Evaluation Knowledge in Introductory Research Methods  

ERIC Educational Resources Information Center

The authors explored the use of the Practice Evaluation Knowledge Scale (PEKS) to assess student perception of acquisition and retention of practice evaluation knowledge from an undergraduate research methods class. The authors sampled 2 semesters of undergraduate social work students enrolled in an introductory research methods course.…

Baker, Lisa R.; Pollio, David E.; Hudson, Ashley



Investigating the Efficacy of Practical Skill Teaching: A Pilot-Study Comparing Three Educational Methods  

ERIC Educational Resources Information Center

Effective education of practical skills can alter clinician behaviour, positively influence patient outcomes, and reduce the risk of patient harm. This study compares the efficacy of two innovative practical skill teaching methods, against a traditional teaching method. Year three pre-clinical physiotherapy students consented to participate in a…

Maloney, Stephen; Storr, Michael; Paynter, Sophie; Morgan, Prue; Ilic, Dragan



Researching into Teaching Methods in Colleges and Universities. Practical Research Series.  

ERIC Educational Resources Information Center

This practical guide is one of a series aimed at assisting academics in higher education in researching specific aspects of their work. Focusing on small-scale insider research in colleges and universities, the handbook covers contemporary issues, research methods, and existing practice and values in the area of teaching methods. Strategies for…

Bennett, Clinton; And Others


Return on Investment in Electronic Health Records in Primary Care Practices: A Mixed-Methods Study  

PubMed Central

Background The use of electronic health records (EHR) in clinical settings is considered pivotal to a patient-centered health care delivery system. However, uncertainty in cost recovery from EHR investments remains a significant concern in primary care practices. Objective Guided by the question of “When implemented in primary care practices, what will be the return on investment (ROI) from an EHR implementation?”, the objectives of this study are two-fold: (1) to assess ROI from EHR in primary care practices and (2) to identify principal factors affecting the realization of positive ROI from EHR. We used a break-even point, that is, the time required to achieve cost recovery from an EHR investment, as an ROI indicator of an EHR investment. Methods Given the complexity exhibited by most EHR implementation projects, this study adopted a retrospective mixed-method research approach, particularly a multiphase study design approach. For this study, data were collected from community-based primary care clinics using EHR systems. Results We collected data from 17 primary care clinics using EHR systems. Our data show that the sampled primary care clinics recovered their EHR investments within an average period of 10 months (95% CI 6.2-17.4 months), seeing more patients with an average increase of 27% in the active-patients-to-clinician-FTE (full time equivalent) ratio and an average increase of 10% in the active-patients-to-clinical-support-staff-FTE ratio after an EHR implementation. Our analysis suggests, with a 95% confidence level, that the increase in the number of active patients (P=.006), the increase in the active-patients-to-clinician-FTE ratio (P<.001), and the increase in the clinic net revenue (P<.001) are positively associated with the EHR implementation, likely contributing substantially to an average break-even point of 10 months. Conclusions We found that primary care clinics can realize a positive ROI with EHR. 
Our analysis of the variances in the time required to achieve cost recovery from EHR investments suggests that a positive ROI does not appear automatically upon implementing an EHR and that a clinic’s ability to leverage EHR for process changes seems to play a role. Policies that provide support to help primary care practices successfully make EHR-enabled changes, such as support of clinic workflow optimization with an EHR system, could facilitate the realization of positive ROI from EHR in primary care practices. PMID:25600508

Sanche, Steven
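The break-even point used as the ROI indicator, the time required for cumulative gains to cover the EHR investment, is straightforward to compute. The sketch below uses hypothetical figures; the function name and the assumption of a constant monthly net gain are this sketch's own, not the study's.

```python
def break_even_months(investment, monthly_net_gain, max_months=120):
    """Return the first month in which cumulative net gain covers the
    initial EHR investment (the break-even point), or None if the
    investment is not recovered within the horizon."""
    cumulative = 0.0
    for month in range(1, max_months + 1):
        cumulative += monthly_net_gain
        if cumulative >= investment:
            return month
    return None

# Hypothetical clinic: $50,000 up-front cost and a constant $5,000/month
# net gain (e.g. from increased active-patients-to-FTE ratios).
months = break_even_months(50_000.0, 5_000.0)
```

With these made-up numbers the clinic breaks even in month 10; the study's observed average of 10 months came from measured clinic data, not a constant-gain model like this one.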



Image Analysis Model-Based Methods  

E-print Network

Fully Bayesian Analysis of Low-Count Astronomical Images. David A. van Dyk, Alanna Connors. Slide presentation covering image analysis, model-based methods, comparing and evaluating models, and low-count image analysis.

Wolfe, Patrick J.


Good practices in LIBS analysis: Review and advices  

NASA Astrophysics Data System (ADS)

This paper presents a review on the analytical results obtained by laser-induced breakdown spectroscopy (LIBS). In the first part, results on identification and classification of samples are presented including the risk of misclassification, and in the second part, results on concentration measurement based on calibration are accompanied with significant figures of merit including the concept of accuracy. Both univariate and multivariate approaches are discussed with special emphasis on the methodology, the way of presenting the results and the assessment of the methods. Finally, good practices are proposed for both classification and concentration measurement.

El Haddad, J.; Canioni, L.; Bousquet, B.



Investigating the efficacy of practical skill teaching: a pilot-study comparing three educational methods.  


Effective education of practical skills can alter clinician behaviour, positively influence patient outcomes, and reduce the risk of patient harm. This study compares the efficacy of two innovative practical skill teaching methods against a traditional teaching method. Year three pre-clinical physiotherapy students consented to participate in a randomised controlled trial, with concealed allocation and blinded participants and outcome assessment. Each of the three randomly allocated groups was exposed to a different practical skills teaching method (traditional, pre-recorded video tutorial or student self-video) for two specific practical skills during the semester. Clinical performance was assessed using an objective structured clinical examination (OSCE). The students were also administered a questionnaire to gauge the participants' level of satisfaction with the teaching method, and their perceptions of the teaching method's educational value. There were no significant differences in clinical performance between the three practical skill teaching methods as measured in the OSCE, or for student ratings of satisfaction. A significant difference existed between the methods for student ratings of perceived educational value, with the teaching approaches of pre-recorded video tutorial and student self-video being rated higher than 'traditional' live tutoring. Alternative teaching methods to traditional live tutoring can produce equivalent learning outcomes when applied to the practical skill development of undergraduate health professional students. The use of alternative practical skill teaching methods may allow for greater flexibility for both staff and infrastructure resource allocation. PMID:22354336

Maloney, Stephen; Storr, Michael; Paynter, Sophie; Morgan, Prue; Ilic, Dragan



Polydispersity analysis of Taylor dispersion data: the cumulant method  

E-print Network

Taylor dispersion analysis is an increasingly popular characterization method that measures the diffusion coefficient, and hence the hydrodynamic radius, of (bio)polymers, nanoparticles or even small molecules. In this work, we describe an extension to current data analysis schemes that allows size polydispersity to be quantified for an arbitrary sample, thereby significantly enhancing the potential of Taylor dispersion analysis. The method is based on a cumulant development similar to that used for the analysis of dynamic light scattering data. Specific challenges posed by the cumulant analysis of Taylor dispersion data are discussed, and practical ways to address them are proposed. We successfully test this new method by analyzing both simulated and experimental data for solutions of moderately polydisperse polymers and polymer mixtures.

Luca Cipelletti; Jean-Philippe Biron; Michel Martin; Hervé Cottet
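The core idea of a cumulant development, characterizing a polydisperse sample by the low-order cumulants of its diffusion-coefficient distribution, can be illustrated with a toy computation. This is not the authors' fitting scheme for taylorgrams; it is only a sketch, with assumed names, of how the first two cumulants yield a polydispersity index.

```python
def cumulant_polydispersity(values, weights=None):
    """First two cumulants (mean and variance) of a weighted
    distribution of diffusion coefficients, plus the dimensionless
    polydispersity index variance / mean**2."""
    if weights is None:
        weights = [1.0] * len(values)
    total = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / total
    var = sum(w * (v - mean) ** 2 for w, v in zip(weights, values)) / total
    return mean, var, var / mean ** 2

# Two equally weighted species with D = 1.0 and D = 3.0 (arbitrary units).
mean_d, var_d, pdi = cumulant_polydispersity([1.0, 3.0])
```

In the paper, the cumulants are extracted by fitting the measured taylorgram rather than from a known distribution as here.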



Methods of stability analysis in nonlinear mechanics  

SciTech Connect

We review our recent work on methods to study stability in nonlinear mechanics, especially for the problems of particle accelerators, and compare our ideals to those of other authors. We emphasize methods that (1) show promise as practical design tools, (2) are effective when the nonlinearity is large, and (3) have a strong theoretical basis. 24 refs., 2 figs., 2 tabs.

Warnock, R.L.; Ruth, R.D.; Gabella, W.; Ecklund, K.



[Magnetocardiography in clinical practice: algorithms and technologies for data analysis].  


This methodological work is the first part of a series of papers dedicated to a modern, promising method of non-invasive diagnostics in cardiology: magnetocardiography. A definition of the magnetocardiography method is given, and the levels of magnetocardiography data analysis as well as electrophysiological models are described. The most informative biomarkers and technologies for qualitative and quantitative interpretation of current density distribution maps and curves of total current magnitude are presented. A step-by-step algorithm used for MCG data analysis is proposed. PMID:22416359

Chaikovskii, I; Boichak, M; Sosnitskii, V; Miasnikov, G; Rykhlik, E; Sosnitskaia, T; Frolov, Iu; Budnik, V



Developing a practical toxicogenomics data analysis system utilizing open-source software.  


Comprehensive gene expression analysis has been applied to investigate the molecular mechanism of toxicity, which is generally known as toxicogenomics (TGx). When analyzing large-scale gene expression data obtained by microarray analysis, typical multivariate data analysis methods performed with commercial software, such as hierarchical clustering or principal component analysis, usually do not provide conclusive outputs by themselves. To best utilize the TGx data for toxicity evaluation in the drug development process, fit-for-purpose customization of the analytical algorithm, with a user-friendly interface and intuitive outputs, is required to practically address toxicologists' demands. However, commercial software is usually not very flexible in the customization of its functions or outputs. Owing to the recent advancement and accumulation of open-source software contributed by bioinformaticians all over the world, it has become easier for us to develop practical and fit-for-purpose analytical software ourselves with fairly low cost and effort. The aim of this article is to present an example of developing an automated TGx data processing system (ATP system), which implements gene set-level toxicogenomic profiling by the D-score method and generates straightforward output that makes it easy to interpret the biological and toxicological significance of the TGx data. Our example will provide basic clues for readers to develop and customize their own TGx data analysis system which complements the function of existing commercial software. PMID:23086850

Hirai, Takehiro; Kiyosawa, Naoki



Imaging laser analysis of building materials - practical examples  

SciTech Connect

Laser-induced breakdown spectroscopy (LIBS) is a supplement to and extension of standard chemical methods and SEM or micro-XRF (micro-RFA) applications for the evaluation of building materials. As a laboratory method, LIBS is used to gain color-coded images representing the composition and distribution of characteristic ions and/or the ingress characteristics of damaging substances. To create a depth profile of element concentration, a core has to be taken and split along the core axis. LIBS has been proven able to detect all important elements in concrete, e.g. chlorine, sodium or sulfur, which are responsible for certain degradation mechanisms, and also light elements like lithium or hydrogen. Practical examples are given and a mobile system for on-site measurements is presented.

Wilsch, G.; Schaurich, D.; Wiggenhauser, H. [BAM, Federal Institute for Materials Research and Testing, Berlin (Germany)



Propel: Tools and Methods for Practical Source Code Model Checking  

NASA Technical Reports Server (NTRS)

The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset is called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side-effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.

Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; O'Malley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem



Schwarz Preconditioners for Krylov Methods: Theory and Practice

SciTech Connect

Several numerical methods were produced and analyzed. The main thrust of the work relates to inexact Krylov subspace methods for the solution of linear systems of equations arising from the discretization of partial differential equations. These are iterative methods, i.e., where an approximation is obtained and improved at each step. Usually, a matrix-vector product is needed at each iteration. In the inexact methods, this product (or the application of a preconditioner) can be done inexactly. Schwarz methods, based on domain decompositions, are excellent preconditioners for these systems. We contributed towards their understanding from an algebraic point of view, developed new ones, and studied their performance in the inexact setting. We also worked on combinatorial problems to help define the algebraic partition of the domains, with the needed overlap, as well as PDE-constrained optimization using the above-mentioned inexact Krylov subspace methods.

Szyld, Daniel B.
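A minimal illustration of a Schwarz preconditioner inside a Krylov method: conjugate gradients on the 1D discrete Laplacian, preconditioned with a non-overlapping additive Schwarz (block Jacobi) method that solves each 2-point subdomain exactly. The problem size, block size, and all names are assumptions of this sketch; it omits the overlap and the inexactness studied in the report.

```python
def additive_schwarz_pcg(n=8, tol=1e-10, max_iter=50):
    """Preconditioned conjugate gradients on A x = b, where A is the 1D
    Laplacian tridiag(-1, 2, -1) and b = 1. The preconditioner is a
    non-overlapping additive Schwarz method: the domain is split into
    2-point subdomains (n assumed even), each local 2x2 problem is
    solved exactly, and the corrections are summed."""
    def matvec(x):
        y = []
        for i in range(n):
            v = 2 * x[i]
            if i > 0:
                v -= x[i - 1]
            if i + 1 < n:
                v -= x[i + 1]
            y.append(v)
        return y

    def precond(r):
        # Exact inverse of each block [[2, -1], [-1, 2]] is (1/3)[[2, 1], [1, 2]].
        z = [0.0] * n
        for i in range(0, n, 2):
            z[i] = (2 * r[i] + r[i + 1]) / 3.0
            z[i + 1] = (r[i] + 2 * r[i + 1]) / 3.0
        return z

    dot = lambda a, b: sum(u * v for u, v in zip(a, b))
    b = [1.0] * n
    x = [0.0] * n
    r = b[:]              # residual b - A x with x = 0
    z = precond(r)
    p = z[:]
    rz = dot(r, z)
    for it in range(max_iter):
        Ap = matvec(p)
        alpha = rz / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if dot(r, r) ** 0.5 < tol:
            return x, it + 1
        z = precond(r)
        rz_new = dot(r, z)
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x, max_iter
```

Block size 1 would reduce this preconditioner to plain Jacobi; larger, overlapping subdomains move it toward the Schwarz methods analyzed in the report.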



Mixed Methods in Nursing Research : An Overview and Practical Examples  

PubMed Central

Mixed methods research methodologies are increasingly applied in nursing research to strengthen the depth and breadth of understanding of nursing phenomena. This article describes the background and benefits of using mixed methods research methodologies, and provides two examples of nursing research that used mixed methods. Mixed methods research produces several benefits. The examples provided demonstrate specific benefits in the creation of a culturally congruent picture of chronic pain management for American Indians, and the determination of a way to assess cost for providing chronic pain care.

Doorenbos, Ardith Z.



A Function Analysis Model Based on Granular Computing and Practical Example in Product Data Management  

NASA Astrophysics Data System (ADS)

An effective product data management (PDM) platform is a critical environment for enhancing the competitiveness of enterprises. In order to quickly establish a customer-specified PDM platform for manufacturing enterprises, the guidance of effective methods is needed. This paper proposes a function analysis model based on granular computing to guide the establishment of a PDM platform. This function analysis model can describe the dynamic mapping design process, and the solution of the design problem is obtained through this process. To illustrate this model, a practical example of PDM in an auto parts manufacturing company is provided.

Chi-lan, Cai; Yue-wei, Bai; Yan-chun, Xia; Xiao-gang, Wang; Kai, Liu


Adapting community based participatory research (CBPR) methods to the implementation of an asthma shared decision making intervention in ambulatory practices  

PubMed Central

Objective Translating research findings into clinical practice is a major challenge to improve the quality of healthcare delivery. Shared decision making (SDM) has been shown to be effective but has not yet been widely adopted by health providers. This paper describes the participatory approach used to adapt and implement an evidence-based asthma SDM intervention into primary care practices. Methods A participatory research approach was initiated through partnership development between practice staff and researchers. The collaborative team worked together to adapt and implement a SDM toolkit. Using the RE-AIM framework and qualitative analysis, we evaluated both the implementation of the intervention into clinical practice, and the level of partnership that was established. Analysis included the number of adopting clinics and providers, the patients’ perception of the SDM approach, and the number of clinics willing to sustain the intervention delivery after 1 year. Results All six clinics and physician champions implemented the intervention using half-day dedicated asthma clinics while 16% of all providers within the practices have participated in the intervention. Themes from the focus groups included the importance of being part of the development process, belief that the intervention would benefit patients, and concerns around sustainability and productivity. One year after initiation, 100% of clinics have sustained the intervention, and 90% of participating patients reported a shared decision experience. Conclusions Use of a participatory research process was central to the successful implementation of a SDM intervention in multiple practices with diverse patient populations. PMID:24350877

Kuhn, Lindsay; Alkhazraji, Thamara; Steuerwald, Mark; Ludden, Tom; Wilson, Sandra; Mowrer, Lauren; Mohanan, Sveta; Dulin, Michael F.



A practical method of determining project risk contingency budgets  

Microsoft Academic Search

Abstract Given a collection of accepted risks with corresponding impacts and probabilities over the life of a project (or a relevant portion), a method to estimate the total potential impact at a given certainty is presented. Project sponsors and managers can decide their risk tolerance and set aside corresponding contingency funds. This analytic method uses a binomial distribution with a

D. F. Cioffi; H. Khamooshi
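The abstract is cut off before the details, but its stated goal, the total potential risk impact not exceeded at a given certainty, computed from accepted risks with probabilities and impacts, can be illustrated by simulation. Note the hedge: the paper uses an analytic binomial formulation, while this sketch is a Monte Carlo stand-in with hypothetical names and numbers.

```python
import random

def contingency_budget(risks, certainty=0.9, trials=20000, seed=1):
    """Monte Carlo estimate of a contingency budget: the total risk
    impact not exceeded with the given certainty. Each risk is a
    (probability, impact) pair and is assumed to occur independently."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        totals.append(sum(impact for p, impact in risks if rng.random() < p))
    totals.sort()
    return totals[min(int(certainty * trials), trials - 1)]

# Two hypothetical risks: 50% chance of a 100-unit impact and
# 20% chance of a 400-unit impact; budget at 85% certainty.
budget = contingency_budget([(0.5, 100.0), (0.2, 400.0)], certainty=0.85)
```

Project sponsors would compare such a quantile against their risk tolerance when setting aside contingency funds.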



[Office hysteroscopy: a new examination method in gynecological practice].  


Hysteroscopy is a widely used endoscopic method, the "gold standard" for the examination of uterine cavity changes. Office hysteroscopy is an intervention that allows the examination to be performed on an outpatient basis, in contrast to the traditional method. Because of the small diameter of the device, anesthesia is unnecessary, as there is no need to dilate the cervix. Indications for the examination are wide-ranging. Besides abnormal uterine bleeding, it can be used in the examination of infertility and of those intrauterine changes (polyps, submucous myomas, adhesions) diagnosed by other imaging methods that cause infertility complaints. The aim of our present review is to recommend the use of this method, because it needs no preparation and is minimally invasive. The traditional method, which requires longer preparation, observation, anesthesia and an operating theatre, is only suggested in cases of proven pathology. PMID:21177231

Török, Péter; Major, Tamás



A mixed methods exploration of the team and organizational factors that may predict new graduate nurse engagement in collaborative practice.  


Although engagement in collaborative practice is reported to support the role transition and retention of new graduate (NG) nurses, it is not known how to promote collaborative practice among these nurses. This mixed methods study explored the team and organizational factors that may predict NG nurse engagement in collaborative practice. A total of 514 NG nurses from Ontario, Canada completed the Collaborative Practice Assessment Tool. Sixteen NG nurses participated in follow-up interviews. The team and organizational predictors of NG engagement in collaborative practice were as follows: satisfaction with the team (β = 0.278; p = 0.000), number of team strategies (β = 0.338; p = 0.000), participation in a mentorship or preceptorship experience (β = 0.137; p = 0.000), accessibility of manager (β = 0.123; p = 0.001), and accessibility and proximity of educator or professional practice leader (β = 0.126; p = 0.001 and β = 0.121; p = 0.002, respectively). Qualitative analysis revealed the team facilitators to be respect, team support and face-to-face interprofessional interactions. Organizational facilitators included supportive leadership, participation in a preceptorship or mentorship experience and time. Interventions designed to facilitate NG engagement in collaborative practice should consider these factors. PMID:24195680

Pfaff, Kathryn A; Baxter, Pamela E; Ploeg, Jenny; Jack, Susan M



Pedagogical Practices and Counselor Self-Efficacy: A Mixed Methods Investigation

ERIC Educational Resources Information Center

The current study investigated the Lecture Teaching Method and Socratic Teaching Method to determine if there was a relationship between pedagogical methods and Counselor Self-Efficacy (CSE). A course in Advanced Professional Development was utilized to determine if teaching methods could affect student perceptions of competence to practice

Brogan, Justin R.



[Evidence-based practices published in Brazil: identification and analysis of their types and methodological approaches].


This is an integrative review of Brazilian studies on evidence-based practices (EBP) in health, published in ISI/JCR journals in the last 10 years. The aim was to identify the specialty areas that most accomplished these studies, their foci and methodological approaches. Based on inclusion criteria, 144 studies were selected. The results indicate that most EBP studies addressed childhood and adolescence, infectious diseases, psychiatry/mental health and surgery. The predominant foci were prevention, treatment/rehabilitation, diagnosis and assessment. The most used methods were systematic review with or without meta-analysis, protocol review or synthesis of available evidence studies, and integrative review. A strong multiprofessional expansion of EBP is found in Brazil, contributing to the search for more selective practices by collecting, recognizing and critically analyzing the produced knowledge. The study also contributes to the analysis of ways of doing research and to new research possibilities. PMID:21710089

Lacerda, Rúbia Aparecida; Nunes, Bruna Kosar; Batista, Arlete de Oliveira; Egry, Emiko Yoshikawa; Graziano, Kazuko Uchikawa; Angelo, Margareth; Merighi, Miriam Aparecida Barbosa; Lopes, Nadir Aparecida; Fonseca, Rosa Maria Godoy Serpa da; Castilho, Valéria



Dynamic mechanical analysis: A practical introduction to techniques and applications  

SciTech Connect

This book introduces DMA, its history, and its current position as part of thermal analysis on polymers. It discusses major types of instrumentation, including oscillatory rotational, oscillatory axial, and torsional pendulum. It also describes analytical techniques in terms of utility, quality of data, methods of calibration, and suitability for different types of materials and assesses applications for thermoplastics, thermosetting systems, and thermosets.

Menard, K.



Professional Suitability for Social Work Practice: A Factor Analysis  

ERIC Educational Resources Information Center

Objective: The purpose of this study was to identify the underlying dimensions of professional suitability. Method: Data were collected from a province-wide mail-out questionnaire surveying 341 participants from a random sample of registered social workers. Results: The use of an exploratory factor analysis identified a 5-factor solution on…

Tam, Dora M. Y.; Coleman, Heather; Boey, Kam-Wing



Error analysis in some recent versions of the Fry Method  

NASA Astrophysics Data System (ADS)

The Fry method is a graphical technique that directly displays the strain ellipse in the form of a central vacancy on a point distribution, the Fry plot. For accurate strain estimation from the Fry plot, the central vacancy must appear as a sharply focused perfect ellipse. The diffused nature of the central vacancy, common in practice, induces subjectivity in direct strain estimation from the Fry plot. Several alternative methods, based on the point density contrast, the image analysis, the Delaunay triangulation, or the point distribution analysis, exist for objective strain estimation from Fry plots. The relative merits and limitations of these methods are, however, not yet well explored and understood. This study compares the accuracy and efficacy of the six methods proposed for objective determination of strain from Fry plots. Our approach consists of: (i) graphical simulation of variously sorted object sets, (ii) distortion of different object sets by known strain in pure shear, simple shear and simultaneous pure-and-simple shear deformations and, (iii) error analysis and comparison of the six methods. Results from more than 1000 tests reveal that the Delaunay triangulation method, the point density contrast methods and the image analysis method are relatively more accurate and versatile. The amount and nature of distortion, or the degree of sorting, have little effect on the accuracy of results in these methods. The point distribution analysis methods are successful provided the pre-deformed objects were well-sorted and defined by the specific types of point distribution. Both the Delaunay triangulation method and the image analysis method are more time-efficient than the point distribution analysis methods. The time-efficiency of the density contrast methods is in between these two extremes.

Srivastava, D. C.; Kumar, R.



Contemporary nutritional attitudes and practices: a factor analysis approach.  


The results reported here are based upon a survey of the nutritional attitudes and practices of a sample of adults aged between 18 and 74 years. The scaled responses to two inventories of statements were subjected to a factor analysis in order to assess the extent to which it is possible to identify a set of coherent dimensions which underlie the range of "surface" issues which figure consistently in the sociological literature. The results broadly confirm the utility of the inventories, and do suggest the presence of a series of underlying themes, some of which are very much along anticipated lines. However, one theme, that of deference to what might be thought of as "authoritative agencies" within the food system, was less expected, and deserves further attention. Additionally, selected factors were aggregated by summing the scores of their component variables, and correlated with the key independent variables of age, sex and social class, with a view to identifying the social profiles of their adherents. The results obtained were by no means clear cut, with a number of the anticipated features of such profiles being absent. Moreover, where the profiles were as anticipated, the correlations, although statistically significant, were relatively weak. This raised the issue of whether such an outcome was a methodological artefact, or a reflection of the possibility that differences in nutritional attitudes and practices are shaped by a range of lifestyle variables which do not coincide with conventional indicators of social differentiation. PMID:9989923

Beardsworth, A; Haslam, C; Keil, T; Goode, J; Sherratt, E



Numerical Analysis Methods William J. Stewart  

E-print Network

Numerical Analysis Methods. William J. Stewart, Department of Computer Science, North Carolina State University. In Performance Evaluation (PE), numerical analysis methods refer to those methods which work with a Markov chain. The first application of the numerical analysis approach to PE problems was in 1966 by Wallace and Rosenberg.

Stewart, William J.


Moodtrack : practical methods for assembling emotion-driven music  

E-print Network

This thesis presents new methods designed for the deconstruction and reassembly of musical works based on a target emotional contour. Film soundtracks provide an ideal testing ground for organizing music around strict ...

Vercoe, G. Scott



Theory, Method and Practice of Neuroscientific Findings in Science Education  

ERIC Educational Resources Information Center

This report provides an overview of neuroscience research that is applicable for science educators. It first offers a brief analysis of empirical studies in educational neuroscience literature, followed by six science concept learning constructs based on the whole brain theory: gaining an understanding of brain function; pattern recognition and…

Liu, Chia-Ju; Chiang, Wen-Wei



Grounded action research: a method for understanding IT in practice  

Microsoft Academic Search

This paper shows how the theory development portion of action research can be made more rigorous. The process of theory formulation is an essential part of action research, yet this process is not well understood. A case study demonstrates how units of analysis and techniques from grounded theory can be integrated into the action research cycle in order to add

Richard Baskerville; Jan Pries-Heje



Practical method of diffusion-welding steel plate in air  

NASA Technical Reports Server (NTRS)

Method is ideal for critical service requirements where parent metal properties are equaled in notch toughness, stress rupture and other characteristics. Welding technique variations may be used on a variety of materials, such as carbon steels, alloy steels, stainless steels, ceramics, and reactive and refractory materials.

Holko, K. H.; Moore, T. J.



Methods in Educational Research: From Theory to Practice  

ERIC Educational Resources Information Center

Written for students, educators, and researchers, "Methods in Educational Research" offers a refreshing introduction to the principles of educational research. Designed for the real world of educational research, the book's approach focuses on the types of problems likely to be encountered in professional experiences. Reflecting the importance of…

Lodico, Marguerite G.; Spaulding, Dean T.; Voegtle, Katherine H.



MAKING FORMAL METHODS PRACTICAL Marc Zimmerman, Mario Rodriguez, Benjamin Ingram,  

E-print Network

Many consider formal methods to be an impractical approach to system specification. To show how to design formal specification languages that can be used for complex systems while requiring minimal training, we developed a formal specification from an English-language specification of the vertical flight control system.

Leveson, Nancy


Practical applications of activation analysis and other nuclear techniques  

SciTech Connect

Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor, removed, and, for the nondestructive technique, the induced radioactivity measured. This measurement of gamma rays emitted from specific radionuclides makes possible the quantitative determination of the elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle-induced x-ray emission and synchrotron-produced x-ray fluorescence, are also briefly discussed.
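The quantitative determination rests on the standard neutron-activation relation, A = Nσφ(1 − e^(−λt)). The sketch below is generic textbook nuclear chemistry rather than anything specific to this report, and all parameter values are illustrative placeholders.

```python
import math

def induced_activity(n_atoms, sigma_cm2, flux_n_cm2_s, half_life_s, t_irr_s):
    """Induced activity (decays/s) at the end of irradiation, from the
    standard activation equation A = N * sigma * phi * (1 - exp(-lambda * t)).
    All parameter values used below are illustrative, not measured data."""
    decay_const = math.log(2) / half_life_s
    saturation = n_atoms * sigma_cm2 * flux_n_cm2_s  # activity as t -> infinity
    return saturation * (1.0 - math.exp(-decay_const * t_irr_s))

# Irradiating for one half-life yields half the saturation activity;
# irradiating for many half-lives approaches saturation (here N*sigma*phi = 1e9).
a_one = induced_activity(1e20, 1e-24, 1e13, half_life_s=3600, t_irr_s=3600)
a_sat = induced_activity(1e20, 1e-24, 1e13, half_life_s=3600, t_irr_s=3600 * 20)
```

The gamma-ray count measured after irradiation is proportional to this activity, which is what makes the elemental determination quantitative.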

Lyon, W.S.



The Hitchhiker's guide to Hi-C analysis: Practical guidelines.  


Over the last decade, development and application of a set of molecular genomic approaches based on the chromosome conformation capture method (3C), combined with increasingly powerful imaging approaches, have enabled high resolution and genome-wide analysis of the spatial organization of chromosomes. The aim of this paper is to provide guidelines for analyzing and interpreting data obtained with genome-wide 3C methods such as Hi-C and 3C-seq that rely on deep sequencing to detect and quantify pairwise chromatin interactions. PMID:25448293

Lajoie, Bryan R; Dekker, Job; Kaplan, Noam



Maximizing Return From Sound Analysis and Design Practices  

SciTech Connect

With today's tightening budgets, computer applications must provide "true" long-term benefit to the company. Businesses are spending large portions of their budgets "re-engineering" old systems to take advantage of "new" technology. But what they are really getting is simply a new interface implementing the same incomplete or poorly defined requirements as before. "True" benefit can only be gained if sound analysis and design practices are used. WHAT data and processes are required of a system is not the same as HOW the system will be implemented within a company. It is the System Analyst's responsibility to understand the difference between these two concepts. The paper discusses some simple techniques to be used during the Analysis and Design phases of projects, as well as the information gathered and recorded in each phase and how it is transformed between these phases. The paper also covers production applications generated using Oracle Designer. Applying these techniques to "real world" problems, the applications will meet the needs of today's business and adapt easily to ever-changing business environments.

Bramlette, Judith Lynn




Instrumental Methods of Chemical Analysis  

NSDL National Science Digital Library

This site includes resources for the instrumental analysis class at St Olaf's College. The syllabus, a sample exam, problem sets, a class calendar, and an introduction to the use of role playing in the class are provided.

Walters, John P.



A mixed methods approach to understand variation in lung cancer practice and the role of guidelines  

PubMed Central

Introduction: Practice pattern data demonstrate regional variation and lower than expected rates of adherence to practice guideline (PG) recommendations for the treatment of stage II/IIIA resected and stage IIIA/IIIB unresected non-small cell lung cancer (NSCLC) patients in Ontario, Canada. This study sought to understand how clinical decisions are made for the treatment of these patients and the role of PGs. Methods: Surveys and key informant interviews were undertaken with clinicians and administrators. Results: Participants reported favorable ratings for PGs and the evidentiary bases underpinning them. The majority of participants agreed more patients should have received treatment and that regional variation is problematic. Participants estimated that up to 30% of patients are not good candidates for treatment and up to 20% of patients refuse treatment. The most common barrier to implementing PGs was the lack of organizational support by clinical administrative leadership. There was concern that the trial results underpinning the PG recommendations were not generalizable to the typical patients seen in clinic. The qualitative analysis yielded five themes related to physicians' decision making: the unique patient, the unique physician, the family, the clinical team, and the clinical evidence. A dynamic interplay between these factors exists. Conclusion: Our study demonstrates the challenges inherent in (i) the complexity of clinical decision making; (ii) how quality of care problems are perceived and operationalized; and (iii) the clinical appropriateness and utility of PG recommendations. We argue that systematic and rigorous methodologies to help decision makers mitigate or negotiate these challenges are warranted. PMID:24655753



This paper describes a graphical method of nonlinear circuit analysis.

E-print Network

This paper describes a graphical method of nonlinear circuit analysis. The method combines circuit analysis using driving-point impedances and signal flow graphs with distortion analysis using the Volterra series. The result is a method of distortion analysis which is more intuitive.

Phang, Khoman


Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture  

SciTech Connect

Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

West, Phillip B. (Idaho Falls, ID); Novascone, Stephen R. (Idaho Falls, ID); Wright, Jerry P. (Idaho Falls, ID)




A Practical Blended Analysis for Dynamic Features in JavaScript  

E-print Network

Shiyi Wei and Barbara G. Ryder, Department of Computer Science, Virginia Tech. The JavaScript Blended Analysis Framework is designed to perform a general-purpose, practical combined static/dynamic analysis of JavaScript programs.

Ryder, Barbara G.


Current status of methods for shielding analysis  

SciTech Connect

Current methods used in shielding analysis and recent improvements in those methods are discussed. The status of methods development is discussed based on needs cited at the 1977 International Conference on Reactor Shielding. Additional areas where methods development is needed are discussed.

Engle, W.W.



Methods in quantitative image analysis.  


The main steps of image analysis are image capturing, image storage (compression), correcting imaging defects (e.g. non-uniform illumination, electronic noise, glare effect), image enhancement, segmentation of objects in the image and image measurements. Digitisation is performed by a camera. The most modern types include a frame-grabber, converting the analogue signal into digital (numerical) information. The numerical information consists of the grey values describing the brightness of every point within the image, named a pixel. The information is stored in bits. Eight bits are summarised in one byte. Therefore, grey values can range from 0 to 255 (2^8 = 256 levels). The human eye seems to be quite content with a display of 6-bit images (corresponding to 64 different grey values). In a digitised image, the pixel grey values can vary within regions that are uniform in the original scene: the image is noisy. The noise is mainly manifested in the background of the image. For an optimal discrimination between different objects or features in an image, uniformity of illumination in the whole image is required. These defects can be minimised by shading correction [subtraction of a background (white) image from the original image, pixel per pixel, or division of the original image by the background image]. The brightness of an image represented by its grey values can be analysed for every single pixel or for a group of pixels. The most frequently used pixel-based image descriptors are optical density, integrated optical density, the histogram of the grey values, mean grey value and entropy. The distribution of the grey values existing within an image is one of the most important characteristics of the image. However, the histogram gives no information about the texture of the image. The simplest way to improve the contrast of an image is to expand the brightness scale by spreading the histogram out to the full available range.
Rules for transforming the grey value histogram of an existing image (input image) into a new grey value histogram (output image) are most quickly handled by a look-up table (LUT). The histogram of an image can be influenced by gain, offset and gamma of the camera. Gain defines the voltage range, offset defines the reference voltage and gamma the slope of the regression line between the light intensity and the voltage of the camera. A very important descriptor of neighbourhood relations in an image is the co-occurrence matrix. The distance between the pixels (original pixel and its neighbouring pixel) can influence the various parameters calculated from the co-occurrence matrix. The main goals of image enhancement are elimination of surface roughness in an image (smoothing), correction of defects (e.g. noise), extraction of edges, identification of points, strengthening texture elements and improving contrast. In enhancement, two types of operations can be distinguished: pixel-based (point operations) and neighbourhood-based (matrix operations). The most important pixel-based operations are linear stretching of grey values, application of pre-stored LUTs and histogram equalisation. The neighbourhood-based operations work with so-called filters. These are organising elements with an original or initial point in their centre. Filters can be used to accentuate or to suppress specific structures within the image. Filters can work either in the spatial or in the frequency domain. The method used for analysing alterations of grey value intensities in the frequency domain is the Hartley transform. Filter operations in the spatial domain can be based on averaging or ranking the grey values occurring in the organising element. The most important filters, which are usually applied, are the Gaussian filter and the Laplace filter (both averaging filters), and the median filter, the top hat filter and the range operator (all ranking filters). 
Segmentation of objects is traditionally based on threshold grey values. PMID:8781988
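The LUT-based linear contrast stretching described above can be sketched in a few lines. This is an illustrative reconstruction with a synthetic image, not code from the paper; the function name is invented.

```python
import numpy as np

def stretch_lut(image, out_min=0, out_max=255):
    """Build a look-up table (LUT) that linearly spreads the image's
    grey-value histogram over the full available range, then apply it
    pixel by pixel. Grey values below/above the input range clip to the
    output limits."""
    lo, hi = int(image.min()), int(image.max())
    lut = np.arange(256, dtype=np.float64)
    if hi > lo:
        lut = (lut - lo) / (hi - lo) * (out_max - out_min) + out_min
    lut = np.clip(lut, out_min, out_max).astype(np.uint8)
    return lut[image]  # one LUT lookup per pixel

# A synthetic low-contrast 8-bit image occupying only grey values 100..150
rng = np.random.default_rng(0)
img = rng.integers(100, 151, size=(64, 64), dtype=np.uint8)
stretched = stretch_lut(img)
```

Because the transform is a 256-entry table rather than per-pixel arithmetic, the same mechanism also handles the pre-stored LUTs and histogram equalisation mentioned above.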

Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M



A mixed methods study of food safety knowledge, practices and beliefs in Hispanic families with young children.  


Children are at a higher risk for foodborne illness. The objective of this study was to explore food safety knowledge, beliefs and practices among Hispanic families with young children (≤10 years of age) living within a Midwestern state. A convergent mixed methods design collected qualitative and quantitative data in parallel. Food safety knowledge surveys were administered (n = 90) prior to exploration of beliefs and practices among six focus groups (n = 52) conducted by bilingual interpreters in community sites in five cities/towns. Descriptive statistics determined knowledge scores and thematic coding unveiled beliefs and practices. Data sets were merged to assess concordance. Participants were female (96%), 35.7 (±7.6) years of age, from Mexico (69%), with the majority having a low education level. Food safety knowledge was low (56% ± 11). Focus group themes were: Ethnic dishes popular, Relating food to illness, Fresh food in home country, Food safety practices, and Face to face learning. Mixed method analysis revealed high self-confidence in preparing food safely with low safe food handling knowledge and the presence of some cultural beliefs. On-site Spanish classes and materials were preferred venues for food safety education. Bilingual food safety messaging targeting common ethnic foods and cultural beliefs and practices is indicated to lower the risk of foodborne illness in Hispanic families with young children. PMID:25178898

Stenger, Kristen M; Ritter-Gooder, Paula K; Perry, Christina; Albrecht, Julie A



Practical Methods for Locating Abandoned Wells in Populated Areas  

SciTech Connect

An estimated 12 million wells have been drilled during the 150 years of oil and gas production in the United States. Many old oil and gas fields are now populated areas where the presence of improperly plugged wells may constitute a hazard to residents. Natural gas emissions from wells have forced people from their houses and businesses and have caused explosions that injured or killed people and destroyed property. To mitigate this hazard, wells must be located and properly plugged, a task made more difficult by the presence of houses, businesses, and associated utilities. This paper describes well finding methods conducted by the National Energy Technology Laboratory (NETL) that were effective at two small towns in Wyoming and in a suburb of Pittsburgh, Pennsylvania.

Veloski, G.A.; Hammack, R.W.; Lynn, R.J.



Methods for diagnosis of bile acid malabsorption in clinical practice.  


Altered concentrations of bile acid (BA) in the colon can cause diarrhea or constipation. More than 25% of patients with irritable bowel syndrome with diarrhea or chronic diarrhea in Western countries have BA malabsorption (BAM). As BAM is increasingly recognized, proper diagnostic methods are needed to help direct the most effective course of treatment for the chronic bowel dysfunction. We review the methodologies, advantages, and disadvantages of tools that directly measure BAM: the (14)C-glycocholate breath and stool test, the (75)selenium homotaurocholic acid test (SeHCAT), and measurements of 7α-hydroxy-4-cholesten-3-one (C4) and fecal BAs. The (14)C-glycocholate test is laborious and no longer widely used. The (75)SeHCAT has been validated but is not available in the United States. Measurement of serum C4 is a simple and accurate method that can be used for most patients but requires further clinical validation. Assays to quantify fecal BA (total and individual levels) are technically cumbersome and not widely available. Regrettably, none of these tests are routinely available in the United States; assessment of the therapeutic effects of a BA binder is used as a surrogate for diagnosis of BAM. Recent data indicate the advantages to studying fecal excretion of individual BAs and their role in BAM; these could support the use of the fecal BA assay, compared with other tests. Measurement of fecal BA levels could become a routine addition to the measurement of fecal fat in patients with unexplained diarrhea. Availability ultimately determines whether the C4, SeHCAT, or fecal BA test is used; more widespread availability of such tests would enhance clinical management of these patients. PMID:23644387

Vijayvargiya, Priya; Camilleri, Michael; Shin, Andrea; Saenger, Amy



Methods of DNA methylation analysis.  

Technology Transfer Automated Retrieval System (TEKTRAN)

The purpose of this review was to provide guidance for investigators who are new to the field of DNA methylation analysis. Epigenetics is the study of mitotically heritable alterations in gene expression potential that are not mediated by changes in DNA sequence. Recently, it has become clear that n...


A signature analysis method for IC failure analysis  

SciTech Connect

A new method of signature analysis is presented and explained. This method of signature analysis can be based on either experiential knowledge of failure analysis, observed data, or a combination of both. The method can also be used on low numbers of failures or even single failures. It uses the Dempster-Shafer theory to calculate failure mechanism confidence. The model is developed in the paper and an example is given for its use. 9 refs., 5 figs., 9 tabs.
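The paper's model details are not reproduced in this abstract, but the belief-combination step such a Dempster-Shafer confidence calculation relies on is Dempster's rule. A minimal sketch follows; the failure mechanisms and mass values are hypothetical, not taken from the paper.

```python
def dempster_combine(m1, m2):
    """Combine two basic-probability-mass assignments (dicts mapping
    frozensets of hypotheses to mass) with Dempster's rule, normalising
    out the conflicting mass."""
    combined, conflict = {}, 0.0
    for a, mass_a in m1.items():
        for b, mass_b in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mass_a * mass_b
            else:
                conflict += mass_a * mass_b  # contradictory evidence
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two pieces of evidence about hypothetical failure mechanisms ESD vs. EOS
ESD, EOS = frozenset({"ESD"}), frozenset({"EOS"})
EITHER = ESD | EOS  # ignorance: mass assigned to the whole frame
m_experience = {ESD: 0.6, EITHER: 0.4}     # e.g. analyst's prior experience
m_observation = {ESD: 0.5, EOS: 0.2, EITHER: 0.3}  # e.g. observed data
belief = dempster_combine(m_experience, m_observation)
```

Note how the rule lets experiential knowledge and observed data each contribute, and remains usable even with a single piece of evidence, consistent with the abstract's claim about low failure counts.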

Henderson, C.L.; Soden, J.M.



Text analysis methods, text analysis apparatuses, and articles of manufacture  


Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M



Method of analysis and quality-assurance practices by the U.S. Geological Survey Organic Geochemistry Research Group; determination of geosmin and methylisoborneol in water using solid-phase microextraction and gas chromatography/mass spectrometry  

USGS Publications Warehouse

A method for the determination of two common odor-causing compounds in water, geosmin and 2-methylisoborneol, was modified and verified by the U.S. Geological Survey's Organic Geochemistry Research Group in Lawrence, Kansas. The optimized method involves the extraction of odor-causing compounds from filtered water samples using a divinylbenzene-carboxen-polydimethylsiloxane cross-link coated solid-phase microextraction (SPME) fiber. Detection of the compounds is accomplished using capillary-column gas chromatography/mass spectrometry (GC/MS). Precision and accuracy were demonstrated using reagent-water, surface-water, and ground-water samples. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 35 nanograms per liter ranged from 60 to 123 percent for geosmin and from 90 to 96 percent for 2-methylisoborneol. Method detection limits were 1.9 nanograms per liter for geosmin and 2.0 nanograms per liter for 2-methylisoborneol in 45-milliliter samples. Typically, concentrations of 30 and 10 nanograms per liter of geosmin and 2-methylisoborneol, respectively, can be detected by the general public. The calibration range for the method is equivalent to concentrations from 5 to 100 nanograms per liter without dilution. The method is valuable for acquiring information about the production and fate of these odor-causing compounds in water.

Zimmerman, L.R.; Ziegler, A.C.; Thurman, E.M.



Probabilistic structural analysis by extremum methods  

NASA Technical Reports Server (NTRS)

The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.

Nafday, Avinash M.



Measuring solar reflectance - Part II: Review of practical methods  

SciTech Connect

A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R{sub g,0} can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R{sub g,0} to within 0.01 for surface slopes up to 5:12 [23°], and to within 0.02 for surface slopes up to 12:12 [45°]. An alternative pyranometer method minimizes shadow errors and can be used to measure R{sub g,0} of a surface as small as 1 m in diameter. The accuracy with which it can measure R{sub g,0} is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R{sub g,0}{sup *}, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R{sub g,0}{sup *} matches R{sub g,0} to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R{sub g,0}{sup *} by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R{sub g,0}{sup *} to within about 0.01.
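The spectrophotometer route described above amounts to a weighted average of spectral reflectance, with the solar spectral irradiance as the weight. The sketch below uses entirely made-up spectra (not the real AM1GH irradiance or a measured surface) purely to show the computation.

```python
import numpy as np

# Placeholder wavelength grid (nm) and a bell-shaped stand-in for the
# solar spectral irradiance; both are illustrative, not the AM1GH spectrum.
wavelength_nm = np.linspace(300, 2500, 221)
irradiance = np.exp(-((wavelength_nm - 800.0) / 500.0) ** 2)

# Hypothetical surface: highly reflective below 700 nm, absorbing above.
reflectance = np.where(wavelength_nm < 700.0, 0.9, 0.3)

# Solar-weighted reflectance: integral of R*I divided by integral of I,
# here as a discrete weighted mean on the wavelength grid.
r_star = np.average(reflectance, weights=irradiance)
```

The result necessarily lies between the surface's extreme spectral reflectances, pulled toward the band where the irradiance weight is largest.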

Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul [Heat Island Group, Environmental Energy Technologies Division, Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States)




Qualitative Analysis of Common Definitions for Core Advanced Pharmacy Practice Experiences  

PubMed Central

Objective. To determine how colleges and schools of pharmacy interpreted the Accreditation Council for Pharmacy Education’s (ACPE’s) Standards 2007 definitions for core advanced pharmacy practice experiences (APPEs), and how they differentiated community and institutional practice activities for introductory pharmacy practice experiences (IPPEs) and APPEs. Methods. A cross-sectional, qualitative, thematic analysis was done of survey data obtained from experiential education directors in US colleges and schools of pharmacy. Open-ended responses to invited descriptions of the 4 core APPEs were analyzed using grounded theory to determine common themes. Type of college or school of pharmacy (private vs public) and size of program were compared. Results. Seventy-one schools (72%) with active APPE programs at the time of the survey responded. Lack of strong frequent themes describing specific activities for the acute care/general medicine core APPE indicated that most respondents agreed on the setting (hospital or inpatient) but the student experience remained highly variable. Themes were relatively consistent between public and private institutions, but there were differences across programs of varying size. Conclusion. Inconsistencies existed in how colleges and schools of pharmacy defined the core APPEs as required by ACPE. More specific descriptions of core APPEs would help to standardize the core practice experiences across institutions and provide an opportunity for quality benchmarking. PMID:24954931

Danielson, Jennifer; Weber, Stanley S.



The moderating effect of supply chain role on the relationship between supply chain practices and performance : An empirical analysis  

Microsoft Academic Search

Purpose – The purpose of this paper is to examine the relationships between specific supply chain practices and organizational performance and whether this relationship is moderated by the role that a company assumes in its respective supply chain. Design/methodology/approach – This paper uses regression analysis and the relative weights method to analyze a set of survey data from respondents within

Lori S. Cook; Daniel R. Heiser; Kaushik Sengupta



Methods to enhance compost practices as an alternative to waste disposal  

SciTech Connect

Creating practices that are ecologically friendly, economically profitable, and ethically sound is a concept that is slowly beginning to unfold in modern society. In developing such practices, the authors challenge long-lived human behavior patterns and environmental management practices. In this paper, they trace the history of human waste production, describe problems associated with such waste, and explore regional coping mechanisms. Composting projects in north central Texas demonstrate new methods for waste disposal. The authors studied projects conducted by municipalities, schools, agricultural organizations, and individual households. These efforts were examined within the context of regional and statewide solid waste plans. They conclude that: (1) regional composting in north central Texas will substantially reduce the waste stream entering landfills; (2) public education is paramount to establishing alternative waste disposal practices; and (3) new practices for compost will catalyze widespread and efficient production.

Stuckey, H.T.; Hudak, P.F.



Microparticle analysis system and method  

NASA Technical Reports Server (NTRS)

A device for analyzing microparticles is provided which includes a chamber with an inlet and an outlet for respectively introducing and dispensing a flowing fluid comprising microparticles, a light source for providing light through the chamber and a photometer for measuring the intensity of light transmitted through individual microparticles. The device further includes an imaging system for acquiring images of the fluid. In some cases, the device may be configured to identify and determine a quantity of the microparticles within the fluid. Consequently, a method for identifying and tracking microparticles in motion is contemplated herein. The method involves flowing a fluid comprising microparticles in laminar motion through a chamber, transmitting light through the fluid, measuring the intensities of the light transmitted through the microparticles, imaging the fluid a plurality of times and comparing at least some of the intensities of light between different images of the fluid.

Morrison, Dennis R. (Inventor)



Aircraft accidents : method of analysis  

NASA Technical Reports Server (NTRS)

The revised report includes the chart for the analysis of aircraft accidents, combining consideration of the immediate causes, underlying causes, and results of accidents, as prepared by the special committee, with a number of the definitions clarified. Also included are a brief statement of the organization and work of the special committee and of the Committee on Aircraft Accidents, and statistical tables giving a comparison of the types of accidents and causes of accidents in the military services on the one hand and in civil aviation on the other, together with explanations of some of the important differences noted in these tables.



Portraits of Practice: A Cross-Case Analysis of Two First-Grade Teachers and Their Grouping Practices  

ERIC Educational Resources Information Center

This interpretive study provides a cross-case analysis of the literacy instruction of two first-grade teachers, with a particular focus on their grouping practices. One key finding was the way in which these teachers drew upon a district-advocated approach for instruction--an approach to guided reading articulated by Fountas and Pinnell (1996) in…

Maloch, Beth; Worthy, Jo; Hampton, Angela; Jordan, Michelle; Hungerford-Kresser, Holly; Semingson, Peggy



Methods for Analysis of Outdoor Performance Data (Presentation)  

SciTech Connect

The ability to accurately predict power delivery over time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted to power and how this conversion efficiency evolves over time. Accurate knowledge of power decline over time, also known as the degradation rate, is essential to all stakeholders--utility companies, integrators, investors, and scientists alike. Different methods for determining degradation rates, from both discrete and continuous data, are presented, and some general best practice methods are outlined. In addition, historical degradation rates and some preliminary analysis with respect to climate are given.
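As a toy illustration of one common approach to the problem described above, a degradation rate can be estimated by fitting a straight line to a normalized performance time series. The monthly data, true rate, and noise level below are invented for the sketch:

```python
import numpy as np

# Hypothetical example: estimate a PV degradation rate (%/year) by fitting
# a straight line to 10 years of monthly performance data.
rng = np.random.default_rng(0)
years = np.arange(0, 10, 1 / 12)           # monthly observations over 10 years
true_rate = -0.5                           # assumed true degradation, %/year
perf = 100 + true_rate * years + rng.normal(0, 0.3, years.size)

slope, intercept = np.polyfit(years, perf, 1)
print(f"estimated degradation rate: {slope:.2f} %/year")
```

With 120 points the fitted slope recovers the assumed rate closely; real analyses must additionally handle seasonality and data gaps, which this sketch ignores.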

Jordan, D.



A practical method of estimating standard error of age in the fission track dating method  

USGS Publications Warehouse

A first-order approximation formula for the propagation of error in the fission track age equation is given by P_A = C[P_s^2 + P_i^2 + P_Φ^2 - 2rP_sP_i]^(1/2), where P_A, P_s, P_i, and P_Φ are the percentage errors of age, of spontaneous track density, of induced track density, and of neutron dose, respectively, and C is a constant. The correlation, r, between spontaneous and induced track densities is a crucial element in the error analysis, acting generally to improve the standard error of age. In addition, the correlation parameter r is instrumental in specifying the level of neutron dose, a controlled variable, which will minimize the standard error of age. The results from the approximation equation agree closely with the results from an independent statistical model for the propagation of errors in the fission-track dating method. © 1979.
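The propagated error is straightforward to evaluate once the percentage errors and the correlation are known. The sketch below (with C set to 1 and invented percentage errors) shows how a positive correlation r reduces the standard error of the age:

```python
import math

# First-order error propagation in the fission-track age equation:
# P_A = C * sqrt(P_s**2 + P_i**2 + P_phi**2 - 2*r*P_s*P_i),
# where the P's are percentage errors of age, spontaneous track density,
# induced track density, and neutron dose. C is taken as 1 for illustration.
def age_percent_error(P_s, P_i, P_phi, r, C=1.0):
    return C * math.sqrt(P_s**2 + P_i**2 + P_phi**2 - 2 * r * P_s * P_i)

# A positive correlation between the track densities improves the error:
print(age_percent_error(5.0, 4.0, 2.0, r=0.0))   # uncorrelated densities
print(age_percent_error(5.0, 4.0, 2.0, r=0.8))   # correlated densities
```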

Johnson, N.M.; McGee, V.E.; Naeser, C.W.



Scholarship and practice: the contribution of ethnographic research methods to bridging the gap  

Microsoft Academic Search

Information systems research methods need to contribute to the scholarly requirements of the field of knowledge but also need to develop the potential to contribute to the practical requirements of practitioners' knowledge. This leads to possible conflicts in choosing research methods. Argues that the changing world of the IS practitioner is reflected in the changing world of the IS researcher

Lynda J. Harvey; Michael D. Myers



Best Practices in Teaching Statistics and Research Methods in the Behavioral Sciences [with CD-ROM]  

ERIC Educational Resources Information Center

This book provides a showcase for "best practices" in teaching statistics and research methods in two- and four-year colleges and universities. A helpful resource for teaching introductory, intermediate, and advanced statistics and/or methods, the book features coverage of: (1) ways to integrate these courses; (2) how to promote ethical conduct;…

Dunn, Dana S., Ed.; Smith, Randolph A., Ed.; Beins, Barney, Ed.



A practical method to achieve prostate gland immobilization and target verification for daily treatment  

Microsoft Academic Search

Purpose: A practical method to achieve prostate immobilization and daily target localization for external beam radiation treatment is described. Methods and Materials: Ten patients who underwent prostate brachytherapy using permanent radioactive source placement were selected for study. To quantify prostate motion both with and without the presence of a specially designed inflatable intrarectal balloon, the computerized tomography–based coordinates of all intraprostatic

Anthony V D’Amico; Judi Manola; Marian Loffredo; Lynn Lopes; Kristopher Nissen; Desmond A O’Farrell; Leah Gordon; Clare M Tempany; Robert A Cormack



Situational Analysis: Centerless Systems and Human Service Practices  

ERIC Educational Resources Information Center

Bronfenbrenner's ecological model is a conceptual framework that continues to contribute to human service practices. In the current article, the author describes the possibilities for practice made intelligible by drawing from this framework. She then explores White's "Web of Praxis" model as an important extension of this approach, and proceeds…

Newbury, Janet



Researching "Practiced Language Policies": Insights from Conversation Analysis  

ERIC Educational Resources Information Center

In language policy research, "policy" has traditionally been conceptualised as a notion separate from that of "practice". In fact, language practices were usually analysed with a view to evaluating whether a policy is being implemented or resisted. Recently, however, Spolsky in ("Language policy". Cambridge University Press, Cambridge, 2004;…

Bonacina-Pugh, Florence



On exploratory factor analysis: a review of recent evidence, an assessment of current practice, and recommendations for future use.  


Exploratory factor analysis (hereafter, factor analysis) is a complex statistical method that is integral to many fields of research. Using factor analysis requires researchers to make several decisions, each of which affects the solutions generated. In this paper, we focus on five major decisions that are made in conducting factor analysis: (i) establishing how large the sample needs to be, (ii) choosing between factor analysis and principal components analysis, (iii) determining the number of factors to retain, (iv) selecting a method of data extraction, and (v) deciding upon the methods of factor rotation. The purpose of this paper is threefold: (i) to review the literature with respect to these five decisions, (ii) to assess current practices in nursing research, and (iii) to offer recommendations for future use. The literature reviews illustrate that factor analysis remains a dynamic field of study, with recent research having practical implications for those who use this statistical method. The assessment was conducted on 54 factor analysis (and principal components analysis) solutions presented in the results sections of 28 papers published in the 2012 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. The main findings from the assessment were that researchers commonly used (a) participants-to-items ratios for determining sample sizes (used for 43% of solutions), (b) principal components analysis (61%) rather than factor analysis (39%), (c) the eigenvalues greater than one rule and scree tests to decide upon the numbers of factors/components to retain (61% and 46%, respectively), (d) principal components analysis and unweighted least squares as methods of data extraction (61% and 19%, respectively), and (e) the Varimax method of rotation (44%). In general, well-established, but outdated, heuristics and practices informed decision making with respect to the performance of factor analysis in nursing studies. 
Based on the findings from factor analysis research, it seems likely that the use of such methods may have had a material, adverse effect on the solutions generated. We offer recommendations for future practice with respect to each of the five decisions discussed in this paper. PMID:24183474
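As a minimal sketch of one of the heuristics assessed above, the "eigenvalues greater than one" (Kaiser) rule counts the eigenvalues of the item correlation matrix that exceed 1.0. The data below are simulated purely for illustration:

```python
import numpy as np

# Simulate 500 respondents answering 8 items driven by 2 latent factors,
# then apply the Kaiser rule to the item correlation matrix.
rng = np.random.default_rng(42)
latent = rng.normal(size=(500, 2))                # two underlying factors
loadings = rng.normal(size=(2, 8))                # loadings onto eight items
items = latent @ loadings + rng.normal(scale=2.0, size=(500, 8))

eigvals = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))
n_retained = int(np.sum(eigvals > 1.0))
print(f"factors retained by the Kaiser rule: {n_retained}")
```

As the paper notes, this rule is a dated heuristic; parallel analysis is the commonly recommended alternative.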

Gaskin, Cadeyrn J; Happell, Brenda



Common Goals for the Science and Practice of Behavior Analysis: A Response to Critchfield  

ERIC Educational Resources Information Center

In his scholarly and thoughtful article, "Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis," Critchfield (2011) discussed the science-practice frictions to be expected in any professional organization that attempts to combine these interests. He suggested that the Association for Behavior Analysis

Schneider, Susan M.



Applied Behavior Analysis for Criminal Justice Practice: Some Current Dimensions  

Microsoft Academic Search

In this article, the seven dimensions of applied behavior analysis are related to criminal justice practice. Each of the dimensions is described and examined, and is illustrated by published behavioral literature on crime and delinquency. In addition, three general assumptions of applied behavior analysis are presented: (1) good practice should be good research; (2) behavioral goals, procedures, and effects should

Edward K. Morris



Vibration analysis methods for piping  

NASA Astrophysics Data System (ADS)

Attention is given to flow vibrations in pipe flow induced by singularity points in the piping system. The types of pressure fluctuations induced by flow singularities are examined, including the intense wideband fluctuations immediately downstream of the singularity and the acoustic fluctuations encountered in the remainder of the circuit, and a theory of noise generation by unsteady flow in internal acoustics is developed. The response of the piping systems to the pressure fluctuations thus generated is considered, and the calculation of the modal characteristics of piping containing a dense fluid in order to obtain the system transfer function is discussed. The TEDEL program, which calculates the vibratory response of a structure composed of straight and curved pipes with variable mechanical characteristics forming a three-dimensional network by a finite element method, is then presented, and calculations of fluid-structural coupling in tubular networks are illustrated.

Gibert, R. J.



Lateralization of speech production using verbal\\/manual dual tasks: meta-analysis of sex differences and practice effects  

Microsoft Academic Search

The present paper reviews the findings of 30 years of verbal\\/manual dual task studies, the method most commonly used to assess lateralization of speech production in non-clinical samples. Meta-analysis of 64 results revealed that both the type of manual task used and the nature of practice that is given influence the size of the laterality effect. A meta-analysis of 36

S. E. Medland; G. Geffen; K. McFarland



Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture  


Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
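A minimal sketch of the weighting idea described above (this is not the patented implementation; the evidence items, signs, and weights are invented for illustration):

```python
# Each evidence item is associated with the hypothesis through an indicator
# that either supports (+1) or refutes (-1) it, and each association carries
# a weight; the weighted sum gives a rough support score for the hypothesis.
evidence = [
    ("sensor reading A", +1, 0.6),   # (item, supports/refutes, weight)
    ("witness report B", +1, 0.3),
    ("conflicting log C", -1, 0.4),
]

support = sum(sign * weight for _, sign, weight in evidence)
total = sum(weight for _, _, weight in evidence)
print(f"net support: {support:.2f} of {total:.2f} possible")
```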

Sanfilippo, Antonio P. (Richland, WA); Cowell, Andrew J. (Kennewick, WA); Gregory, Michelle L. (Richland, WA); Baddeley, Robert L. (Richland, WA); Paulson, Patrick R. (Pasco, WA); Tratz, Stephen C. (Richland, WA); Hohimer, Ryan E. (West Richland, WA)



Application of quality-improvement methods in a community practice: the Sandhills Pediatrics Asthma Initiative.  


This case study demonstrates the use of quality improvement methods to improve asthma care in a busy community practice. The practice used disease-management strategies, such as population identification, self-management education, and performance measurement and feedback. The practice then applied several practice-based quality improvement methods, such as PDSA cycles, to improve care. From 1998 to 2003, process measures, such as staging of asthmatics, use of long-term control medications, use of peak flow meters and spacers, and use of action plans, improved. There was also a substantial decrease in emergency department use and hospitalizations among patients with asthma. Although there have been several studies demonstrating the efficacy of disease management strategies, most lack generalizability to community practices. Often, interventions are so intensive and cumbersome that they are unlikely to be replicated in a primary care setting. Researchers have been unable to determine which components of the interventions are most effective and replicable. Furthermore, many studies of disease management strategies enroll participants who lack the co-morbidities seen in community practice. There are also few studies of disadvantaged populations that face other barriers to care, such as lack of transportation, poor access to specialists, and medical illiteracy. In this case study, there were several unique factors that enabled the practice to improve care for this population. The AccessCare case manager who worked with the practice not only provided data and feedback to the practice team, but also served as an improvement "coach," often pushing the team and facilitating many of the improvement efforts. AccessCare's approach is in contrast to many of the commercial disease management companies' "carve out" models that do not sufficiently involve providers or practices in their interventions. 
The other necessary ingredient for success in this project was organizational leadership and support. The leaders of the practice saw beyond the usual metrics of patient visit counts and relative value units (RVUs) to embrace the concept of population health: the notion that practices are not only responsible for providing acute, episodic care in the office, but also for improving health outcomes in the community they serve. Other important factors included ensuring a basic agreement among providers on the need for improvement and frequent communication about the goals of the project. Although the champions of the project tried to minimize formal meeting time, there was frequent informal communication between team members. In the future, there is a need to develop other approaches to stimulate these endeavors in community practices, such as "pay for performance" programs, continuing education credit, and tying maintenance of board certification to quality improvement initiatives. PMID:16130947

Wroth, Thomas H; Boals, Joseph C



A global analysis method for astrolabe observations  

NASA Astrophysics Data System (ADS)

In a previous paper (Chollet & Najid 1992), we gave the general principles of a new global method to analyze astrolabe observations. The fundamental equation was obtained from the classical one, in which the corrections to the star positions at the observational epoch are replaced by developments that contain the corrections to the star positions at the epoch of the catalogue, the proper motions, and the corrections to the precession and nutation constants. This computation gives a new equation in which the coefficients contain only two variable parameters: the azimuth and the sidereal time. The method proposed here consists in regarding the whole programme of star observations as a single group. All the possible values of the azimuth and of the sidereal time are obtained; in this case the column vectors of the coefficients are nearly orthogonal, and the matrix of the normal equations is practically diagonal. The only remaining problem is due to the variations of the apparent position of the station. These effects are removed by using the Earth rotation parameters given by the Bureau International de l'Heure (BIH) and connected to the International Earth Rotation Service (IERS) system by the Central Bureau of the IERS. The new corrected unknowns are then related to the mean position of the astrolabe in the IERS system. The method to obtain absolute declinations follows from the preceding relations. The same error, multiplied by different but known constants, affects the declination of each star, but also the latitude and zenith distance determinations. From these results, it is possible to recover the well-known result (Krejnin 1968) concerning the determination of absolute declinations. Moreover, the comparison between the direct measurement and the result obtained from stellar observations will also give the systematic error in declination and latitude. 
The last important result is that the corrections to the precession and nutation constants appear in the equations without any perturbation due to catalogue errors. This fact was noted in the past (Guinot 1970) but not used. The method given here does not rely on sophisticated techniques to analyse the observational data obtained by astrolabes; our purpose was rather to combine and to correct the data. The method was elaborated to analyze the observations of future automatic astrolabes. It has been tested on the series of observations obtained at Paris Observatory, in anticipation of analysis using the future Hipparcos catalogue.

Chollet, F.



Analysis of nonstandard and home-made explosives and post-blast residues in forensic practice  

NASA Astrophysics Data System (ADS)

Nonstandard and home-made explosives may constitute a considerable threat, as well as a potential material for terrorist activities. Mobile analytical devices, particularly Raman and FTIR spectrometers, are used for the initial detection. Various sorts of phlegmatizers (moderants) to decrease the sensitivity of explosives were tested; some kinds of low-viscosity lubricants yielded very good results. If the character of the substance allows it, phlegmatized samples are taken in an amount of approx. 0.3 g for laboratory analysis. Various methods of separating and concentrating samples from post-blast scenes were also tested. A wide range of methods is used for the laboratory analysis. XRD techniques, capable of direct phase identification of crystalline substances, namely in mixtures, have proved highly valuable in practice for inorganic and organic phases. SEM-EDS/WDS methods are employed as standard for the inorganic phase. In analysing post-blast residues, techniques allowing analysis at the level of individual particles, rather than the overall composition of a mixed sample, are very important.

Kotrlý, Marek; Turková, Ivana



How Typewriters Changed Correspondence: An Analysis of Prescription and Practice.  

ERIC Educational Resources Information Center

Notes changes in the visual organization of correspondence brought about by the typewriter. Discusses the development of these changes, drawing examples both from the prescriptions for and the practice of commercial correspondence. (FL)

Walker, Sue



Airbreathing hypersonic vehicle design and analysis methods  

NASA Technical Reports Server (NTRS)

The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.



Laboratory theory and methods for sediment analysis  

USGS Publications Warehouse

The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube, depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help ensure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.
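For illustration only (the actual laboratory procedures include corrections and steps not shown here), a concentration determination by either the evaporation or the filtration method ultimately reduces to dividing the dried sediment mass by the sample volume; the masses and volumes below are invented:

```python
# Suspended-sediment concentration, commonly reported in mg/L, from the
# dried sediment mass recovered by evaporation or filtration.
def concentration_mg_per_L(dry_sediment_mg, sample_volume_mL):
    return dry_sediment_mg / (sample_volume_mL / 1000.0)

print(concentration_mg_per_L(dry_sediment_mg=25.0, sample_volume_mL=500.0))  # 50.0 mg/L
```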

Guy, Harold P.



A mixed-methods study of research dissemination across practice-based research networks.  


Practice-based research networks may be expanding beyond research into rapid learning systems. This mixed-methods study uses Agency for Healthcare Research and Quality registry data to identify networks currently engaged in dissemination of research findings and to select a sample to participate in qualitative semistructured interviews. An adapted Diffusion of Innovations framework was used to organize concepts by characteristics of networks, dissemination activities, and mechanisms for rapid learning. Six regional networks provided detailed information about dissemination strategies, organizational context, role of practice-based research network, member involvement, and practice incentives. Strategies compatible with current practices and learning innovations that generate observable improvements may increase effectiveness of rapid learning approaches. PMID:24594566

Lipman, Paula Darby; Lange, Carol J; Cohen, Rachel A; Peterson, Kevin A



Application of Stacking Technique in ANA: Method and Practice with PKU Seismological Array  

NASA Astrophysics Data System (ADS)

Cross correlation of ambient noise records is now routinely used to obtain dispersion curves and then perform seismic tomography; however, little attention has been paid to array techniques. We present a spatial-stacking method to obtain high-resolution dispersion curves and demonstrate it with observation data from the PKU seismological array. Empirical Green's Functions are generally obtained by correlation between two stations, and the dispersion curves are then obtained from FTAN analysis. A popular way to obtain high-resolution dispersion curves is to use long time records. At the same time, to extract a usable signal, the distance between the two stations must be at least 3 times the longest wavelength, so we need both long time records and appropriately spaced stations. We use a new method, spatial-stacking, which allows shorter observation periods and utilizes observations from a group of closely distributed stations to obtain fine dispersion curves. We correlate the observations of every station in the station group with those of a far station, and then stack the results. However, we cannot simply stack them unless the stations in the group lie on a circle centered on the far station, owing to the dispersion characteristics of Rayleigh waves. Thus we apply anti-dispersion to the observation data of every station in the array and then stack. We tested the method using theoretical seismic surface wave records, both with and without noise, computed with qseis06 by Rongjiang Wang. For the case of three synthetic stations (spacing of 1 degree) with the same underground structure and no noise, the center station shows the same dispersion with and without spatial-stacking. We then added noise to the theoretical records; the center station's dispersion curves obtained by our method are much closer to the noise-free dispersion curve than the contaminated ones are. 
This shows that our method improves the resolution of the dispersion curve. We then used real data from the PKU array, whose station interval is about 10 km, and the permanent IRIS stations, which are far (more than 200 km) from the PKU array, to test the method. First, we compared the stacked correlation results of three consecutive stations with the unstacked ones, finding that the resolution of the dispersion curve of the former is better. Second, we compared the stacked results with one year of results from the traditional correlation at the center station, and found that the two fit very well.
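The benefit of stacking can be illustrated with a toy example unrelated to the PKU processing chain: averaging many noisy copies of the same coherent signal suppresses incoherent noise roughly as 1/sqrt(N). The signal and noise levels below are invented:

```python
import numpy as np

# Compare the signal-to-noise ratio of a single noisy trace with that of
# a stack (mean) of 100 independent noisy traces of the same signal.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 500)
signal = np.sin(2 * np.pi * 5 * t)

single = signal + rng.normal(0, 1.0, t.size)
stack = np.mean([signal + rng.normal(0, 1.0, t.size) for _ in range(100)], axis=0)

def snr(x):
    # Ratio of signal amplitude to residual noise amplitude.
    return np.std(signal) / np.std(x - signal)

print(f"SNR single trace: {snr(single):.2f}, SNR of 100-trace stack: {snr(stack):.2f}")
```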

Liu, J.; Tang, Y.; Ning, J.; Chen, Y. J.



A spatial analysis of the expanding roles of nurses in general practice  

PubMed Central

Background: Changes to the workforce and organisation of general practice are occurring rapidly in response to the Australian health care reform agenda, and the changing nature of the medical profession. In particular, the last five years has seen the rapid introduction and expansion of a nursing workforce in Australian general practices. This potentially creates pressures on current infrastructure in general practice. Method: This study used a mixed methods, ‘rapid appraisal’ approach involving observation, photographs, and interviews. Results: Nurses utilise space differently to GPs, and this is part of the diversity they bring to the general practice environment. At the same time their roles are partly shaped by the ways space is constructed in general practices. Conclusion: The fluidity of nursing roles in general practice suggests that nurses require a versatile space in which to maximize their role and contribution to the general practice team. PMID:22870933



Mediation Analysis: A Retrospective Snapshot of Practice and More Recent Directions  

PubMed Central

R. Baron and D. A. Kenny’s (1986) paper introducing mediation analysis has been cited over 9,000 times, but concerns have been expressed about how this method is used. The authors review past and recent methodological literature and make recommendations for how to address 3 main issues: association, temporal order, and the no omitted variables assumption. The authors briefly visit the topics of reliability and the confirmatory–exploratory distinction. In addition, to provide a sense of the extent to which the earlier literature had been absorbed into practice, the authors examined a sample of 50 articles from 2002 citing R. Baron and D. A. Kenny and containing at least 1 mediation analysis via ordinary least squares regression. A substantial proportion of these articles included problematic reporting; as of 2002, there appeared to be room for improvement in conducting such mediation analyses. Future literature reviews will demonstrate the extent to which the situation has improved. PMID:19350833
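A minimal sketch of the kind of OLS-based mediation analysis discussed above, in the Baron and Kenny spirit: regress the mediator M on X (path a), then the outcome Y on X and M (path b), and take a*b as the indirect effect. The data are simulated with known paths (a = 0.5, b = 0.4); this is illustrative, not the authors' procedure:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000
X = rng.normal(size=n)
M = 0.5 * X + rng.normal(size=n)             # true path a = 0.5
Y = 0.4 * M + 0.2 * X + rng.normal(size=n)   # true path b = 0.4

def ols(y, *cols):
    # Ordinary least squares with an intercept; returns coefficients.
    A = np.column_stack([np.ones(len(y)), *cols])
    return np.linalg.lstsq(A, y, rcond=None)[0]

a = ols(M, X)[1]        # coefficient of X in M ~ X
b = ols(Y, X, M)[2]     # coefficient of M in Y ~ X + M
print(f"a = {a:.2f}, b = {b:.2f}, indirect effect a*b = {a * b:.2f}")
```

Note that this sketch addresses only the association step; the temporal-order and no-omitted-variables assumptions discussed in the paper cannot be checked from the regressions alone.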

Gelfand, Lois A.; Mensinger, Janell L.; Tenhave, Thomas



Education Policy as a Practice of Power: Theoretical Tools, Ethnographic Methods, Democratic Options  

ERIC Educational Resources Information Center

This article outlines some theoretical and methodological parameters of a critical practice approach to policy. The article discusses the origins of this approach, how it can be uniquely adapted to educational analysis, and why it matters--not only for scholarly interpretation but also for the democratization of policy processes as well. Key to…

Levinson, Bradley A. U.; Sutton, Margaret; Winstead, Teresa



Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues  

SciTech Connect

This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

Ronald Laurids Boring



Relating Actor Analysis Methods to Policy Problems  

Microsoft Academic Search

For a policy analyst the policy problem is the starting point for the policy analysis process. During this process the policy analyst structures the policy problem and makes a choice for an appropriate set of methods or techniques to analyze the problem (Goeller 1984). The methods of the policy analyst are often referred to as the toolbox or toolkit of

T. E. Van der Lei



A Comparative Analysis of Ethnomedicinal Practices for Treating Gastrointestinal Disorders Used by Communities Living in Three National Parks (Korea)  

PubMed Central

The purpose of this study is to comparatively analyze the ethnomedicinal practices on gastrointestinal disorders within communities in Jirisan National Park, Gayasan National Park, and Hallasan National Park of Korea. Data was collected through participant observations and in-depth interviews with semistructured questionnaires. Comparative analysis was accomplished using the informant consensus factor, fidelity level, and internetwork analysis. A total of 490 ethnomedicinal practices recorded from the communities were classified into 110 families, 176 genera, and 220 species that included plants, animals, fungi, and alga. The informant consensus factor values in the disorder categories were enteritis and gastralgia (1.0), followed by indigestion (0.94), constipation (0.93), and abdominal pain and gastroenteric trouble (0.92). In terms of fidelity levels, 71 plant species showed fidelity levels of 100%. In the internetwork analysis between disorders and all medicinal species, the species are grouped in the center around the four categories of indigestion, diarrhea, abdominal pain, and gastroenteric trouble. Methodologically, these comparative analysis methods will contribute to preserving orally transmitted ethnomedicinal knowledge. In particular, the use of internetwork analysis in this study provides informative internetwork maps between gastrointestinal disorders and medicinal species. PMID:25202330
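The informant consensus factor used above is conventionally computed as ICF = (n_ur - n_t) / (n_ur - 1), where n_ur is the number of use reports in a disorder category and n_t the number of species used for it. A small sketch with invented counts (the study's own counts are not reproduced here):

```python
# Informant consensus factor: 1.0 means all use reports in a category
# name the same species; more species per report lowers consensus.
def informant_consensus_factor(use_reports, n_species):
    return (use_reports - n_species) / (use_reports - 1)

print(informant_consensus_factor(41, 1))    # full consensus -> 1.0
print(informant_consensus_factor(41, 15))   # weaker consensus -> 0.65
```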

Kim, Hyun; Song, Mi-Jang; Brian, Heldenbrand; Choi, Kyoungho



Method of analysis and quality-assurance practices for determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry at the U.S. Geological Survey California District Organic Chemistry Laboratory, 1996-99  

USGS Publications Warehouse

A method of analysis and quality-assurance practices were developed to study the fate and transport of pesticides in the San Francisco Bay-Estuary by the U.S. Geological Survey. Water samples were filtered to remove suspended-particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide and the pesticides were eluted with three cartridge volumes of hexane:diethyl ether (1:1) solution. The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for pesticides ranged from 0.002 to 0.025 microgram per liter for 1-liter samples. Recoveries ranged from 44 to 140 percent for 25 pesticides in samples of organic-free reagent water and Sacramento-San Joaquin Delta and Suisun Bay water fortified at 0.05 and 0.50 microgram per liter. The estimated holding time for pesticides after extraction on C-8 solid-phase extraction cartridges ranged from 10 to 257 days.

Crepeau, Kathryn L.; Baker, Lucian M.; Kuivila, Kathryn M.




Technology Transfer Automated Retrieval System (TEKTRAN)

We present a practical evaluation framework for analysis of two complex, process-based agricultural system models, WEPP and RZWQM. The evaluation framework combines sensitivity analysis and the uncertainty analysis techniques of first order error analysis (FOA) and Monte Carlo simulation with Latin ...


Hybrid least squares multivariate spectral analysis methods  


A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.

Haaland, David M. (Albuquerque, NM)
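The hybrid step described here, a classical least squares fit augmented with a spectral shape absent from the original calibration, can be sketched with synthetic spectra; the Gaussian component shapes and linear drift below are assumptions for illustration only:

```python
import numpy as np

wavelengths = 50
channels = np.arange(wavelengths)

# Pure-component spectra used in the original CLS calibration (hypothetical)
k1 = np.exp(-((channels - 15) / 4.0) ** 2)
k2 = np.exp(-((channels - 32) / 5.0) ** 2)
K = np.vstack([k1, k2])                     # 2 components x 50 channels

# The measured mixture also contains an uncalibrated shape (e.g. drift)
drift = np.linspace(0.0, 0.3, wavelengths)
true_c = np.array([0.7, 1.2])
spectrum = true_c @ K + drift

# Plain CLS estimate is biased by the unmodeled shape...
c_plain, *_ = np.linalg.lstsq(K.T, spectrum, rcond=None)

# ...hybrid step: append the known extra shape and re-fit
K_aug = np.vstack([K, drift])
c_hybrid, *_ = np.linalg.lstsq(K_aug.T, spectrum, rcond=None)

print(np.round(c_plain, 3), np.round(c_hybrid[:2], 3))
```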



Signal analysis using the maximum entropy method  

NASA Astrophysics Data System (ADS)

The analysis of discrete series is explained using the maximum entropy method in order to solve data processing problems in the aeroelastic study of wind turbines. Discrete Fourier spectrum and maximum entropy spectrum are compared. This method searches for an autoregressive model for a series containing a maximum of information about entropy. The autoregressive model is outlined and the notions 'information' and 'entropy' in signal analysis are defined. Energy and phase spectrum construction, starting from an autoregressive model, is described, with examples. The maximum entropy method is valuable where Fourier transformation techniques hardly work or fail.

Kuik, W.
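The maximum entropy spectrum is conventionally obtained from an autoregressive fit via Burg's recursion. A self-contained sketch on a synthetic noisy sinusoid (not wind turbine data):

```python
import cmath
import math
import random

def burg_ar(x, order):
    """Burg recursion: returns AR coefficients (a[0] = 1) and white-noise
    variance e, the maximum-entropy model of the series."""
    n = len(x)
    f, b = list(x), list(x)          # forward / backward prediction errors
    a = [1.0]
    e = sum(v * v for v in x) / n
    for m in range(order):
        num = -2.0 * sum(f[i] * b[i - 1] for i in range(m + 1, n))
        den = sum(f[i] ** 2 + b[i - 1] ** 2 for i in range(m + 1, n))
        k = num / den                # reflection coefficient, |k| <= 1
        a = a + [0.0]
        a = [a[i] + k * a[m + 1 - i] for i in range(m + 2)]
        e *= 1.0 - k * k
        f_old = f[:]
        for i in range(n - 1, m, -1):
            f[i] = f_old[i] + k * b[i - 1]
            b[i] = b[i - 1] + k * f_old[i]
    return a, e

def mem_power(a, e, freq):
    """Maximum entropy power at a normalized frequency (cycles/sample)."""
    h = sum(ak * cmath.exp(-2j * math.pi * freq * j) for j, ak in enumerate(a))
    return e / abs(h) ** 2

random.seed(0)
x = [math.sin(2.0 * math.pi * 0.1 * t) + 0.05 * random.gauss(0.0, 1.0)
     for t in range(200)]
a, e = burg_ar(x, order=4)
freqs = [i / 1000.0 for i in range(1, 500)]
peak = max(freqs, key=lambda fr: mem_power(a, e, fr))
print(peak)   # a sharp spectral line close to 0.1 cycles/sample
```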



Practical guidelines to select and scale earthquake records for nonlinear response history analysis of structures  

USGS Publications Warehouse

Earthquake engineering practice is increasingly using nonlinear response history analysis (RHA) to demonstrate performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. Presented herein is a modal-pushover-based scaling (MPS) method to scale ground motions for use in nonlinear RHA of buildings and bridges. In the MPS method, the ground motions are scaled to match (to a specified tolerance) a target value of the inelastic deformation of the first-'mode' inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-'mode' pushover analysis. Appropriate for first-'mode' dominated structures, this approach is extended for structures with significant contributions of higher modes by considering elastic deformation of the second-'mode' SDF system in selecting a subset of the scaled ground motions. Based on results presented for two bridges, covering single- and multi-span 'ordinary standard' bridge types, and six buildings, covering low-, mid-, and tall building types in California, the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.

Kalkan, Erol; Chopra, Anil K.
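In the elastic limit, SDF response scales linearly with the record, so a single response analysis fixes the scale factor; the inelastic first-'mode' system of the MPS method would need an iterative search instead. A sketch with a hypothetical synthetic record and Newmark average-acceleration integration:

```python
import math

def sdf_peak_deformation(ag, dt, period, damping=0.05):
    """Peak deformation of a linear unit-mass SDF oscillator under ground
    acceleration ag (m/s^2), by Newmark average-acceleration integration."""
    w = 2.0 * math.pi / period
    c, k = 2.0 * damping * w, w * w
    u, v, a = 0.0, 0.0, -ag[0]             # initial conditions, p = -m*ag
    kh = k + 4.0 / dt ** 2 + 2.0 * c / dt  # effective stiffness
    peak = 0.0
    for i in range(1, len(ag)):
        dp = -(ag[i] - ag[i - 1])
        dph = dp + (4.0 / dt + 2.0 * c) * v + 2.0 * a
        du = dph / kh
        dv = 2.0 * du / dt - 2.0 * v
        da = 4.0 * (du - v * dt) / dt ** 2 - 2.0 * a
        u, v, a = u + du, v + dv, a + da
        peak = max(peak, abs(u))
    return peak

# Hypothetical record: decaying 2 Hz sine pulse sampled at 0.01 s
dt = 0.01
rec = [0.4 * math.exp(-0.2 * i * dt) * math.sin(2.0 * math.pi * 2.0 * i * dt)
       for i in range(2000)]

target = 0.05   # target deformation (m) of the elastic SDF system, T = 1 s
factor = target / sdf_peak_deformation(rec, dt, period=1.0)
print(round(factor, 3))
```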



Moving environmental DNA methods from concept to practice for monitoring aquatic macroorganisms  

USGS Publications Warehouse

The discovery that macroorganisms can be detected from their environmental DNA (eDNA) in aquatic systems has immense potential for the conservation of biological diversity. This special issue contains 11 papers that review and advance the field of eDNA detection of vertebrates and other macroorganisms, including studies of eDNA production, transport, and degradation; sample collection and processing to maximize detection rates; and applications of eDNA for conservation using citizen scientists. This body of work is an important contribution to the ongoing efforts to take eDNA detection of macroorganisms from technical breakthrough to established, reliable method that can be used in survey, monitoring, and research applications worldwide. While the rapid advances in this field are remarkable, important challenges remain, including consensus on best practices for collection and analysis, understanding of eDNA diffusion and transport, and avoidance of inhibition in sample collection and processing. Nonetheless, as demonstrated in this special issue, eDNA techniques for research and monitoring are beginning to realize their potential for contributing to the conservation of biodiversity globally.

Goldberg, Caren S.; Strickler, Katherine M.; Pilliod, David S.



Impact of pedagogical method on Brazilian dental students' waste management practice.  


The purpose of this study was to conduct a qualitative analysis of waste management practices among a group of Brazilian dental students (n=64) before and after implementing two different pedagogical methods: 1) the students attended a two-hour lecture based on World Health Organization standards; and 2) the students applied the lessons learned in an organized group setting aimed toward raising their awareness about socioenvironmental issues related to waste. All eligible students participated, and the students' learning was evaluated through their answers to a series of essay questions, which were quantitatively measured. Afterwards, the impact of the pedagogical approaches was compared by means of qualitative categorization of wastes generated in clinical activities. Waste categorization was performed for a period of eight consecutive days, both before and thirty days after the pedagogical strategies. In the written evaluation, 80 to 90 percent of the students' answers were correct. The qualitative assessment revealed a high frequency of incorrect waste disposal with a significant increase of incorrect disposal inside general and infectious waste containers (p<0.05). Although the students' theoretical learning improved, it was not enough to change behaviors established by cultural values or to encourage the students to adequately segregate and package waste material. PMID:25362694

Victorelli, Gabriela; Flório, Flávia Martão; Ramacciato, Juliana Cama; Motta, Rogério Heládio Lopes; de Souza Fonseca Silva, Almenara



Optimisation of Lime-Soda process parameters for reduction of hardness in aqua-hatchery practices using Taguchi methods.  


This paper presents the optimisation of Lime-Soda process parameters for the reduction of hardness in aqua-hatchery practices in the context of M. rosenbergii. Fresh water used in the development of fisheries needs to be of suitable quality, and the lack of desirable quality in available fresh water is generally the limiting constraint. On the Indian subcontinent, groundwater is the only source of raw water, with varying degrees of hardness, and is thus unsuitable for fresh water prawn hatchery practices (M. rosenbergii). In order to make use of hard water in the context of aqua-hatchery, the Lime-Soda process has been recommended. The efficacy of the various process parameters, such as lime, soda ash, and detention time, on the reduction of hardness needs to be examined. This paper proposes to determine the parameter settings for the CIFE well water, which is very hard, by using the Taguchi experimental design method. Taguchi orthogonal arrays, the signal-to-noise ratio, and the analysis of variance (ANOVA) have been applied to determine the dosages and to analyse their effect on hardness reduction. The tests carried out with optimal levels of Lime-Soda process parameters confirmed the efficacy of the Taguchi optimisation method. Emphasis has been placed on optimisation of the chemical doses required to reduce total hardness using the Taguchi method and ANOVA, to suit the available raw water quality for aqua-hatchery practices, especially for the fresh water prawn M. rosenbergii. PMID:24749379

Yavalkar, S P; Bhole, A G; Babu, P V Vijay; Prakash, Chandra
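The Taguchi machinery referred to here (orthogonal arrays, S/N ratios, factor-level means) is easy to sketch; the L4 array and jar-test numbers below are hypothetical, not the CIFE data:

```python
import math

def sn_smaller_is_better(results):
    """Taguchi S/N ratio for a smaller-the-better response such as
    residual hardness: SN = -10 * log10(mean of y^2)."""
    return -10.0 * math.log10(sum(y * y for y in results) / len(results))

# Hypothetical L4 orthogonal array: (lime level, soda ash level) per run
l4 = [(1, 1), (1, 2), (2, 1), (2, 2)]
# Residual hardness (mg/L as CaCO3) from two replicate jar tests per run
trials = [[180, 176], [150, 154], [120, 118], [95, 99]]

sn = [sn_smaller_is_better(t) for t in trials]

# Main effect of a factor level = mean S/N over the runs at that level;
# the level with the higher S/N is the better setting
for col, name in enumerate(["lime", "soda ash"]):
    for level in (1, 2):
        mean_sn = sum(s for run, s in zip(l4, sn) if run[col] == level) / 2.0
        print(name, "level", level, "mean S/N", round(mean_sn, 2))
```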



Practical Implementation of New Particle Tracking Method to the Real Field of Groundwater Flow and Transport  

PubMed Central

In articles published in 2009 and 2010, Suk and Yeh reported the development of an accurate and efficient particle tracking algorithm for simulating a path line under complicated unsteady flow conditions, using a range of elements within finite elements in multidimensions. Here two examples, an aquifer storage and recovery (ASR) example and a landfill leachate migration example, are examined to enhance the practical implementation of the proposed particle tracking method, known as Suk's method, to a real field of groundwater flow and transport. Results obtained by Suk's method are compared with those obtained by Pollock's method. Suk's method produces superior tracking accuracy, which suggests that Suk's method can describe more accurately various advection-dominated transport problems in a real field than existing popular particle tracking methods, such as Pollock's method. To illustrate the wide and practical applicability of Suk's method to random-walk particle tracking (RWPT), the original RWPT has been modified to incorporate Suk's method. Performance of the modified RWPT using Suk's method is compared with the original RWPT scheme by examining the concentration distributions obtained by the modified RWPT and the original RWPT under complicated transient flow systems. PMID:22476629

Suk, Heejun
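Pollock's method, used as the baseline above, tracks a particle semi-analytically through a cell in which each velocity component varies linearly between opposite faces. A sketch restricted, for brevity, to flow toward the +x/+y faces (all face velocities positive); the cell geometry and velocities are hypothetical:

```python
import math

def pollock_exit(x, y, cell, vx1, vx2, vy1, vy2):
    """Semi-analytical (Pollock-style) tracking through one rectangular
    cell. Returns (exit time, exit x, exit y). Assumes flow toward the
    +x and +y faces, i.e. all face velocities > 0."""
    x0, y0, dx, dy = cell
    ax = (vx2 - vx1) / dx            # linear velocity gradient in x
    ay = (vy2 - vy1) / dy
    vxp = vx1 + ax * (x - x0)        # velocity at the particle position
    vyp = vy1 + ay * (y - y0)

    def exit_time(a, v_face, v_p, dist):
        if abs(a) < 1e-12:           # uniform velocity: t = distance / v
            return dist / v_p
        return math.log(v_face / v_p) / a

    tx = exit_time(ax, vx2, vxp, x0 + dx - x)
    ty = exit_time(ay, vy2, vyp, y0 + dy - y)
    t = min(tx, ty)                  # particle leaves through nearest face

    def position(a, v1, v_p, p0, p, t):
        if abs(a) < 1e-12:
            return p + v_p * t
        return p0 + (v_p * math.exp(a * t) - v1) / a

    return t, position(ax, vx1, vxp, x0, x, t), position(ay, vy1, vyp, y0, y, t)

# Uniform-in-y, accelerating-in-x flow through a 10 m x 10 m cell
t, xe, ye = pollock_exit(0.0, 5.0, (0.0, 0.0, 10.0, 10.0),
                         vx1=1.0, vx2=2.0, vy1=0.5, vy2=0.5)
print(round(t, 3), xe, ye)
```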



Self-Assessment Methods in Writing Instruction: A Conceptual Framework, Successful Practices and Essential Strategies  

ERIC Educational Resources Information Center

Student writing achievement is essential to lifelong learner success, but supporting writing can be challenging for teachers. Several large-scale analyses of publications on writing have called for further study of instructional methods, as the current literature does not sufficiently address the need to support best teaching practices.…

Nielsen, Kristen



Communities of Practice: A Research Paradigm for the Mixed Methods Approach  

ERIC Educational Resources Information Center

The mixed methods approach has emerged as a "third paradigm" for social research. It has developed a platform of ideas and practices that are credible and distinctive and that mark the approach out as a viable alternative to quantitative and qualitative paradigms. However, there are also a number of variations and inconsistencies within the mixed…

Denscombe, Martyn



What Informs Practice and What Is Valued in Corporate Instructional Design? A Mixed Methods Study  

ERIC Educational Resources Information Center

This study used a two-phased explanatory mixed-methods design to explore in-depth what factors are perceived by Instructional Design and Technology (IDT) professionals as impacting instructional design practice, how these factors are valued in the field, and what differences in perspectives exist between IDT managers and non-managers. For phase 1…

Thompson-Sellers, Ingrid N.



A practical application of the finite element method and parallel computing to modeling UTP cable  

Microsoft Academic Search

This paper presents a practical application of the finite element method (FEM) in modeling the unshielded twisted pair (UTP) cable. The FEM was applied with a massively parallel processing (MPP) IBM RS/6000 SP computer. Three programs are presented: one for preprocessing, one for processing, and the other for post-processing.

J. E. F. Freitas; P. R. G. Alves




Microsoft Academic Search

Leaching of applied agricultural chemicals is a process which must be fully understood if we are to reduce agriculture's impact on the environment. A Best Management Practices (BMP) project was initiated in 1989 near Oakes, ND. A primary objective of the study was to develop and evaluate methods for measuring leachate losses from a corn (Zea mays L.) root zone.

Nathan E. Derby; Raymond E. Knighton; Dean D. Steele; Bruce R. Montgomery


Connecting Practice, Theory and Method: Supporting Professional Doctoral Students in Developing Conceptual Frameworks  

ERIC Educational Resources Information Center

From an instrumental view, conceptual frameworks that are carefully assembled from existing literature in Educational Technology and related disciplines can help students structure all aspects of inquiry. In this article we detail how the development of a conceptual framework that connects theory, practice and method is scaffolded and facilitated…

Kumar, Swapna; Antonenko, Pavlo



Trusting the Method: An Ethnographic Search for Policy in Practice in an Australian Primary School  

ERIC Educational Resources Information Center

The apparent simplicity of ethnographic methods--studying people in their normal life setting, going beyond what might be said in surveys and interviews to observe everyday practices--is deceptive. Anthropological knowledge is gained through fieldwork and through pursuing a reflexive flexible approach. This study carried out in a non-government…

Robinson, Sarah



Application of a practical method for the isocenter point in vivo dosimetry by a transit signal  

Microsoft Academic Search

This work reports the results of the application of a practical method to determine the in vivo dose at the isocenter point, Diso, of brain, thorax and pelvic treatments using a transit signal St. The use of a stable detector for the measurement of the signal St (obtained by the x-ray beam transmitted through the patient) reduces many of the

Angelo Piermattei; Andrea Fidanzio; Luigi Azario; Luca Grimaldi; Guido D'Onofrio; Savino Cilla; Gerardina Stimato; Diego Gaudino; Sara Ramella; Rolando D'Angelillo; Francesco Cellini; Lucio Trodella; Aniello Russo; Luciano Iadanza; Sergio Zucca; Vincenzo Fusco; Nicola Di Napoli; Maria Antonietta Gambacorta; Mario Balducci; Numa Cellini; Francesco Deodato; Gabriella Macchia; Alessio G. Morganti



A practical method for calibrating a coaxial noise source with a waveguide standard  

Microsoft Academic Search

A practical method is given for calibrating a coaxial noise source with a waveguide standard without directly evaluating the electrical characteristics of the coax-waveguide adaptor. It is shown that the measurement equation can be simplified when one of the standards is of the room-temperature type and the effective temperature of the radiometer when looking inside from the input port is

Yoshihiko Kato; Ichiro Yokoshima




Microsoft Academic Search

Recent evaluations of neutron cross section covariances in the resolved resonance region reveal the need for further research in this area. Major issues include declining uncertainties in multigroup representations and proper treatment of scattering radius uncertainty. To address these issues, the present work introduces a practical method based on kernel approximation using resonance parameter uncertainties from the Atlas of Neutron

Y. S. Cho; P. Oblozinsky; S. F. Mughabghab; C. M. Mattoon; M. Herman



Practicality-Based Probabilistic Roadmaps Method Jing Yang, Patrick Dymond and Michael Jenkin  

E-print Network

: {jyang, jenkin, dymond} Abstract--Probabilistic roadmap methods (PRMs) are a commonly used approach to path planning problems in a high-dimensional search space. Although PRMs can often find a variant of PRMs that addresses the practicality problem of the paths found by the planner. A simple

Jenkin, Michael R. M.


Passive Sampling Methods for Contaminated Sediments: Practical Guidance for Selection, Calibration, and Implementation  

EPA Science Inventory

This article provides practical guidance on the use of passive sampling methods(PSMs) that target the freely dissolved concentration (Cfree) for improved exposure assessment of hydrophobic organic chemicals in sediments. Primary considerations for selecting a PSM for a specific a...


Comparative analysis of the spatial analysis methods for hotspot identification.  


Spatial analysis technique has been introduced as an innovative approach for hazardous road segments identification (HRSI). In this study, the performance of two spatial analysis methods and four conventional methods for HRSI was compared against three quantitative evaluation criteria. The spatial analysis methods considered in this study include the local spatial autocorrelation method and the kernel density estimation (KDE) method. It was found that the empirical Bayesian (EB) method and the KDE method outperformed other HRSI approaches. By transferring the kernel density function into a form that was analogous to the form of the EB function, we further proved that the KDE method can eventually be considered a simplified version of the EB method in which crashes reported at neighboring spatial units are used as the reference population for estimating the EB-adjusted crashes. Theoretically, the KDE method may outperform the EB method in HRSI when the neighboring spatial units provide more useful information on the expected crash frequency than a safety performance function does. PMID:24530515

Yu, Hao; Liu, Pan; Chen, Jun; Wang, Hao
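The EB and KDE estimators compared in this study can be sketched side by side; the segment counts, SPF predictions, and overdispersion parameter below are hypothetical:

```python
import math

def eb_adjusted(observed, spf_mu, k):
    """Hauer-style empirical Bayes estimate: w*mu + (1-w)*observed with
    w = 1/(1 + k*mu); mu is the SPF prediction, k its overdispersion."""
    w = 1.0 / (1.0 + k * spf_mu)
    return w * spf_mu + (1.0 - w) * observed

def kde_density(point, crashes, bandwidth):
    """Gaussian kernel density at a chainage: nearby crashes lend weight,
    playing the role of the EB reference population."""
    return sum(math.exp(-0.5 * ((point - c) / bandwidth) ** 2)
               for c in crashes) / (bandwidth * math.sqrt(2.0 * math.pi))

# Hypothetical 1-km segments: observed crashes and SPF predictions
observed = [12, 3, 8, 2]
spf_mu = [5.0, 4.0, 5.5, 3.0]
eb = [eb_adjusted(o, m, k=0.2) for o, m in zip(observed, spf_mu)]

# Crash chainages (km) along the same route, evaluated at segment centers
crashes = [0.2, 0.4, 0.5, 2.4, 2.5, 2.6, 2.7]
kde = [kde_density(s + 0.5, crashes, bandwidth=0.3) for s in range(4)]

print([round(v, 2) for v in eb])   # EB-adjusted segment frequencies
print(max(range(4), key=lambda s: kde[s]))  # segment ranked highest by KDE
```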



A Practical Test Method for Mode I Fracture Toughness of Adhesive Joints with Dissimilar Substrates  

SciTech Connect

A practical test method for determining the mode I fracture toughness of adhesive joints with dissimilar substrates will be discussed. The test method is based on the familiar Double Cantilever Beam (DCB) specimen geometry, but overcomes limitations in existing techniques that preclude their use when testing joints with dissimilar substrates. The test method is applicable to adhesive joints where the two bonded substrates have different flexural rigidities due to geometric and/or material considerations. Two specific features discussed are the use of backing beams to prevent substrate damage and a compliance matching scheme to achieve symmetric loading conditions. The procedure is demonstrated on a modified DCB specimen comprised of SRIM composite and thin-section, e-coat steel substrates bonded with an epoxy adhesive. Results indicate that the test method provides a practical means of characterizing the mode I fracture toughness of joints with dissimilar substrates.

Boeman, R.G.; Erdman, D.L.; Klett, L.B.; Lomax, R.D.



Practical method for evaluating the sound field radiated from a waveguide.  


This letter presents a simple and practical method for evaluating the sound field radiated from a waveguide. By using the proposed method, detailed information about the radiated sound field can be obtained by measuring the sound field in the mouth of the baffled waveguide. To examine this method's effectiveness, the radiated sound pressure distribution in space was first evaluated by using the proposed method, and then it was measured directly for comparison. Experiments using two different waveguides showed good agreement between the evaluated and the measured radiated sound pressure distributions. PMID:25618097

Feng, Xuelei; Shen, Yong; Chen, Simiao; Zhao, Ye



A Practical Guide to Data Analysis for Physical Science Students  

Microsoft Academic Search

This textbook is intended for undergraduates who are carrying out laboratory experiments in the physical sciences for the first time. It is a practical guide on how to analyze data and estimate errors. The necessary formulas for performing calculations are given, and the ideas behind them are explained, although this is not a formal text on statistics. Specific examples are

Louis Lyons



Professional Learning in Rural Practice: A Sociomaterial Analysis  

ERIC Educational Resources Information Center

Purpose: This paper aims to examine the professional learning of rural police officers. Design/methodology/approach: This qualitative case study involved interviews and focus groups with 34 police officers in Northern Scotland. The interviews and focus groups were transcribed and analysed, drawing on practice-based and sociomaterial learning…

Slade, Bonnie



Analysis of factors influencing project cost estimating practice  

Microsoft Academic Search

Although extensive research has been undertaken on factors influencing the decision to tender and mark-up and tender price determination for construction projects, very little of this research contains information appropriate to the factors involved in costing construction projects. The object of this study was to gain an understanding of the factors influencing contractors' cost estimating practice. This was achieved through

Akintola Akintoye



Comparison and cost analysis of drinking water quality monitoring requirements versus practice in seven developing countries.  


Drinking water quality monitoring programs aim to support provision of safe drinking water by informing water quality management. Little evidence or guidance exists on best monitoring practices for low resource settings. Lack of financial, human, and technological resources reduce a country's ability to monitor water supply. Monitoring activities were characterized in Cambodia, Colombia, India (three states), Jordan, Peru, South Africa, and Uganda according to water sector responsibilities, monitoring approaches, and marginal cost. The seven study countries were selected to represent a range of low resource settings. The focus was on monitoring of microbiological parameters, such as E. coli, coliforms, and H2S-producing microorganisms. Data collection involved qualitative and quantitative methods. Across seven study countries, few distinct approaches to monitoring were observed, and in all but one country all monitoring relied on fixed laboratories for sample analysis. Compliance with monitoring requirements was highest for operational monitoring of large water supplies in urban areas. Sample transport and labor for sample collection and analysis together constitute approximately 75% of marginal costs, which exclude capital costs. There is potential for substantive optimization of monitoring programs by considering field-based testing and by fundamentally reconsidering monitoring approaches for non-piped supplies. This is the first study to look quantitatively at water quality monitoring practices in multiple developing countries. PMID:25046632

Crocker, Jonny; Bartram, Jamie



Determinants of the range of drugs prescribed in general practice: a cross-sectional analysis  

PubMed Central

Background: Current health policies assume that prescribing is more efficient and rational when general practitioners (GPs) work with a formulary or restricted drugs lists and thus with a limited range of drugs. Therefore we studied determinants of the range of drugs prescribed by general practitioners, distinguishing general GP-characteristics, characteristics of the practice setting, characteristics of the patient population and information sources used by GPs. Methods: Secondary analysis was carried out on data from the Second Dutch Survey in General Practice. Data were available for 138 GPs working in 93 practices. ATC-coded prescription data from electronic medical records, census data and data from GP/practice questionnaires were analyzed with multilevel techniques. Results: The average GP writes prescriptions for 233 different drugs, i.e. 30% of the available drugs on the market within one year. There is considerable variation between ATC main groups and subgroups and between GPs. GPs with larger patient lists, GPs with higher prescribing volumes and GPs who frequently receive representatives from the pharmaceutical industry have a broader range when controlled for other variables. Conclusion: The range of drugs prescribed is a useful instrument for analysing GPs' prescribing behaviour. It shows both variation between GPs and between therapeutic groups. Statistically significant relationships found were in line with the hypotheses formulated, like the one concerning the influence of the industry. Further research should be done into the relationship between the range and quality of prescribing and the reasons why some GPs prescribe a greater number of different drugs than others. PMID:17711593

de Bakker, Dinny H; Coffie, Dayline SV; Heerdink, Eibert R; van Dijk, Liset; Groenewegen, Peter P



Empirical Analysis of Green Supply Chain Management Practices in Indian Automobile Industry  

NASA Astrophysics Data System (ADS)

Environmental sustainability and green environmental issues have an increasing popularity among researchers and supply chain practitioners. An attempt has been made to identify and empirically analyze green supply chain management (GSCM) practices in Indian automobile industry. Six main GSCM practices (having 37 sub practices) and four expected performance outcomes (having 16 performances) have been identified by implementing GSCM practices from literature review. Questionnaire based survey has been made to validate these practices and performance outcomes. 123 complete questionnaires were collected from Indian automobile organizations and used for empirical analysis of GSCM practices in Indian automobile industry. Descriptive statistics have been used to know current implementation status of GSCM practices in Indian automobile industry and multiple regression analysis has been carried out to know the impact on expected organizational performance outcomes by current GSCM practices adopted by Indian automobile industry. The results of study suggested that environmental, economic, social and operational performances improve with the implementation of GSCM practices. This paper may play an important role to understand various GSCM implementation issues and help practicing managers to improve their performances in the supply chain.

Luthra, S.; Garg, D.; Haleem, A.



Reporting Practices in Confirmatory Factor Analysis: An Overview and Some Recommendations  

ERIC Educational Resources Information Center

Reporting practices in 194 confirmatory factor analysis studies (1,409 factor models) published in American Psychological Association journals from 1998 to 2006 were reviewed and compared with established reporting guidelines. Three research questions were addressed: (a) how do actual reporting practices compare with published guidelines? (b) how…

Jackson, Dennis L.; Gillaspy, J. Arthur, Jr.; Purc-Stephenson, Rebecca



Infant-Feeding Practices Among African American Women: Social-Ecological Analysis and Implications for Practice.  


Despite extensive evidence supporting the health benefits of breastfeeding, significant disparities exist between rates of breastfeeding among African American women and women of other races. Increasing rates of breastfeeding among African American women can contribute to the improved health of the African American population by decreasing rates of infant mortality and disease and by enhancing cognitive development. Additionally, higher rates of breastfeeding among African American women could foster maternal-child bonding and could contribute to stronger families, healthier relationships, and emotionally healthier adults. The purpose of this article is twofold: (a) to use the social-ecological model to explore the personal, socioeconomic, psychosocial, and cultural factors that affect the infant feeding decision-making processes of African American women and (b) to discuss the implications of these findings for clinical practice and research to eliminate current disparities in rates of breastfeeding. PMID:24810518

Reeves, Elizabeth A; Woods-Giscombé, Cheryl L



Coupled inlet-engine dynamic analysis method  

NASA Astrophysics Data System (ADS)

A new method is presented for unsteady analysis of turbine engine propulsion systems. The method is a coupled analysis of the inlet-compressor combination with multidimensional inlet capability. The method incorporates inviscid, unsteady, computational fluid dynamics in the inlet using an unstructured numerical grid and a one-dimensional dynamic turbomachinery model. The present application of the method is an axisymmetric mixed compression inlet with an eight stage axial compressor of a turbojet engine. The inlet simulation includes geometric details not previously included in strictly one-dimensional analyses. Simulations of two events are compared to experimental data for Mach 2.5 freestream conditions. The first event is an inlet unstart triggered by bypass flow throttling, resulting in a subsequent compressor stall. The second event is a compressor stall triggered by nozzle throttling resulting in a subsequent inlet unstart and strong hammershock. Good agreement with the experiment is achieved in both cases.

Numbers, Keith E.


Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis  

PubMed Central

Neither practitioners nor scientists appear to be fully satisfied with the world's largest behavior-analytic membership organization. Each community appears to believe that initiatives that serve the other will undermine the association's capacity to serve their own needs. Historical examples suggest that such discord is predicted when practitioners and scientists cohabit the same association. This is true because all professional associations exist to address guild interests, and practice and science are different professions with different guild interests. No association, therefore, can succeed in being all things to all people. The solution is to assure that practice and science communities are well served by separate professional associations. I comment briefly on how this outcome might be promoted. PMID:22532750

Critchfield, Thomas S



Practical Blended Taint Analysis for JavaScript Shiyi Wei and Barbara G. Ryder  

E-print Network

Practical Blended Taint Analysis for JavaScript Shiyi Wei and Barbara G. Ryder Department of Computer Science Virginia Tech, USA {wei, ryder} ABSTRACT JavaScript is widely used in Web analysis, an instantiation of our general-purpose analysis framework for JavaScript, to illustrate how

Ryder, Barbara G.


Language Ideology or Language Practice? An Analysis of Language Policy Documents at Swedish Universities  

ERIC Educational Resources Information Center

This article presents an analysis and interpretation of language policy documents from eight Swedish universities with regard to intertextuality, authorship and content analysis of the notions of language practices and English as a lingua franca (ELF). The analysis is then linked to Spolsky's framework of language policy, namely language…

Björkman, Beyza



Soil Chemical Analysis COURSE DESCRIPTION: Principles and practices used in analytical laboratories for  

E-print Network

Soil Chemical Analysis SWS 5424C COURSE DESCRIPTION: Principles and practices used in analytical laboratories. Course topics include theory of spectroscopy (UV-Vis, IR, NMR, MS); theory of elemental and ion analysis of C, H, N; and basic laboratory mathematics (dilutions, significant figures, dimensional analysis, MS Excel)

Ma, Lena



EPA Science Inventory

A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (2) watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...


Analysis of Best Hydraulic Fracturing Practices in the Golden Trend Fields of Oklahoma Shahab D. Mohaghegh, West Virginia University  

E-print Network

Analysis of Best Hydraulic Fracturing Practices in the Golden Trend Fields of Oklahoma Shahab D best practices analysis methodology. The study was performed for gas and oil bearing formations. Among to perform the best practices analysis on the Golden Trend fields of Oklahoma is presented. CONCLUSIONS Wells

Mohaghegh, Shahab


Markov Chain Monte Carlo Linkage Analysis Methods  

Microsoft Academic Search

As alluded to in the chapter "Linkage Analysis of Qualitative Traits", neither the Elston-Stewart algorithm nor the Lander-Green approach is amenable to genetic data from large complex pedigrees and a large number of markers. In such cases, Monte Carlo estimation methods provide a viable alternative to the exact solutions. Two types of Monte Carlo methods have been developed for linkage

Robert P. Igo; Yuqun Luo; Shili Lin
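The Monte Carlo estimation the abstract alludes to can be illustrated with a generic Metropolis sampler, the simplest MCMC building block. This is a minimal sketch of the idea on a toy target density, not the linkage-analysis software itself; all names and values here are illustrative:

```python
import math
import random

def metropolis(log_target, x0, n_samples, step=1.0, seed=0):
    """Generic Metropolis sampler: draws from a density known only
    up to a constant via its log-density `log_target`."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, target(proposal)/target(x))
        if math.log(rng.random() + 1e-300) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

# Toy "posterior": a standard normal log-density.
log_normal = lambda x: -0.5 * x * x
draws = metropolis(log_normal, x0=3.0, n_samples=20000)
burned = draws[2000:]  # discard burn-in
mean_est = sum(burned) / len(burned)
var_est = sum((d - mean_est) ** 2 for d in burned) / len(burned)
```

In pedigree applications the scalar state is replaced by a configuration of latent inheritance variables, but the accept/reject mechanics are the same.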


Iterative methods for design sensitivity analysis  

NASA Technical Reports Server (NTRS)

A numerical method is presented for design sensitivity analysis, using an iterative-method reanalysis of the structure generated by a small perturbation in the design variable; a forward-difference scheme is then employed to obtain the approximate sensitivity. Algorithms are developed for displacement and stress sensitivity, as well as for eigenvalue and eigenvector sensitivity, and the iterative schemes are modified so that the coefficient matrices are constant and therefore decomposed only once.

Belegundu, A. D.; Yoon, B. G.
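The reanalysis-plus-forward-difference idea reduces, in the simplest case, to differencing two solutions of the structural equations. A minimal sketch on a one-degree-of-freedom axial bar (all values hypothetical), where the closed-form derivative is available for comparison:

```python
def displacement(area, force=1000.0, length=2.0, modulus=200e9):
    """Axial bar: u = F*L / (E*A). Stands in for a full structural
    (re)analysis at a given value of the design variable."""
    return force * length / (modulus * area)

def fd_sensitivity(f, x, rel_step=1e-6):
    """Forward-difference derivative df/dx using a relative step:
    one baseline analysis plus one perturbed reanalysis."""
    h = rel_step * x
    return (f(x + h) - f(x)) / h

A = 1e-4  # design variable: cross-sectional area (m^2)
approx = fd_sensitivity(displacement, A)
exact = -1000.0 * 2.0 / (200e9 * A**2)   # analytic du/dA = -F*L/(E*A^2)
rel_err = abs(approx - exact) / abs(exact)
```

In the paper's setting the perturbed solution comes from an iterative reanalysis with a constant, once-decomposed coefficient matrix; here the exact solve simply stands in for that step.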



Design analysis, robust methods, and stress classification  

SciTech Connect

This special edition publication volume is comprised of papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25--29, 1993 in Denver, Colorado. The papers were prepared for presentations in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

Bees, W.J. (ed.)



Probabilistic structural analysis methods development for SSME  

NASA Technical Reports Server (NTRS)

The development of probabilistic structural analysis methods is a major part of the SSME Structural Durability Program and consists of three program elements: composite load spectra, probabilistic finite element structural analysis, and probabilistic structural analysis applications. Recent progress includes: (1) the effects of the uncertainties of several factors on the HPFP blade temperature, pressure, and torque, (2) the evaluation of the cumulative distribution function of structural response variables based on assumed uncertainties on primitive structural variables, and (3) evaluation of the failure probability. Collectively, the results obtained demonstrate that the structural durability of critical SSME components can be probabilistically evaluated.

Chamis, C. C.; Hopkins, D. A.



The evolution of nursing in Australian general practice: a comparative analysis of workforce surveys ten years on  

PubMed Central

Background Nursing in Australian general practice has grown rapidly over the last decade in response to government initiatives to strengthen primary care. There are limited data about how this expansion has impacted the nursing role, scope of practice and workforce characteristics. This study aimed to describe the current demographic and employment characteristics of Australian nurses working in general practice and explore trends in their role over time. Methods In the nascence of the expansion of the role of nurses in Australian general practice (2003–2004) a national survey was undertaken to describe nurse demographics, clinical roles and competencies. This survey was repeated in 2009–2010 and comparative analysis of the datasets undertaken to explore workforce changes over time. Results Two hundred eighty-four nurses employed in general practice completed the first survey (2003/04) and 235 completed the second survey (2009/10). Significantly more participants in Study 2 were undertaking follow-up of pathology results, physical assessment and disease specific health education. There was also a statistically significant increase in the participants who felt that further education/training would augment their confidence in all clinical tasks (p < 0.05). While perceived barriers to nursing in general practice decreased overall between the two time points, more participants perceived lack of space, job descriptions, confidence to negotiate with general practitioners and personal desire to enhance their role as barriers. Access to education and training as a facilitator to nursing role expansion increased between the two studies. The level of optimism of participants for the future of the nurses’ role in general practice was slightly decreased over time. Conclusions This study has identified that some of the structural barriers to nursing in Australian general practice have been addressed over time. However, it also identifies continuing barriers that impact practice nurse role development. 
Understanding and addressing these issues is vital to optimise the effectiveness of the primary care nursing workforce. PMID:24666420



Practical hyperdynamics method for systems with large changes in potential energy.  


A practical hyperdynamics method is proposed to accelerate systems with highly endothermic and exothermic reactions such as hydrocarbon pyrolysis and oxidation reactions. In this method, referred to as the "adaptive hyperdynamics (AHD) method," the bias potential parameters are adaptively updated according to the change in potential energy. The approach is intensively examined for JP-10 (exo-tetrahydrodicyclopentadiene) pyrolysis simulations using the ReaxFF reactive force field. Valid boost parameter ranges are clarified as a result. It is shown that AHD can be used to model pyrolysis at temperatures as low as 1000 K while achieving a boost factor of around 10^5. PMID:25527921

Hirai, Hirotoshi
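The boost factor of around 10^5 quoted in the abstract follows from the standard hyperdynamics clock: each biased MD step advances physical time by exp(dV/kT), where dV is the bias potential felt at that step. A sketch of that bookkeeping (Voter's boost formula; the trajectory values are invented, and this does not reproduce AHD's adaptive parameter updates):

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant, eV/K

def boosted_time(bias_energies_eV, dt_fs, temperature_K):
    """Accumulated physical time in hyperdynamics: each MD step of
    length dt advances the clock by dt * exp(dV / kT)."""
    beta = 1.0 / (K_B * temperature_K)
    return sum(dt_fs * math.exp(beta * dV) for dV in bias_energies_eV)

# Hypothetical trajectory: a constant 1 eV bias at 1000 K, 1000 steps of 0.1 fs
dvs = [1.0] * 1000
t_hyper = boosted_time(dvs, dt_fs=0.1, temperature_K=1000.0)
t_md = 0.1 * 1000                # raw simulated MD time
boost = t_hyper / t_md           # = exp(1 eV / kT) at 1000 K, roughly 1e5
```

Note how a 1 eV bias at 1000 K already yields a boost on the order of 10^5, consistent with the figure reported for the 1000 K pyrolysis runs.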



Practical hyperdynamics method for systems with large changes in potential energy  

NASA Astrophysics Data System (ADS)

A practical hyperdynamics method is proposed to accelerate systems with highly endothermic and exothermic reactions such as hydrocarbon pyrolysis and oxidation reactions. In this method, referred to as the "adaptive hyperdynamics (AHD) method," the bias potential parameters are adaptively updated according to the change in potential energy. The approach is intensively examined for JP-10 (exo-tetrahydrodicyclopentadiene) pyrolysis simulations using the ReaxFF reactive force field. Valid boost parameter ranges are clarified as a result. It is shown that AHD can be used to model pyrolysis at temperatures as low as 1000 K while achieving a boost factor of around 10^5.

Hirai, Hirotoshi




NASA Astrophysics Data System (ADS)

Fatigue cracks have recently been reported at the weld root of deckplate-U rib connections in orthotropic steel deck bridges on heavy-traffic routes. These cracks propagate toward the upper surface of the deckplate, which makes them difficult to detect during visual inspection. With the purpose of developing a reliable and practical non-destructive inspection method for these cracks, this paper discusses an ultrasonic testing method using SV waves generated by a critical-angle beam probe. A reliable technique for sensitivity calibration was proposed. Based on ultrasonic testing of fatigue crack specimens and a damaged deckplate on an actual bridge, the applicability of the proposed ultrasonic inspection method was confirmed.

Murakoshi, Jun; Takahashi, Minoru; Koike, Mitsuhiro; Kimura, Tomonori


Safety and Efficacy of IV-TPA for Ischaemic Stroke in Clinical Practice – A Bayesian Analysis  

Microsoft Academic Search

Background: Observational studies of new treatments in routine practice are clinically important but may be limited by bias. We used a Bayesian approach to interpret and sequentially combine phase 4 studies of IV-TPA within 3 h for acute ischaemic stroke to quantify the cumulative evidence for the efficacy and safety of this therapy in clinical practice. Methods: Prior probability distributions…

Killian E. T. O’Rourke; Cathal D. Walsh; Peter J. Kelly
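Sequentially combining studies in a Bayesian framework can be sketched with the simplest conjugate case: a Beta prior updated by binomial outcomes, where each study's posterior becomes the prior for the next study. The event counts below are invented for illustration and are not from the cited analysis:

```python
def sequential_beta_update(studies, prior=(1.0, 1.0)):
    """Sequentially combine binomial outcomes (events, patients) from
    successive studies: each study's posterior Beta(a, b) becomes the
    prior for the next, so evidence accumulates across studies."""
    a, b = prior
    trace = []
    for events, n in studies:
        a += events
        b += n - events
        trace.append((a, b, a / (a + b)))  # running posterior mean
    return trace

# Hypothetical event counts (NOT from the cited study)
studies = [(12, 200), (30, 450), (25, 400)]
trace = sequential_beta_update(studies)
final_a, final_b, final_mean = trace[-1]
```

The actual analysis uses distributions over efficacy and safety outcomes rather than a single proportion, but the prior-to-posterior chaining is the same.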



Reform-based science teaching: A mixed-methods approach to explaining variation in secondary science teacher practice  

NASA Astrophysics Data System (ADS)

The purpose of this two-phase, sequential explanatory mixed-methods study was to understand and explain the variation seen in secondary science teachers' enactment of reform-based instructional practices. Utilizing teacher socialization theory, this mixed-methods analysis was conducted to determine the relative influence of secondary science teachers' characteristics, backgrounds and experiences across their teacher development to explain the range of teaching practices exhibited by graduates from three reform-oriented teacher preparation programs. Data for this study were obtained from the Investigating the Meaningfulness of Preservice Programs Across the Continuum of Teaching (IMPPACT) Project, a multi-university, longitudinal study funded by NSF. In the first quantitative phase of the study, data for the sample (N=120) were collected from three surveys from the IMPPACT Project database. Hierarchical multiple regression analysis was used to examine the separate as well as the combined influence of factors such as teachers' personal and professional background characteristics, beliefs about reform-based science teaching, feelings of preparedness to teach science, school context, school culture and climate of professional learning, and influences of the policy environment on the teachers' use of reform-based instructional practices. Findings indicate that three blocks of variables (professional background, beliefs/efficacy, and local school context) contributed significantly to explaining nearly 38% of the variation in secondary science teachers' use of reform-based instructional practices. The five variables that significantly contributed to explaining variation in teachers' use of reform-based instructional practices in the full model were: university of teacher preparation, sense of preparation for teaching science, the quality of professional development, science content focused professional development, and the perceived level of professional autonomy. 
Using the results from phase one, the second qualitative phase selected six case study teachers based on their levels of reform-based teaching practices to highlight teachers across the range of practices from low, average, to high levels of implementation. Using multiple interview sources, phase two helped to further explain the variation in levels of reform-based practices. Themes related to teachers' backgrounds, local contexts, and state policy environments were developed as they related to teachers' socialization experiences across these contexts. The results of the qualitative analysis identified the following factors differentiating teachers who enacted reform-based instructional practices from those who did not: 1) extensive science research experiences prior to their preservice teacher preparation; 2) the structure and quality of their field placements; 3) developing and valuing a research-based understanding of teaching and learning as a result of their preservice teacher preparation experiences; 4) the professional culture of their school context where there was support for a high degree of professional autonomy and receiving support from "educational companions" with a specific focus on teacher pedagogy to support student learning; and 5) a greater sense of agency to navigate their districts' interpretation and implementation of state polices. Implications for key stakeholders as well as directions for future research are discussed.

Jetty, Lauren E.
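The hierarchical (blockwise) regression described above amounts to entering predictor blocks cumulatively and tracking the change in R-squared each block adds. A sketch on synthetic data; the block names only mirror the study's constructs, and none of this is the IMPPACT data:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

def hierarchical_r2(blocks, y):
    """Enter predictor blocks cumulatively; report (R^2, R^2 change)
    at each step, as in hierarchical multiple regression."""
    results, prev = [], 0.0
    for k in range(1, len(blocks) + 1):
        r2 = r_squared(np.column_stack(blocks[:k]), y)
        results.append((r2, r2 - prev))
        prev = r2
    return results

# Synthetic illustration with three hypothetical predictor blocks
rng = np.random.default_rng(42)
n = 200
background = rng.normal(size=(n, 2))   # block 1: professional background
beliefs = rng.normal(size=(n, 2))      # block 2: beliefs/efficacy
context = rng.normal(size=(n, 2))      # block 3: local school context
y = background @ [0.5, 0.3] + beliefs @ [0.4, 0.0] + context @ [0.3, 0.2] \
    + rng.normal(scale=1.0, size=n)
steps = hierarchical_r2([background, beliefs, context], y)
```

Each tuple in `steps` gives the cumulative R-squared and the increment contributed by the newly entered block, which is the quantity used to judge each block's contribution.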


Particle size analysis of nanocrystals: improved analysis method.  


The influence of optical parameters, additional techniques (e.g. PIDS technology) and the importance of light microscopy were investigated by comparing laser diffraction data obtained via the conventional method and an optimized analysis method. Also the influence of a possible dissolution of nanocrystals during a measurement on the size result obtained was assessed in this study. The results reveal that dissolution occurs if unsaturated medium or microparticle saturated medium is used for the measurements. The dissolution is erratic and the results are not reproducible. Dissolution can be overcome by saturating the measuring medium prior to the measurement. If nanocrystals are analysed the dispersion medium should be saturated with the nanocrystals, because the solubility is higher than for coarse micro-sized drug material. The importance of using the optimized analysis method was proven by analysing 40 different nanosuspensions via the conventional versus the optimized sizing method. There was no large difference in the results obtained for the 40 nanosuspensions using the conventional method. This would have led to the conclusion, that all the 40 formulations investigated are physically stable. However, the analysis via the optimized method revealed that from 40 formulations investigated only four were physically stable. In conclusion an optimized analysis saves time and money and avoids misleading developments, because discrimination between "stable" and "unstable" can be done reliably at a very early stage of the development. PMID:19733647

Keck, Cornelia M



Some practical procedures in computerized thermal neutron activation analysis with Ge(Li) gamma-ray spectrometry  

Microsoft Academic Search

The practical use of a correction procedure for random coincidence losses, the determination of the detection limit and the standardization of measuring conditions are described. Special correction methods for the interference of the Cu analysis by 24Na, the burn-up of the radioactive nuclide formed and the interference of the Pt determination by Au are also given.

M. L. Verheijke




Technology Transfer Automated Retrieval System (TEKTRAN)

This chapter introduces computational methods for analysis of microarray data including gene clustering, marker gene selection, prediction of phenotypic classes, and modeling of genetic networks. As large volume and high dimensional data are being generated by the rapidly expanding microarray techno...


Integrated method for chaotic time series analysis  


Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process from monitored nonlinear data are disclosed. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.

Hively, L.M.; Ng, E.G.



Characterization of Assembly Variation Analysis Methods  

E-print Network

Master of Science thesis by Robert Cvetko, Department of Mechanical Engineering, December 1997. Committee members: D. Sorensen and Spencer P. Magleby; Graduate Coordinator: Craig Smith.


Methods for Analysis of LPI Radar Signals  

Microsoft Academic Search

LPI (low probability of intercept) radars occupy wide frequency bands and have very low peak power, so they are difficult to detect with hostile intercept receivers. Hostile radiometric receivers are not able to intercept and measure the parameters of LPI signals which lie in wide frequency bands. In this study, four different methods for the analysis of LPI signals are examined.

C. Tezel; Y. Ozkazanc



Component Analysis Methods for Computer Vision and  

E-print Network

Component Analysis Methods for Computer Vision and Pattern Recognition, Fernando De la Torre; presented at the Computer Vision and Pattern Recognition Easter School, March.

Botea, Adi


An analysis of the light changes of eclipsing variables in the frequency-domain - Practical aspects  

NASA Astrophysics Data System (ADS)

Practical aspects of the analysis of the light changes of eclipsing binary systems in the frequency domain are reviewed, and the advantages of this process over the time-domain approach are pointed out. A direct solution of the problem for the case of total eclipses is given which, in the frequency domain, can be completely algebraized, requiring no tables of any special functions. A generalization of this process to any type of eclipses is given, and ways are shown to deduce the uncertainty of the elements of the eclipses from that of the moments A(2m) of the eclipses. Methods to extend these techniques to eclipsing systems whose light changes are not limited to the times of minima alone are given, and physical processes which are likely to produce the light-curve asymmetries noted in many close binary systems are considered.

Kopal, Zdenek


Designing sociotechnical systems with cognitive work analysis: putting theory back into practice.  


Cognitive work analysis (CWA) is a framework of methods for analysing complex sociotechnical systems. However, the translation from the outputs of CWA to design is not straightforward. Sociotechnical systems theory provides values and principles for the design of sociotechnical systems which may offer a theoretically consistent basis for a design approach for use with CWA. This article explores the extent to which CWA and sociotechnical systems theory offer complementary perspectives and presents an abstraction hierarchy (AH), based on a review of literature, that describes an 'optimal' CWA and sociotechnical systems theory design system. The optimal AH is used to assess the extent to which current CWA-based design practices, uncovered through a survey of CWA practitioners, align with sociotechnical systems theory. Recommendations for a design approach that would support the integration of CWA and sociotechnical systems theory design values and principles are also derived. PMID:25407778

Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Stanton, Neville A



Methods of quantitative fire hazard analysis  

SciTech Connect

Simplified fire hazard analysis methods have been developed as part of the FIVE risk-based fire induced vulnerability evaluation methodology for nuclear power plants. These fire hazard analyses are intended to permit plant fire protection personnel to conservatively evaluate the potential for credible exposure fires to cause critical damage to essential safe-shutdown equipment and thereby screen from further analysis spaces where a significant fire hazard clearly does not exist. This document addresses the technical bases for the fire hazard analysis methods. A separate user's guide addresses the implementation of the fire screening methodology, which has been implemented with three worksheets and a number of look-up tables. The worksheets address different locations of targets relative to exposure fire sources. The look-up tables address fire-induced conditions in enclosures in terms of three stages: a fire plume/ceiling jet period, an unventilated enclosure smoke filling period and a ventilated quasi-steady period.

Mowrer, F.W. (Mowrer (Frederick W.), Adelphi, MD (United States))



Multiple predictor smoothing methods for sensitivity analysis.  

SciTech Connect

The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.

Helton, Jon Craig; Storlie, Curtis B.
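The first smoothing technique listed, locally weighted regression (LOESS), can be sketched from scratch: tricube weights over a fraction of nearest neighbours, then a weighted least-squares line evaluated at each query point. A minimal version for one predictor (not the authors' implementation, and without the stepwise variable-selection layer):

```python
import numpy as np

def loess_point(x0, x, y, frac=0.4):
    """Locally weighted linear regression (LOESS) at one point:
    tricube weights over the `frac` nearest neighbours, then a
    weighted least-squares line evaluated at x0."""
    k = max(2, int(frac * len(x)))
    d = np.abs(x - x0)
    idx = np.argsort(d)[:k]
    h = d[idx].max()
    w = (1 - (d[idx] / h) ** 3) ** 3          # tricube kernel
    X = np.column_stack([np.ones(k), x[idx]])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y[idx])
    return beta[0] + beta[1] * x0

x = np.linspace(0.0, 3.0, 60)
y = 2.0 * x + 1.0                 # a linear input-output relationship
smooth = np.array([loess_point(xi, x, y) for xi in x])
# A local *linear* smoother reproduces a linear signal exactly,
# while still being able to track nonlinear input-output relationships.
```

In a sampling-based sensitivity analysis, the strength of the fitted local trend for each input serves as a nonparametric analogue of a regression coefficient.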



Power System Transient Stability Analysis through a Homotopy Analysis Method  

SciTech Connect

As an important function of energy management systems (EMSs), online contingency analysis plays an important role in providing power system security warnings of instability. At present, N-1 contingency analysis still relies on time-consuming numerical integration. To save computational cost, the paper proposes a quasi-analytical method to evaluate transient stability through time domain periodic solutions’ frequency sensitivities against initial values. First, dynamic systems described in classical models are modified into damping free systems whose solutions are either periodic or expanded (non-convergent). Second, because the sensitivities experience sharp changes when periodic solutions vanish and turn into expanded solutions, transient stability is assessed using the sensitivity. Third, homotopy analysis is introduced to extract frequency information and evaluate the sensitivities only from initial values so that time consuming numerical integration is avoided. Finally, a simple case is presented to demonstrate application of the proposed method, and simulation results show that the proposed method is promising.

Wang, Shaobu; Du, Pengwei; Zhou, Ning



Cask crush pad analysis using detailed and simplified analysis methods  

SciTech Connect

A crush pad has been designed and analyzed to absorb the kinetic energy of a hypothetically dropped spent nuclear fuel shipping cask into a 44-ft. deep cask unloading pool at the Fluorinel and Storage Facility (FAST). This facility, located at the Idaho Chemical Processing Plant (ICPP) at the Idaho National Engineering and Environmental Laboratory (INEEL), is a US Department of Energy site. The basis for this study is an analysis by Uldrich and Hawkes. The purpose of this analysis was to evaluate various hypothetical cask drop orientations to ensure that the crush pad design was adequate and the cask deceleration at impact was less than 100 g. It is demonstrated herein that a large spent fuel shipping cask, when dropped onto a foam crush pad, can be analyzed by either hand methods or by sophisticated dynamic finite element analysis using computer codes such as ABAQUS. Results from the two methods are compared to evaluate accuracy of the simplified hand analysis approach.

Uldrich, E.D.; Hawkes, B.D.
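The hand-method side of such a comparison is essentially an energy balance: free-fall velocity from the drop height, then the crush stroke needed to keep deceleration under the 100 g limit. A sketch with hypothetical numbers (a 44-ft drop taken as 13.4 m; this is an illustration of the hand calculation, not the reported analysis):

```python
G = 9.81  # gravitational acceleration, m/s^2

def impact_velocity(drop_height_m):
    """Free-fall velocity at impact: v = sqrt(2 g h)."""
    return (2.0 * G * drop_height_m) ** 0.5

def min_crush_depth(drop_height_m, max_decel_g=100.0):
    """Uniform-deceleration hand estimate: stopping over a crush
    stroke s at constant deceleration n*g requires
    s >= v^2 / (2 n g), which simplifies to h / n."""
    v = impact_velocity(drop_height_m)
    return v * v / (2.0 * max_decel_g * G)

h = 13.4                    # hypothetical drop height, metres (44 ft)
s = min_crush_depth(h)      # minimum crush stroke for <= 100 g
```

The finite element model then checks whether the foam's actual crush response delivers at least this stroke before locking up.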



Structural correlation method for model reduction and practical estimation of patient specific parameters illustrated on heart rate regulation.  


We consider the inverse and patient specific problem of short term (seconds to minutes) heart rate regulation specified by a system of nonlinear ODEs and corresponding data. We show how a recent method termed the structural correlation method (SCM) can be used for model reduction and for obtaining a set of practically identifiable parameters. The structural correlation method includes two steps: sensitivity and correlation analysis. When combined with an optimization step, it is possible to estimate model parameters, enabling the model to fit dynamics observed in data. This method is illustrated in detail on a model predicting baroreflex regulation of heart rate and applied to analysis of data from a rat and healthy humans. Numerous mathematical models have been proposed for prediction of baroreflex regulation of heart rate, yet most of these have been designed to provide qualitative predictions of the phenomena though some recent models have been developed to fit observed data. In this study we show that the model put forward by Bugenhagen et al. [2] can be simplified without loss of its ability to predict measured data and to be interpreted physiologically. Moreover, we show that with minimal changes in nominal parameter values the simplified model can be adapted to predict observations from both rats and humans. The use of these methods make the model suitable for estimation of parameters from individuals, allowing it to be adopted for diagnostic procedures. PMID:25050793

Ottesen, Johnny T; Mehlsen, Jesper; Olufsen, Mette S



Text analysis devices, articles of manufacture, and text analysis methods  


Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.

Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C



Critical Discourse Analysis: Discourse Acquisition and Discourse Practices.  

ERIC Educational Resources Information Center

Explores arguments around critical-discourse analysis (CDA) and suggests that neither proponents nor critics of CDA have fully come to terms with the implications of what it means to acquire discourse. (Author/VWL)

Price, Steve



EMQN Best Practice Guidelines for molecular and haematology methods for carrier identification and prenatal diagnosis of the haemoglobinopathies.  


Haemoglobinopathies constitute the commonest recessive monogenic disorders worldwide, and the treatment of affected individuals presents a substantial global disease burden. Carrier identification and prenatal diagnosis represent valuable procedures that identify couples at risk for having affected children, so that they can be offered options to have healthy offspring. Molecular diagnosis facilitates prenatal diagnosis and definitive diagnosis of carriers and patients (especially 'atypical' cases who often have complex genotype interactions). However, the haemoglobin disorders are unique among all genetic diseases in that identification of carriers is preferable by haematological (biochemical) tests rather than DNA analysis. These Best Practice guidelines offer an overview of recommended strategies and methods for carrier identification and prenatal diagnosis of haemoglobinopathies, and emphasize the importance of appropriately applying and interpreting haematological tests in supporting the optimum application and evaluation of globin gene DNA analysis. European Journal of Human Genetics advance online publication, 23 July 2014; doi:10.1038/ejhg.2014.131. PMID:25052315

Traeger-Synodinos, Joanne; Harteveld, Cornelis L; Old, John M; Petrou, Mary; Galanello, Renzo; Giordano, Piero; Angastioniotis, Michael; De la Salle, Barbara; Henderson, Shirley; May, Alison



Finite element methods for integrated aerodynamic heating analysis  

NASA Technical Reports Server (NTRS)

Over the past few years finite element based procedures for the solution of high speed viscous compressible flows were developed. The objective of this research is to build upon the finite element concepts which have already been demonstrated and to develop these ideas to produce a method which is applicable to the solution of large scale practical problems. The problems of interest range from three dimensional full vehicle Euler simulations to local analysis of three-dimensional viscous laminar flow. Transient Euler flow simulations involving moving bodies are also to be included. An important feature of the research is to be the coupling of the flow solution methods with thermal/structural modeling techniques to provide an integrated fluid/thermal/structural modeling capability. The progress made towards achieving these goals during the first twelve month period of the research is presented.

Peraire, J.



Structural sensitivity analysis: Methods, applications, and needs  

NASA Technical Reports Server (NTRS)

Some innovative techniques applicable to sensitivity analysis of discretized structural systems are reviewed. These techniques include a finite-difference step-size selection algorithm, a method for derivatives of iterative solutions, a Green's function technique for derivatives of transient response, a simultaneous calculation of temperatures and their derivatives, derivatives with respect to shape, and derivatives of optimum designs with respect to problem parameters. Computerized implementations of sensitivity analysis and applications of sensitivity derivatives are also discussed. Finally, some of the critical needs in the structural sensitivity area are indicated along with Langley plans for dealing with some of these needs.

Adelman, H. M.; Haftka, R. T.; Camarda, C. J.; Walsh, J. L.
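The finite-difference step-size selection problem mentioned first arises because forward-difference error is the sum of O(h) truncation error and O(eps/h) roundoff error, so the total is minimized near an intermediate step rather than the smallest representable one. A quick numerical illustration of the tradeoff such a selection algorithm navigates:

```python
import math

def forward_diff(f, x, h):
    """One-sided forward-difference derivative estimate."""
    return (f(x + h) - f(x)) / h

# Error = truncation O(h) + roundoff O(eps/h); the total is smallest
# near h ~ sqrt(machine epsilon), which is what an automatic
# step-size selection algorithm searches for.
x = 1.0
exact = math.cos(x)   # d/dx sin(x)
errors = {h: abs(forward_diff(math.sin, x, h) - exact)
          for h in (1e-1, 1e-8, 1e-15)}
# The mid-range step is far more accurate than either extreme:
# too-large h suffers truncation error, too-small h roundoff error.
```

For structural responses obtained from iterative solvers, the "roundoff" floor is set by the solver convergence tolerance rather than machine epsilon, which pushes the optimal step larger.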



Structural sensitivity analysis: Methods, applications and needs  

NASA Technical Reports Server (NTRS)

Innovative techniques applicable to sensitivity analysis of discretized structural systems are reviewed. The techniques include a finite difference step size selection algorithm, a method for derivatives of iterative solutions, a Green's function technique for derivatives of transient response, simultaneous calculation of temperatures and their derivatives, derivatives with respect to shape, and derivatives of optimum designs with respect to problem parameters. Computerized implementations of sensitivity analysis and applications of sensitivity derivatives are also discussed. Some of the critical needs in the structural sensitivity area are indicated along with plans for dealing with some of those needs.

Adelman, H. M.; Haftka, R. T.; Camarda, C. J.; Walsh, J. L.



Spectroscopic Chemical Analysis Methods and Apparatus  

NASA Technical Reports Server (NTRS)

This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and the ideal wavelengths and sources needed for this analysis. It employs deep ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three achieved goals for this innovation are to reduce the size (under 20 L), reduce the weight [under 100 lb (45 kg)], and reduce the power consumption (under 100 W). This method can be used in a microscope or macroscope to provide measurement of Raman and/or native fluorescence emission spectra either by point-by-point measurement, or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations, the method is to produce incoherent radiation, while in other implementations it produces laser radiation. In some variations, this object is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. This instrument irradiates a sample with deep UV radiation, and then uses an improved filter for separating wavelengths to be detected. This provides a multi-stage analysis of the sample. 
To avoid the difficulties related to producing deep UV semiconductor sources, a pumping approach has been developed that uses ballistic electron beam injection directly into the active region of a wide bandgap semiconductor material.

Hug, William F.; Reid, Ray D.



Alignment of patient and primary care practice member perspectives of chronic illness care: a cross-sectional analysis  

PubMed Central

Background Little is known as to whether primary care teams’ perceptions of how well they have implemented the Chronic Care Model (CCM) correspond with their patients’ own experience of chronic illness care. We examined the extent to which practice members’ perceptions of how well they were organized to deliver care consistent with the CCM were associated with their patients’ perceptions of the chronic illness care they received. Methods Analysis of baseline measures from a cluster randomized controlled trial testing a practice facilitation intervention to implement the CCM in small, community-based primary care practices. All practice “members” (i.e., physician providers, non-physician providers, and staff) completed the Assessment of Chronic Illness Care (ACIC) survey, and adult patients with 1 or more chronic illnesses completed the Patient Assessment of Chronic Illness Care (PACIC) questionnaire. Results Two sets of hierarchical linear regression models accounting for nesting of practice members (N = 283) and patients (N = 1,769) within 39 practices assessed the association between practice member perspectives of CCM implementation (ACIC scores) and patients’ perspectives of CCM (PACIC). The ACIC summary score was not significantly associated with the PACIC summary score or with most PACIC subscale scores, but four of the ACIC subscales, including Self-management Support, were (…), suggesting the value of considering both patient and practice member perspectives when evaluating quality of chronic illness care. Trial registration NCT00482768 PMID:24678983



Comparison of analysis methods for airway quantification  

NASA Astrophysics Data System (ADS)

Diseased airways have been recognized for several years as a possible contributing factor to airflow limitation in Chronic Obstructive Pulmonary Disease (COPD). Quantification of disease severity through the evaluation of airway dimensions - wall thickness and lumen diameter - has gained increased attention, thanks to the availability of multi-slice computed tomography (CT). Novel approaches have focused on automated methods of measurement as a faster and more objective means than the visual assessment routinely employed in the clinic. Since the Full-Width Half-Maximum (FWHM) method of airway measurement was introduced two decades ago [1], several new techniques for quantifying airways have been detailed in the literature, but no approach has truly become a standard for such analysis. Our own research group has presented two alternative approaches for determining airway dimensions, one involving a minimum path and the other active contours [2, 3]. With an increasing number of techniques dedicated to the same goal, we decided to take a step back and analyze the differences between these methods. We therefore tested our two methods of analysis against the FWHM approach. We first measured a set of 5 airways from a phantom of known dimensions. Then we compared measurements from the three methods to those of two independent readers, performed on 35 airways in 5 patients. We elaborate on the differences between the approaches and draw conclusions as to which may be considered the best.

Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.
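The FWHM idea referenced above is easy to illustrate on a one-dimensional intensity profile: the width is taken between the two half-maximum crossings, located by linear interpolation. This is an illustrative sketch only (not the authors' implementation); the synthetic Gaussian profile and its sampling are assumptions.

```python
import numpy as np

def fwhm(x, y):
    """Full-Width Half-Maximum of a single-peaked profile y(x),
    using linear interpolation at the two half-maximum crossings."""
    half = (y.max() + y.min()) / 2.0
    idx = np.where(y >= half)[0]          # contiguous for a single peak
    i0, i1 = idx[0], idx[-1]
    # left crossing between samples i0-1 and i0 (y rising there)
    xl = np.interp(half, [y[i0 - 1], y[i0]], [x[i0 - 1], x[i0]])
    # right crossing between samples i1+1 and i1 (y falling there)
    xr = np.interp(half, [y[i1 + 1], y[i1]], [x[i1 + 1], x[i1]])
    return xr - xl

# synthetic airway-wall-like profile: Gaussian with sigma = 1.0,
# whose theoretical FWHM is 2*sqrt(2*ln 2) ~ 2.3548
x = np.linspace(-10.0, 10.0, 2001)
y = np.exp(-x**2 / 2.0)
w = fwhm(x, y)
```

On real CT data the profile would be a resampled attenuation ray through the airway wall rather than an analytic curve, and partial-volume effects are what make FWHM biased for thin walls.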



Measurement methods for human exposure analysis.  

PubMed Central

The general methods used to complete measurements of human exposures are identified and illustrations are provided for the cases of indirect and direct methods used for exposure analysis. The application of the techniques for external measurements of exposure, microenvironmental and personal monitors, is placed in the context of the need to test hypotheses concerning the biological effects of concern. The linkage of external measurements to measurements made in biological fluids is explored for a suite of contaminants. This information is placed in the context of the scientific framework used to conduct exposure assessment. Examples are taken from research on volatile organics and from a large-scale problem: hazardous waste sites. PMID:7635110

Lioy, P J



A review on the decoy-state method for practical quantum key distribution  

E-print Network

We present a review of the historic development of the decoy-state method, including the background, principles, methods, results and development. We also clarify some delicate concepts. Given an imperfect source and a very lossy channel, the photon-number-splitting (PNS) attack can make quantum key distribution (QKD) in practice totally insecure. Given the result of ILM-GLLP, one knows how to distill the secure final key if the fraction of tagged bits is known. The purpose of the decoy-state method is to obtain a tight verification of the fraction of tagged bits. The main idea of the decoy-state method is to change the intensity of the source light: one can verify the fraction of tagged bits at a given intensity by monitoring the counting rates of pulses at the different intensities. Since the counting rates are small quantities, the effect of statistical fluctuation is very important. It has been shown that the 3-state decoy-state method can work in practice even with the fluctuations and other errors.

Xiang-Bin Wang
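The "verification" step can be made concrete with the standard vacuum+weak-decoy lower bound on the single-photon yield, from which the tagged-bit fraction follows. This is the asymptotic textbook formula with no statistical fluctuations; the toy channel model, intensities, and parameter values below are illustrative assumptions, not the review's numbers.

```python
import math

def y1_lower_bound(q_mu, q_nu, y0, mu, nu):
    """Lower bound on the single-photon yield Y1 from the measured gains
    of a signal intensity mu and one weak decoy intensity nu (nu < mu),
    with the vacuum yield y0 known (asymptotic vacuum+weak-decoy bound)."""
    return (mu / (mu * nu - nu**2)) * (
        q_nu * math.exp(nu)
        - q_mu * math.exp(mu) * nu**2 / mu**2
        - (mu**2 - nu**2) / mu**2 * y0
    )

# toy lossy channel: transmittance eta, background (dark-count) yield y0
eta, y0 = 0.01, 1e-5
mu, nu = 0.5, 0.1
gain = lambda a: y0 + 1.0 - math.exp(-eta * a)  # overall gain Q at intensity a
y1_true = y0 + eta                              # yield of genuine single photons
y1_est = y1_lower_bound(gain(mu), gain(nu), y0, mu, nu)
```

For this channel the bound recovers most of the true single-photon yield, which is what makes the key-rate estimate tight despite the PNS threat.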



Digital dream analysis: a revised method.  


This article demonstrates the use of a digital word search method designed to provide greater accuracy, objectivity, and speed in the study of dreams. A revised template of 40 word search categories, built into the website of the Sleep and Dream Database (SDDb), is applied to four "classic" sets of dreams: The male and female "Norm" dreams of Hall and Van de Castle (1966), the "Engine Man" dreams discussed by Hobson (1988), and the "Barb Sanders Baseline 250" dreams examined by Domhoff (2003). A word search analysis of these original dream reports shows that a digital approach can accurately identify many of the same distinctive patterns of content found by previous investigators using much more laborious and time-consuming methods. The results of this study emphasize the compatibility of word search technologies with traditional approaches to dream content analysis. PMID:25286125

Bulkeley, Kelly
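A minimal sketch of the word-search idea follows; the three categories and their word lists are invented stand-ins for the SDDb's 40-category template, and the sample report is fabricated for illustration.

```python
import re
from collections import Counter

# hypothetical mini-template: 3 of the 40 categories, each a set of search words
CATEGORIES = {
    "emotion": {"afraid", "angry", "happy", "sad"},
    "family": {"mother", "father", "sister", "brother"},
    "movement": {"run", "running", "fall", "falling", "fly", "flying"},
}

def score_report(text, categories=CATEGORIES):
    """Count, per category, how many words of a dream report match the
    category's word list (case-insensitive, whole words only)."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for w in words:
        for cat, vocab in categories.items():
            if w in vocab:
                counts[cat] += 1
    return counts

report = "I was running from my angry brother and then falling, falling."
c = score_report(report)
```

Scaling this to a full template is just a larger dictionary plus per-report normalization (e.g., matches per 100 words), which is what makes digital counts comparable across dream sets of different lengths.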



A method for communication analysis in prosthodontics.  


Particularly in prosthodontics, in which issues of esthetic preferences and possibilities are abundant, improved knowledge about dentist-patient communication during clinical encounters is important. Because previous studies on communication used different methods and patient materials, the results are difficult to evaluate; there is, therefore, a need for methodologic development. One method that makes it possible to quantitatively describe different interaction behaviors during clinical encounters is the Roter Method of Interaction Process Analysis (RIAS). Since the method was developed in the USA for use in the medical context, a translation of the method into Swedish and a modification of the categories for use in prosthodontics were necessary. The revised manual was used to code 10 audio recordings of dentist-patient encounters at a specialist clinic for prosthodontics. No major alterations of the RIAS manual were made during the translation and modification. The study shows that it is possible to distinguish patterns of communication in audio-recorded dentist-patient encounters. The method also made the identification of different interaction profiles possible; these profiles distinguished well among the audio-recorded encounters. The coding procedures were tested for intra-rater reliability, which was 97% for utterance classification and lambda = 0.76 for category definition. It was concluded that the revised RIAS method is applicable to communication studies in prosthodontics. PMID:9537735

Sondell, K; Söderfeldt, B; Palmqvist, S



A Simplified Mesh Deformation Method Using Commercial Structural Analysis Software  

NASA Technical Reports Server (NTRS)

Mesh deformation in response to redefined or moving aerodynamic surface geometries is a frequently encountered task in many applications. Most existing methods are either mathematically too complex or computationally too expensive for use in practical design and optimization. We propose a simplified mesh deformation method based on linear elastic finite element analyses that can be easily implemented with commercially available structural analysis software. Using a prescribed displacement at the mesh boundaries, a simple structural analysis is constructed based on a spatially varying Young's modulus to move the entire mesh in accordance with the surface geometry redefinitions. A variety of surface movements, such as translation, rotation, or the incremental surface reshaping that often takes place in an optimization procedure, may be handled by the present method. We describe the numerical formulation and implementation using the NASTRAN software in this paper. The use of commercial software bypasses tedious reimplementation and takes advantage of the computational efficiency offered by the vendor. A two-dimensional airfoil mesh and a three-dimensional aircraft mesh were used as test cases to demonstrate the effectiveness of the proposed method. Euler and Navier-Stokes calculations were performed for the deformed two-dimensional meshes.

Hsu, Su-Yuen; Chang, Chau-Lyan; Samareh, Jamshid
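A one-dimensional toy version conveys the idea: a spring chain stands in for the finite-element model, and a spatially varying "Young's modulus" stiffens small cells so they keep their shape near the wall. The choice E_i = 1/h_i below is one plausible stiffening rule for illustration, not necessarily the paper's.

```python
import numpy as np

def deform_mesh_1d(x, u_left, u_right=0.0):
    """Move interior nodes of a 1-D mesh by solving a linear elastic
    spring-chain problem: element stiffness k_i = E_i/h_i with E_i = 1/h_i,
    so small near-wall cells are stiff and absorb little of the motion.
    Boundary displacements are prescribed (Dirichlet conditions)."""
    h = np.diff(x)
    k = 1.0 / h**2                      # E_i/h_i with E_i = 1/h_i
    n = len(x)
    K = np.zeros((n, n))
    for i, ki in enumerate(k):          # assemble tridiagonal stiffness matrix
        K[i, i] += ki; K[i + 1, i + 1] += ki
        K[i, i + 1] -= ki; K[i + 1, i] -= ki
    f = np.zeros(n)
    K[0, :] = 0.0; K[0, 0] = 1.0; f[0] = u_left      # BC by row replacement
    K[-1, :] = 0.0; K[-1, -1] = 1.0; f[-1] = u_right
    u = np.linalg.solve(K, f)
    return x + u

# wall-normal stretched mesh; the wall at x=0 moves outward by 0.05
x = np.array([0.0, 0.01, 0.03, 0.07, 0.15, 0.31, 0.63, 1.0])
xnew = deform_mesh_1d(x, u_left=0.05)
```

The displacement decays away from the moved boundary, and no cell inverts; in the paper's setting the same linear solve is simply delegated to NASTRAN in 2-D/3-D.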



A Practical Blended Analysis for Dynamic Features in JavaScript  

E-print Network

A Practical Blended Analysis for Dynamic Features in JavaScript. Shiyi Wei and Barbara G. Ryder, Department of Computer Science, Virginia Tech. Abstract: JavaScript is widely used in Web applications; however, its dynamism renders static analysis ineffective. Our JavaScript Blended …

Ryder, Barbara G.


Practical Static Analysis of JavaScript Applications in the Presence of Frameworks and Libraries  

E-print Network

Practical Static Analysis of JavaScript Applications in the Presence of Frameworks and Libraries. Microsoft Corporation; Microsoft Research Technical Report MSR-TR-2012-66. Abstract: JavaScript is a language … system. Analysis of JavaScript has long been known to be challenging due to the language's dynamic nature …

Livshits, Ben


Practical Static Analysis of JavaScript Applications in the Presence of Frameworks and Libraries  

E-print Network

Practical Static Analysis of JavaScript Applications in the Presence of Frameworks and Libraries. Microsoft Corporation, USA. ABSTRACT: JavaScript is a language that is widely used for both web-based and standalone applications, such as those in the Windows 8 operating system. Analysis of JavaScript has long …

Livshits, Ben


New methods of radar performances analysis  

Microsoft Academic Search

Original methods for radar detection performance analysis are derived for a fluctuating or non-fluctuating target embedded in additive and a priori unknown noise. This kind of noise can be, for example, the sea or ground clutter encountered in surface-based radar for the detection of low grazing angle targets and/or in high-resolution radar. In these cases, the spiky clutter tends to …

Emmanuelle Jay; Jean Philippe Ovarlez; Patrick Duvaut



Stirling Analysis Comparison of Commercial vs. High-Order Methods  

NASA Technical Reports Server (NTRS)

Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.

Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako
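For reference, the compact scheme mentioned above can be sketched on a periodic grid with the generic 4th-order Padé formulation from the textbook literature; this is not the code compared in the paper, and the grid and test function are assumptions.

```python
import numpy as np

def compact_deriv_periodic(f, h):
    """4th-order Padé compact first derivative on a periodic grid:
    (1/4) f'_{i-1} + f'_i + (1/4) f'_{i+1} = 3/(4h) * (f_{i+1} - f_{i-1}).
    The implicit tridiagonal-with-corners system is solved densely here
    for clarity; production codes use a cyclic tridiagonal solver."""
    n = len(f)
    A = np.eye(n)
    rhs = np.zeros(n)
    for i in range(n):
        A[i, (i - 1) % n] = 0.25
        A[i, (i + 1) % n] = 0.25
        rhs[i] = 3.0 / (4.0 * h) * (f[(i + 1) % n] - f[(i - 1) % n])
    return np.linalg.solve(A, rhs)

n = 64
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
h = x[1] - x[0]
df = compact_deriv_periodic(np.sin(x), h)   # should approximate cos(x)
err = np.max(np.abs(df - np.cos(x)))
```

The implicit coupling is what gives compact schemes their spectral-like resolution at a given stencil width, the property at stake in the commercial-versus-high-order comparison.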




Permutation Testing Made Practical for Functional Magnetic Resonance Image Analysis  

Microsoft Academic Search

We describe an efficient algorithm for the step-down permutation test, applied to the analysis of functional magnetic resonance images. The algorithm's time bound is nearly linear, making it feasible as an interactive tool. Results of the permutation test algorithm applied to data from a cognitive activation paradigm are compared with those of a standard parametric test corrected for multiple comparisons.

Matthew Belmonte; Deborah Yurgelun-todd
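The permutation idea can be sketched with the max-statistic variant (a single-step correction rather than the paper's step-down algorithm, which iterates this procedure over successively smaller voxel sets); the data shapes, sign-flipping null, and toy activation below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def maxT_pvalues(diffs, n_perm=2000, rng=rng):
    """Family-wise-error-corrected p-values via the max-statistic
    permutation test: sign-flip paired differences (subjects x voxels)
    and record, per permutation, the maximum |mean| over voxels."""
    n_sub, n_vox = diffs.shape
    obs = np.abs(diffs.mean(axis=0))
    null_max = np.empty(n_perm)
    for p in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=(n_sub, 1))
        null_max[p] = np.abs((signs * diffs).mean(axis=0)).max()
    # corrected p: fraction of permutations whose max beats each voxel
    return (1 + (null_max[None, :] >= obs[:, None]).sum(axis=1)) / (1 + n_perm)

# toy data: 12 subjects, 50 voxels, true activation only in voxel 0
d = rng.normal(0.0, 1.0, size=(12, 50))
d[:, 0] += 2.0
pvals = maxT_pvalues(d)
```

The step-down refinement removes already-significant voxels from the max and repeats, gaining power for secondary activations; the nearly linear time bound of the paper comes from organizing exactly this recomputation efficiently.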



An Analysis of Ethical Considerations in Programme Design Practice  

ERIC Educational Resources Information Center

Ethical considerations are inherent to programme design decision-making, but not normally explicit. Nonetheless, they influence whose interests are served in a programme and who benefits from it. This paper presents an analysis of ethical considerations made by programme design practitioners in the context of a polytechnic in Aotearoa/New Zealand.…

Govers, Elly



Suspension, Race, and Disability: Analysis of Statewide Practices and Reporting  

ERIC Educational Resources Information Center

This analysis of statewide suspension data from 1995 to 2003 in Maryland investigated disproportionate suspensions of minority students and students with disabilities. We found substantial increases in overall rates of suspensions from 1995 to 2003, as well as disproportionate rates of suspensions for African American students, American Indian…

Krezmien, Michael P.; Leone, Peter E.; Achilles, Georgianna M.



Digital Data Collection and Analysis: Application for Clinical Practice  

ERIC Educational Resources Information Center

Technology for digital speech recording and speech analysis is now readily available for all clinicians who use a computer. This article discusses some advantages of moving from analog to digital recordings and outlines basic recording procedures. The purpose of this article is to familiarize speech-language pathologists with computerized audio…

Ingram, Kelly; Bunta, Ferenc; Ingram, David



Newborn Hearing Screening: An Analysis of Current Practices  

ERIC Educational Resources Information Center

State coordinators of early hearing detection and intervention (EHDI) programs completed a strengths, weaknesses, opportunities, and threats, or SWOT, analysis that consisted of 12 evaluative areas of EHDI programs. For the newborn hearing screening area, a total of 293 items were listed by 49 EHDI coordinators, and themes were identified within…

Houston, K. Todd; Bradham, Tamala S.; Munoz, Karen F.; Guignard, Gayla Hutsell



Spectroscopic chemical analysis methods and apparatus  

NASA Technical Reports Server (NTRS)

Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)



Deriving a practical analytical-probabilistic method to size flood routing reservoirs  

NASA Astrophysics Data System (ADS)

In engineering practice, routing reservoir sizing is commonly performed using the design storm method, although its effectiveness has been debated for a long time. Conversely, continuous simulations and direct statistical analyses of recorded hydrographs are considered more reliable and comprehensive, but they are complex or seldom practicable. In this paper a handier tool is provided by the analytical-probabilistic approach to construct probability functions of peak discharges issuing from natural watersheds or routed through on-line and off-line reservoirs. A simplified routing scheme and a rainfall-runoff model based on a few essential hydrological parameters were implemented. To validate the proposed design methodology, on-line and off-line routing reservoirs were first sized by means of a conventional design storm method for a test watershed located in northern Italy. Their routing efficiencies were then estimated by both the analytical-probabilistic models and benchmark continuous simulations. Bearing practical design purposes in mind, the adopted models showed satisfactory consistency.

Balistrocchi, Matteo; Grossi, Giovanna; Bacchi, Baldassare
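A minimal level-pool routing sketch conveys what a "simplified routing scheme" does to a flood peak; the linear storage-discharge law, explicit Euler integration, triangular inflow, and parameter values are assumptions for illustration, not the paper's scheme.

```python
import numpy as np

def route_linear_reservoir(inflow, dt, k):
    """Level-pool routing through an on-line reservoir with a linear
    stage-discharge law Q_out = S/k: integrate dS/dt = Q_in - S/k
    with an explicit Euler step (stable here since dt << k)."""
    s, out = 0.0, []
    for q in inflow:
        s += dt * (q - s / k)
        out.append(s / k)
    return np.array(out)

dt, k = 60.0, 3600.0                       # 1-min step, 1-h reservoir constant
t = np.arange(0.0, 6 * 3600.0, dt)
# triangular inflow hydrograph: 2-h base, 10 m^3/s peak at t = 1 h
inflow = np.maximum(0.0, 10.0 * (1.0 - np.abs(t - 3600.0) / 3600.0))
outflow = route_linear_reservoir(inflow, dt, k)
attenuation = 1.0 - outflow.max() / inflow.max()   # routed peak lower, delayed
</```

The analytical-probabilistic approach replaces many such simulations with closed-form probability functions of the routed peak, which is where the practical saving comes from.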



Are parents' knowledge and practice regarding immunization related to pediatrics’ immunization compliance? a mixed method study  

PubMed Central

Background Immunization rate is one of the best public health outcome and service indicators of the last 100 years. Parental decisions regarding immunization are very important to improving the immunization rate. The aim of this study was to evaluate the correlation between parental knowledge-practices (KP) and children's immunization completeness. Methods A mixed method was utilized in this study: a retrospective cohort study was used to evaluate immunization completeness, and a prospective cross-sectional study was used to evaluate the immunization KP of parents. 528 children born between 1 January 2003 and 31 June 2008 were randomly selected from five public health clinics in Mosul, Iraq. The immunization history of each child was collected retrospectively from their immunization record/card. Results About half of the studied children (n = 286, 56.3%) were immunized with all vaccination doses; these children were considered as having had complete immunization. 66.1% of the parents were found to have adequate KP scores. A significant association of immunization completeness with total KP groups was found (…). The study results reinforce recommendations for the periodic assessment of immunization rates and the use of educational programmes to improve the immunization rate, knowledge and practice. PMID:24460878



Optical methods for the analysis of dermatopharmacokinetics  

NASA Astrophysics Data System (ADS)

The method of tape stripping in combination with spectroscopic measurements is a simple and noninvasive method for the analysis of the dermatopharmacokinetics of cosmetic products and topically applied drugs. The absorbance at 430 nm was used to characterize the amount of corneocytes on the tape strips. It was compared to the increase in weight of the tapes after removing them from the skin surface. The penetration profiles of two UV filter substances used in sunscreens were determined. The combined method of tape stripping and spectroscopic measurements can also be used for the investigation of the dermatopharmacokinetics of topically applied drugs passing through the skin. Differences in the penetration profiles of the steroid compound clobetasol, applied at the same concentration in different formulations on the skin, are presented.

Lademann, Juergen; Weigmann, Hans-Juergen; von Pelchrzim, R.; Sterry, Wolfram



Review of Computational Stirling Analysis Methods  

NASA Technical Reports Server (NTRS)

Nuclear thermal to electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in its current designs could be better understood. However, they are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of Stirling loss understanding may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra Hi-Fi technique, is presented in detail.

Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.



Model Checking in Practice: Analysis of Generic Bootloader Using SPIN  

Microsoft Academic Search

This work presents a case study of the use of model checking for analyzing an industrial software product, the Generic Bootloader. Analysis of the software has been carried out using the automated verification system SPIN. A model of the software has been developed using the specification language PROMELA, and the properties expressed in LTL have been verified against the model.

Kuntal Das Barman; Debapriyay Mukhopadhyay



Practical guidance for statistical analysis of operational event data  

SciTech Connect

This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.

Atwood, C.L.
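One standard ingredient of such analyses is an interval estimate for a Poisson occurrence rate from an event count and an exposure time. A sketch using the classical chi-square relation follows (a textbook formula, not necessarily the report's exact procedure; the event count and exposure below are invented).

```python
from scipy.stats import chi2

def rate_ci(n_events, exposure_time, conf=0.90):
    """Point estimate and two-sided confidence interval for a Poisson
    occurrence rate, via the chi-square relation:
      lower = chi2_{alpha/2, 2n} / (2T),  upper = chi2_{1-alpha/2, 2(n+1)} / (2T)."""
    alpha = 1.0 - conf
    lo = chi2.ppf(alpha / 2, 2 * n_events) / (2 * exposure_time) if n_events > 0 else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * (n_events + 1)) / (2 * exposure_time)
    return n_events / exposure_time, (lo, hi)

# e.g. 5 failures observed in 1000 component-hours (hypothetical data)
rate, (lo, hi) = rate_ci(n_events=5, exposure_time=1000.0)
```

The asymmetric interval (wide on the high side for few events) is exactly the behavior the report's guidance warns is lost when a normal approximation is applied to sparse operational data.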



Influence of Analysis Methods on Interpretation of Hazard Maps  

PubMed Central

Exposure or hazard mapping is becoming increasingly popular among industrial hygienists. Direct-reading instruments used for hazard-mapping data collection are steadily increasing in reliability and portability while decreasing in cost. Exposure measurements made with these instruments generally require no laboratory analysis, although hazard mapping can be a time-consuming process. To inform decision making by industrial hygienists and management, it is crucial that the maps generated from mapping data are as accurate and representative as possible. Currently, it is unclear how many sampling locations are necessary to produce a representative hazard map. As such, researchers typically collect as many points as can be sampled in several hours, and interpolation methods are used to produce higher resolution maps. We have reanalyzed hazard-mapping data sets from three industrial settings to determine which interpolation methods yield the most accurate results. The goal is to provide practicing industrial hygienists with some practical guidelines to generate accurate hazard maps with ‘off-the-shelf’ mapping software. Visually verifying the fit of the variogram model is crucial for accurate interpolation. Exponential and spherical variogram models performed better than Gaussian models. It was also necessary to diverge from some of the default interpolation parameters, such as the number of bins used for the experimental variogram and whether or not to allow for a nugget effect, to achieve reasonable accuracy of the interpolation for some data sets. PMID:23258453

Koehler, Kirsten A.
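The experimental-variogram binning discussed above can be sketched directly; the synthetic field, bin count, and the exponential model's parameterization below are illustrative assumptions, not the study's data or software defaults.

```python
import numpy as np

def experimental_semivariogram(xy, z, n_bins=10):
    """Isotropic experimental semivariogram: bin all pairwise distances
    and average 0.5*(z_i - z_j)^2 within each distance bin."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    g = 0.5 * (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(z), k=1)          # each pair once
    d, g = d[iu], g[iu]
    edges = np.linspace(0.0, d.max(), n_bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    gamma = np.array([g[(d >= lo) & (d < hi)].mean()
                      if ((d >= lo) & (d < hi)).any() else np.nan
                      for lo, hi in zip(edges[:-1], edges[1:])])
    return centers, gamma

def exponential_model(h, nugget, sill, rng_):
    """gamma(h) = nugget + (sill - nugget) * (1 - exp(-3h / range))."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng_))

rng = np.random.default_rng(1)
xy = rng.uniform(0.0, 100.0, size=(200, 2))           # sampling locations
z = np.sin(xy[:, 0] / 15.0) + 0.1 * rng.normal(size=200)  # correlated field
h, gamma = experimental_semivariogram(xy, z, n_bins=12)
model = exponential_model(h, 0.01, 0.55, 40.0)        # eyeballed fit for comparison
```

Changing `n_bins` visibly reshapes the experimental points the model is fitted to, which is the default-parameter sensitivity the study highlights.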



Standard practices for dissolving glass containing radioactive and mixed waste for chemical and radiochemical analysis  

E-print Network

1.1 These practices cover techniques suitable for dissolving glass samples that may contain nuclear wastes. These techniques used together or independently will produce solutions that can be analyzed by inductively coupled plasma atomic emission spectroscopy (ICP-AES), inductively coupled plasma mass spectrometry (ICP-MS), atomic absorption spectrometry (AAS), radiochemical methods and wet chemical techniques for major components, minor components and radionuclides. 1.2 One of the fusion practices and the microwave practice can be used in hot cells and shielded hoods after modification to meet local operational requirements. 1.3 The user of these practices must follow radiation protection guidelines in place for their specific laboratories. 1.4 Additional information relating to safety is included in the text. 1.5 The dissolution techniques described in these practices can be used for quality control of the feed materials and the product of plants vitrifying nuclear waste materials in glass. 1.6 These pr...

American Society for Testing and Materials. Philadelphia



Measuring Racial/Ethnic Disparities in Health Care: Methods and Practical Issues  

PubMed Central

Objective To review methods of measuring racial/ethnic health care disparities. Study Design Identification and tracking of racial/ethnic disparities in health care will be advanced by application of a consistent definition and reliable empirical methods. We have proposed a definition of racial/ethnic health care disparities based in the Institute of Medicine's (IOM) Unequal Treatment report, which defines disparities as all differences except those due to clinical need and preferences. After briefly summarizing the strengths and critiques of this definition, we review methods that have been used to implement it. We discuss practical issues that arise during implementation and expand these methods to identify sources of disparities. We also situate the focus on methods to measure racial/ethnic health care disparities (an endeavor predominant in the United States) within a larger international literature in health outcomes and health care inequality. Empirical Application We compare different methods of implementing the IOM definition on measurement of disparities in any use of mental health care and mental health care expenditures using the 2004–2008 Medical Expenditure Panel Survey. Conclusion Disparities analysts should be aware of multiple methods available to measure disparities and their differing assumptions. We prefer a method concordant with the IOM definition. PMID:22353147

Cook, Benjamin Lê; McGuire, Thomas G; Zaslavsky, Alan M



A situated practice of ethics for participatory visual and digital methods in public health research and practice: a focus on digital storytelling.  


This article explores ethical considerations related to participatory visual and digital methods for public health research and practice, through the lens of an approach known as "digital storytelling." We begin by briefly describing the digital storytelling process and its applications to public health research and practice. Next, we explore 6 common challenges: fuzzy boundaries, recruitment and consent to participate, power of shaping, representation and harm, confidentiality, and release of materials. We discuss their complexities and offer some considerations for ethical practice. We hope this article serves as a catalyst for expanded dialogue about the need for high standards of integrity and a situated practice of ethics wherein researchers and practitioners reflexively consider ethical decision-making as part of the ongoing work of public health. PMID:23948015

Gubrium, Aline C; Hill, Amy L; Flicker, Sarah



Good modeling practice for PAT applications: propagation of input uncertainty and sensitivity analysis.  


Uncertainty and sensitivity analysis are evaluated for their usefulness as part of model-building within Process Analytical Technology (PAT) applications. A mechanistic model describing a batch cultivation of Streptomyces coelicolor for antibiotic production was used as the case study. The input uncertainty resulting from assumptions of the model was propagated using the Monte Carlo procedure to estimate the output uncertainty. The results showed that significant uncertainty exists in the model outputs. Moreover, the uncertainty in the biomass, glucose, ammonium and base-consumption predictions was found to be low compared to the large uncertainty observed in the antibiotic and off-gas CO(2) predictions. The output uncertainty was observed to be lower during the exponential growth phase and higher in the stationary and death phases, meaning the model describes some periods better than others. To understand which input parameters are responsible for the output uncertainty, three sensitivity methods (Standardized Regression Coefficients, Morris and differential analysis) were evaluated and compared. The results from these methods were mostly in agreement with each other and revealed that only a few parameters (about 10) out of a total of 56 were mainly responsible for the output uncertainty. Among these significant parameters are parameters related to fermentation characteristics such as biomass metabolism, chemical equilibria and mass transfer. Overall, uncertainty and sensitivity analysis are found promising for helping to build reliable mechanistic models and to interpret the model outputs properly. These tools are part of good modeling practice, which can contribute to successful PAT applications for increased process understanding, operation and control. PMID:19569187

Sin, Gürkan; Gernaey, Krist V; Lantz, Anna Eliasson



A Soft Computing-Based Method for the Identification of Best Practices, With Application in the Petroleum Industry  

E-print Network

Introduces a novel methodology for fully data-driven identification and analysis of best practices based on soft computing techniques. Using this new methodology, "best practices" in any operation (industrial or otherwise) can be identified. CONCLUSIONS: A new methodology has been developed and introduced for the identification of best practices.

Mohaghegh, Shahab


Life-cycle cost analysis of design practices for RC framed structures  

Microsoft Academic Search

The objective of this study is to perform life-cycle cost analysis on three design practices, namely weak ground storey, short and floating columns, and their combinations. Life-cycle cost analysis is recognized as the only suitable tool for assessing the structural performance when the structure is expected to be functional for a long period of time. Life-cycle cost analysis is considered

Nikos D. Lagaros



Sensitivity Analysis of Parallel Manipulators using an Interval Linearization Method  

E-print Network

An interval linearization method for the sensitivity analysis of manipulators to variations in their geometric parameters is presented; the interval linearization method automatically detects such situations. Keywords: parallel manipulators, sensitivity

Paris-Sud XI, Université de


²⁵²Cf-source-driven neutron noise analysis method  

SciTech Connect

The ²⁵²Cf-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor k_eff has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank which is typical of a fuel processing or reprocessing plant, the k_eff values were satisfactorily determined for values between 0.92 and 0.5 using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. The further development of the method will require experiments oriented toward particular applications, including dynamic experiments, and the development of theoretical methods to predict the experimental observables.

Mihalczo, J.T.; King, W.T.; Blakeman, E.D.



A method for polyvalent analysis of preventive motivations.  


The article presents a method for polyvalent analysis of preventive motivations (MPAPM). The method is applied by using the survey technique with a questionnaire specially designed for the purpose. Four types of data are gathered and analyzed: social-adaptive indicators, health status, attitude toward prevention and actual choice. MPAPM provides the opportunity to achieve fundamental scientific results in the study of the motivations underlying risk behaviour. Another application of the method is the detailed study of the theoretical and practical aspects of preventive motivations, both in the community and on an individual level, for evaluating the eventual success rate of a particular individual attending preventive programs. It is possible to analyze the potentials of prevention compared to alternative methods of improving one's health status. By juxtaposing conscious motives and actual behaviour, MPAPM enables the researcher to analyze the role of the unconscious aspects of preventive and risk behaviour. MPAPM employs new types of concepts which define the adaptive aspects of disease, health, social status and prevention. In its essence this is a method for marketing study of the polyvalent nature of the need for prevention (combined and concurrent interaction of social, psychological, epidemiological and biological factors). PMID:9575654

Sarov, G



The uniform asymptotic swallowtail approximation - Practical methods for oscillating integrals with four coalescing saddle points  

NASA Technical Reports Server (NTRS)

Methods that can be used in the numerical implementation of the uniform swallowtail approximation are described. An explicit expression for that approximation is presented to the lowest order, showing that there are three problems which must be overcome in practice before the approximation can be applied to any given problem. It is shown that a recently developed quadrature method can be used for the accurate numerical evaluation of the swallowtail canonical integral and its partial derivatives. Isometric plots of these are presented to illustrate some of their properties. The problem of obtaining the arguments of the swallowtail integral from an analytical function of its argument is considered, and two methods of solving this problem are described. The asymptotic evaluation of the butterfly canonical integral is also addressed.

Connor, J. N. L.; Curtis, P. R.; Farrelly, D.



Degradation of learned skills. Effectiveness of practice methods on simulated space flight skill retention  

NASA Technical Reports Server (NTRS)

Manual flight control and emergency procedure task skill degradation was evaluated after time intervals of from 1 to 6 months. The tasks were associated with a simulated launch through the orbit insertion flight phase of a space vehicle. The results showed that acceptable flight control performance was retained for 2 months, rapidly deteriorating thereafter by a factor of 1.7 to 3.1 depending on the performance measure used. Procedural task performance showed unacceptable degradation after only 1 month, and the degradation exceeded an order of magnitude after 4 months. The effectiveness of static rehearsal (checklists and briefings) and dynamic warmup (simulator practice) retraining methods was compared for the two tasks. Static rehearsal effectively countered procedural skill degradation, while some combination of dynamic warmup appeared necessary for flight control skill retention. It was apparent that these differences between methods were not solely a function of task type or retraining method, but were a function of the performance measures used for each task.

Sitterley, T. E.; Berge, W. A.



A high-efficiency aerothermoelastic analysis method  

NASA Astrophysics Data System (ADS)

In this paper, a high-efficiency aerothermoelastic analysis method based on unified hypersonic lifting surface theory is established. The method adopts a two-way coupling form that couples the structure, aerodynamic force, aerodynamic heating, and heat conduction. The aerodynamic force is first calculated based on unified hypersonic lifting surface theory, and then the Eckert reference temperature method is used to solve the temperature field, where the transient heat conduction is solved using Fourier's law, and the modal method is used for the aeroelastic correction. Finally, flutter is analyzed based on the p-k method. The aerothermoelastic behavior of a typical hypersonic low-aspect-ratio wing is then analyzed, and the results indicate the following: (1) the combined effects of the aerodynamic load and thermal load both deform the wing, and this deformation increases with the flexibility, size, and flight time of the hypersonic aircraft; (2) the effect of heat accumulation should be noted, and therefore, the trajectory parameters should be considered in the design of hypersonic flight vehicles to avoid hazardous conditions, such as flutter.
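The Eckert reference temperature mentioned in this abstract is, in its classic textbook form, T* = Te + 0.5(Tw − Te) + 0.22(Taw − Te), with the adiabatic wall temperature obtained from a recovery factor. A minimal sketch of that textbook form follows; the default Prandtl number, recovery-factor choices, and sample flight conditions are illustrative assumptions, not values from the paper:

```python
import math

def eckert_reference_temperature(T_e, T_w, M_e, gamma=1.4, Pr=0.72, laminar=True):
    """Classic Eckert reference-temperature estimate for compressible
    boundary-layer heating (textbook constants, not the paper's model).
    T_e: edge static temperature [K]; T_w: wall temperature [K];
    M_e: edge Mach number."""
    # Recovery factor: sqrt(Pr) for laminar flow, Pr^(1/3) for turbulent.
    r = math.sqrt(Pr) if laminar else Pr ** (1.0 / 3.0)
    # Adiabatic wall (recovery) temperature.
    T_aw = T_e * (1.0 + r * 0.5 * (gamma - 1.0) * M_e ** 2)
    # Eckert reference temperature.
    T_star = T_e + 0.5 * (T_w - T_e) + 0.22 * (T_aw - T_e)
    return T_star, T_aw
```

For example, at an (assumed) Mach 6 condition with a 220 K edge temperature and an 800 K wall, the reference temperature falls between the edge and adiabatic-wall temperatures, which is where the film properties for the heating calculation are then evaluated.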

Wan, ZhiQiang; Wang, YaoKun; Liu, YunZhen; Yang, Chao



How equity is addressed in clinical practice guidelines: a content analysis  

PubMed Central

Objectives Incorporating equity into guidelines presents methodological challenges. This study aims to qualitatively synthesise the methods for incorporating equity in clinical practice guidelines (CPGs). Setting Content analysis of methodological publications. Eligibility criteria for selecting studies Methodological publications were included if they provided checklists/frameworks on when, how and to what extent equity should be incorporated in CPGs. Data sources We electronically searched MEDLINE, retrieved references, and browsed guideline development organisation websites from inception to January 2013. After study selection by two authors, general characteristics and checklist items/framework components were extracted from included studies. Based on the questions or items from checklists/frameworks (unit of analysis), content analysis was conducted to identify themes, and questions/items were grouped into these themes. Primary outcomes The primary outcomes were methodological themes and processes on how to address equity issues in guideline development. Results 8 studies with 10 publications were included from 3405 citations. In total, a list of 87 questions/items was generated from 17 checklists/frameworks. After content analysis, questions were grouped into eight themes (‘scoping questions’, ‘searching relevant evidence’, ‘appraising evidence and recommendations’, ‘formulating recommendations’, ‘monitoring implementation’, ‘providing a flow chart to include equity in CPGs’, and ‘others: reporting of guidelines and comments from stakeholders’ for CPG developers, and ‘assessing the quality of CPGs’ for CPG users). Four included studies covered more than five of these themes. We also summarised the process of guideline development based on the themes mentioned above. 
Conclusions For disadvantaged population-specific CPGs, eight important methodological issues identified in this review should be considered when including equity in CPGs under the guidance of a scientific guideline development manual. PMID:25479795

Shi, Chunhu; Tian, Jinhui; Wang, Quan; Petkovic, Jennifer; Ren, Dan; Yang, Kehu; Yang, Yang



Primary prevention in general practice – views of German general practitioners: a mixed-methods study  

PubMed Central

Background Policy efforts focus on a reorientation of health care systems towards primary prevention. To guide such efforts, we analyzed the role of primary prevention in general practice and general practitioners’ (GPs) attitudes toward primary prevention. Methods Mixed-method study including a cross-sectional survey of all community-based GPs and focus groups in a sample of GPs who collaborated with the Institute of General Practice in Berlin, Germany in 2011. Of 1168 GPs, 474 returned the mail survey. Fifteen GPs participated in focus group discussions. Survey and interview guidelines were developed and tested to assess and discuss beliefs, attitudes, and practices regarding primary prevention. Results Most respondents considered primary prevention within their realm of responsibility (70%). Primary prevention, especially physical activity, healthy eating, and smoking cessation, was part of the GPs’ health care recommendations if they thought it was indicated. Still, a quarter of survey respondents discussed reduction of alcohol consumption with their patients infrequently even when they thought it was indicated. Similarly, 18% claimed that they discuss smoking cessation only sometimes. The focus groups revealed that GPs were concerned about the detrimental effects an uninvited health behavior suggestion could have on patients and were hesitant to take on the role of “health policing”. GPs saw primary prevention as the responsibility of multiple actors in a network of societal and municipal institutions. Conclusions The mixed-method study showed that primary prevention approaches such as lifestyle counseling are not well established in primary care. GPs used a selective approach to offer preventive advice based upon indication. GPs had a strong sense that a universal prevention approach carried the potential to destroy a good patient-physician relationship. Other approaches to public health may be warranted, such as a multisectoral approach to population health. 
This type of restructuring of the health care sector may benefit patients who are unable to afford specific prevention programmes and who have competing demands that hinder their ability to focus on behavior change. PMID:24885100



Best Practices for Finite Element Analysis of Spent Nuclear Fuel Transfer, Storage, and Transportation Systems  

SciTech Connect

Storage casks and transportation packages for spent nuclear fuel (SNF) are designed to confine SNF in sealed canisters or casks, provide structural integrity during accidents, and remove decay heat through a storage or transportation overpack. The transfer, storage, and transportation of SNF in dry storage casks and transport packages are regulated under 10 CFR Part 72 and 10 CFR Part 71, respectively. Finite Element Analysis (FEA) is used with increasing frequency in Safety Analysis Reports and other regulatory technical evaluations related to SNF casks and packages and their associated systems. Advances in computing power have made increasingly sophisticated FEA models more feasible, and as a result, the need for careful review of such models has also increased. This paper identifies best practice recommendations that stem from recent NRC review experience. The scope covers issues common to all commercially available FEA software, and the recommendations are applicable to any FEA software package. Three specific topics are addressed: general FEA practices, issues specific to thermal analyses, and issues specific to structural analyses. General FEA practices cover appropriate documentation of the model and results, which is important for an efficient review process. The thermal analysis best practices are related to cask analysis for steady state conditions and transient scenarios. The structural analysis best practices are related to the analysis of casks and associated payload during standard handling and drop scenarios. The best practices described in this paper are intended to identify FEA modeling issues and provide insights that can help minimize associated uncertainties and errors, in order to facilitate the NRC licensing review process.

Bajwa, Christopher S.; Piotter, Jason; Cuta, Judith M.; Adkins, Harold E.; Klymyshyn, Nicholas A.; Fort, James A.; Suffield, Sarah R.



International Commercial Remote Sensing Practices and Policies: A Comparative Analysis  

NASA Astrophysics Data System (ADS)

In recent years, there has been much discussion about U.S. commercial remote sensing policies and how effectively they address U.S. national security, foreign policy, commercial, and public interests. This paper will provide an overview of U.S. commercial remote sensing laws, regulations, and policies, and describe recent NOAA initiatives. It will also address related foreign practices, and the overall legal context for trade and investment in this critical industry. 
Licensing and Regulation: The 1992 Land Remote Sensing Policy Act ("the Act"), and the 1994 policy on Foreign Access to Remote Sensing Space Capabilities (known as Presidential Decision Directive-23, or PDD-23) put into place an ambitious legal and policy framework for the U.S. Government's licensing of privately-owned, high-resolution satellite systems. Under the Act, the Secretary of Commerce licenses the operations of private U.S. remote sensing satellite systems, in consultation with the Secretaries of Defense, State, and Interior. PDD-23 provided further details concerning the operation of advanced systems, as well as criteria for the export of turnkey systems and/or components. In July 2000, pursuant to the authority delegated to it by the Secretary of Commerce, NOAA issued new regulations for the industry. Among the license conditions, a licensee must: observe the international obligations of the United States; maintain positive control of spacecraft operations; maintain a tasking record in conjunction with other record-keeping requirements; provide U.S. Government access to and use of data when required for national security or foreign policy purposes; provide for U.S. Government review of all significant foreign agreements; obtain U.S. Government approval for any encryption devices used; make available unenhanced data to a "sensed state" as soon as such data are available and on reasonable cost terms and conditions; make available unenhanced data as requested by the U.S. Government Archive; and obtain a priori U.S. Government approval of all plans and procedures to deal with safe disposition of the satellite. Further information on NOAA's regulations and NOAA's licensing program is available at 
Monitoring and Enforcement: NOAA's enforcement mission is focused on the legislative mandate which states that the Secretary of Commerce has a continuing obligation to ensure that licensed imaging systems are operated lawfully to preserve the national security and foreign policies of the United States. NOAA has constructed an end-to-end monitoring and compliance program to review the activities of licensed companies. This program includes a pre-launch review, an operational baseline audit, and an annual comprehensive national security audit. If at any time there is suspicion or concern that a system is being operated unlawfully, a no-notice inspection may be initiated. Despite setbacks, three U.S. companies are now operational, with more firms expected to become so in the future. While NOAA does not disclose specific systems capabilities for proprietary reasons, its current licensing resolution thresholds for general commercial availability are as follows: 0.5 meter Ground Sample Distance (GSD) for panchromatic systems, 2 meter GSD for multi-spectral systems, 3 meter Impulse Response (IPR) for Synthetic Aperture Radar systems, and 20 meter GSD for hyperspectral systems (with certain 8-meter hyperspectral derived products also licensed for commercial distribution). These thresholds are subject to change based upon foreign availability and other considerations. 
It should also be noted that license applications are reviewed and granted on a case-by-case basis, pursuant to each system's technology and concept of operations. In 2001, NOAA, along with the Department of Commerce's International Trade Administration, commissioned a study by the RAND Corporation to assess the risks faced by the U.S. commercial remote sensing satellite industry. In commissioning this study, NOAA's goal was to bette

Stryker, Timothy


Thermal Analysis Methods for Aerobraking Heating  

NASA Technical Reports Server (NTRS)

As NASA begins exploration of other planets, a method of non-propulsively slowing vehicles at the planet, aerobraking, may become a valuable technique for managing vehicle design mass and propellant. An example of this is Mars Reconnaissance Orbiter (MRO), which will launch in late 2005 and reach Mars in March of 2006. In order to save propellant, MRO will use aerobraking to modify the initial orbit at Mars. The spacecraft will dip into the atmosphere briefly on each orbit, and during the drag pass, the atmospheric drag on the spacecraft will slow it, thus lowering the orbit apoapsis. The largest area on the spacecraft, and that most affected by the heat generated during the aerobraking process, is the solar arrays. A thermal analysis of the solar arrays was conducted at NASA Langley, to simulate their performance throughout the entire roughly 6-month period of aerobraking. Several interesting methods were used to make this analysis more rapid and robust. Two separate models were built for this analysis, one in Thermal Desktop for radiation and orbital heating analysis, and one in MSC.Patran for thermal analysis. The results from the radiation model were mapped in an automated fashion to the Patran thermal model that was used to analyze the thermal behavior during the drag pass. A high degree of automation in file manipulation as well as other methods for reducing run time were employed, since toward the end of the aerobraking period the orbit period is short, and in order to support flight operations the runs must be computed rapidly. All heating within the Patran Thermal model was combined in one section of logic, such that data mapped from the radiation model and aeroheating model, as well as skin temperature effects on the aeroheating and surface radiation, could be incorporated easily. This approach calculates the aeroheating at any given node, based on its position and temperature as well as the density and velocity at that trajectory point. 
Run times on several different processors, computer hard drives, and operating systems (Windows versus Linux) were evaluated.

Amundsen, Ruth M.; Gasbarre, Joseph F.; Dec, John A.



Scanning methods applied to bitemark analysis  

NASA Astrophysics Data System (ADS)

The 2009 National Academy of Sciences report on forensics focused criticism on pattern evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified based on dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutia and to investigate replication of detail in human skin. Results indicated that being a visco-elastic substrate, skin effectively reduces the resolution of measurement of dental detail. Conclusions indicate caution in individualization statements.

Bush, Peter J.; Bush, Mary A.



Apparatus And Method For Fluid Analysis  


The present invention is an apparatus and method for analyzing a fluid used in a machine or in an industrial process line. The apparatus has at least one meter placed proximate the machine or process line and in contact with the machine or process fluid for measuring at least one parameter related to the fluid. The at least one parameter is a standard laboratory analysis parameter. The at least one meter includes but is not limited to viscometer, element meter, optical meter, particulate meter, and combinations thereof.

Wilson, Bary W. (Richland, WA); Peters, Timothy J. (Richland, WA); Shepard, Chester L. (West Richland, WA); Reeves, James H. (Richland, WA)



[Proteomic methods of protein separation and analysis].  


The review briefly presents the principles and stages of such highly effective protein separation methods as capillary and 2D gel electrophoresis and liquid chromatography, the possibility of their successful combination with tandem mass spectrometry, and their application in various fields of proteomics. The main problems of proteomic analysis, as well as ways of solving them using mass spectrometry, are examined. Two main approaches to protein identification are described in the review, and the characteristics and capabilities of various top-down and bottom-up proteomic analysis programs are provided. PMID:25286518

Polunina, T A; Varshavskaia, Iu S; Grigor'eva, G V; Krasnov, Iu M



Method and apparatus for chromatographic quantitative analysis  


An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

Fritz, James S. (Ames, IA); Gjerde, Douglas T. (Ames, IA); Schmuckler, Gabriella (Haifa, IL)



Dynamic analysis methods for nuclear facilities  

SciTech Connect

A comparison is made between three different dynamic analysis methods commonly used in the analysis of nuclear facilities. The methods are applied to a typical non-reactor type nuclear facility; namely, an early configuration of the High Performance Fuel Laboratory which was to have been designed and constructed to house an automated fuel process line on the Hanford Reservation near Richland, Washington. The fuel to be handled was mixed plutonium and uranium in powder and pellet form which, therefore, required design for severe earthquake and tornado conditions. The structure is a two-story reinforced concrete shear wall building with a high bay on one end. The comparison is made for earthquake motion in the lateral horizontal direction only. The first method employs a three-degree-of-freedom spring mass system with the masses lumped at the three floor and roof slab levels. After shears are obtained they are distributed to the shear walls in proportion to their stiffnesses. Floor and roof slabs are assumed rigid, but eccentricities are accounted for in the shear distribution. The second method utilizes a pseudo three-dimensional stick model. The shear walls and horizontal floor and roof diaphragms are modeled as three-dimensional beam elements using the SAP IV computer code. All nodal points are in the X, Y plane (Z=0), but motions in this plane are restricted, with unrestricted translation in the Z direction. This enables shears to be obtained in all the walls and diaphragms without resorting to lumping walls together and, in addition, automatically accounts for eccentricities in the direction being considered. The third and last model is a three-dimensional finite element model. All walls and diaphragms are modeled using plane stress quadrilateral membrane elements, again using the SAP IV computer code.
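The first method in this abstract (a three-degree-of-freedom lumped-mass model with story shears distributed to walls in proportion to stiffness) can be sketched as follows. The masses and story stiffnesses below are placeholder values for illustration, not the High Performance Fuel Laboratory data:

```python
import numpy as np

# Illustrative three-lumped-mass shear building: masses at the two
# floor slabs and the roof (placeholder values, not the HPFL's).
m = np.array([2.0e5, 2.0e5, 1.5e5])   # kg
k = np.array([8.0e8, 6.0e8, 4.0e8])   # story lateral stiffnesses, N/m

# Tridiagonal stiffness matrix of the lumped-mass stick model.
K = np.array([[k[0] + k[1], -k[1],        0.0  ],
              [-k[1],        k[1] + k[2], -k[2]],
              [0.0,         -k[2],         k[2]]])
M = np.diag(m)

# Natural frequencies from the generalized eigenproblem K phi = w^2 M phi.
w2, phi = np.linalg.eig(np.linalg.solve(M, K))
w2 = w2.real                           # eigenvalues are real for this system
freqs_hz = np.sqrt(np.sort(w2)) / (2.0 * np.pi)

def wall_shears(V, wall_k):
    """Distribute a story shear V to individual walls in proportion to
    their lateral stiffnesses (rigid-diaphragm assumption, no torsion)."""
    wall_k = np.asarray(wall_k, dtype=float)
    return V * wall_k / wall_k.sum()
```

With two equally stiff walls and one twice as stiff, a 100 kN story shear splits 25/25/50, which is the stiffness-proportional distribution the abstract describes (eccentricity corrections would modify these fractions).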

Horsager, B. K.



Prescription Analysis of Pediatric Outpatient Practice in Nagpur City  

PubMed Central

Background: Medication errors are probably one of the most common types of medical errors, as medication is the most common health-care intervention. Knowing where and when errors are most likely to occur is generally felt to be the first step in trying to prevent these errors. Objective: To study prescribing patterns and errors in pediatric OPD prescriptions presenting to four community pharmacies across Nagpur city and to compare the prescription error rates across prescriber profiles. Materials and Methods: The study sample included 1376 valid pediatric OPD prescriptions presenting to four randomly selected community pharmacies in Nagpur, collected over a period of 2 months. Confirmed errors in the prescriptions were reviewed and analyzed. The core indicators for drug utilization studies, mentioned by WHO, were used to define errors. Results: The 1376 prescriptions included in the study were for a total of 3435 drugs, prescribed by 41 doctors. Fixed dose formulations dominated the prescribing pattern, many of which were irrational. Prescribing by market name was almost universal and generic prescriptions were for merely 254 (7.4%) drugs. The prescribing pattern also indicated polypharmacy with the average number of drugs per encounter of 2.5. Antibiotics were included in 1087 (79%) prescriptions, while injectable drugs were prescribed in 22 (1.6%) prescriptions. The prescription error score varied significantly across prescriber profiles. Conclusion: The findings of our study highlight the continuing crisis of the irrational drug prescribing in the country. PMID:20606924
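The WHO core prescribing indicators cited in this abstract are simple ratios, and the study's headline figures can be reproduced directly from the counts it reports:

```python
# WHO core prescribing indicators, computed from the counts reported
# in the abstract above.
n_prescriptions = 1376
n_drugs = 3435
n_generic = 254           # drugs prescribed by generic name
n_with_antibiotic = 1087  # prescriptions containing an antibiotic
n_with_injection = 22     # prescriptions containing an injectable

avg_drugs_per_encounter = n_drugs / n_prescriptions          # ~2.5
pct_generic = 100.0 * n_generic / n_drugs                    # ~7.4%
pct_antibiotic = 100.0 * n_with_antibiotic / n_prescriptions # ~79%
pct_injection = 100.0 * n_with_injection / n_prescriptions   # ~1.6%
```

Note that the generic-name indicator is a fraction of drugs, while the antibiotic and injection indicators are fractions of prescription encounters, matching how the abstract reports them.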

Pandey, Anuja A; Thakre, Subhash B; Bhatkule, Prakash R



Critical analysis of biomarkers in the current periodontal practice  

PubMed Central

Periodontal disease is a chronic microbial infection that triggers inflammation-mediated loss of the periodontal ligament and alveolar bone that supports the teeth. Because of the increasing prevalence and associated comorbidities, there is a need for the development of new diagnostic tests that can detect the presence of active disease, predict future disease progression, and evaluate the response to periodontal therapy, thereby improving the clinical management of periodontal patients. The diagnosis of active phases of periodontal disease and the identification of patients at risk for active disease represent challenges for clinical investigators and practitioners. Advances in diagnostic research are moving toward methods whereby the periodontal risk can be identified and quantified by objective measures using biomarkers. Patients with periodontitis may have elevated circulating levels of specific inflammatory markers that can be correlated to the severity of the disease. Advances in the use of oral fluids as possible biological samples for objective measures of the current disease state, treatment monitoring, and prognostic indicators have boosted saliva- and other oral-based fluids to the forefront of technology. Gingival crevicular fluid (GCF) is an inflammatory exudate that can be collected at the gingival margin or within the gingival crevice. This article highlights recent advances in the use of biomarker-based disease diagnostics that focus on the identification of active periodontal disease from plaque biofilms, GCF, and saliva. PMID:21976831

Khiste, Sujeet V.; Ranganath, V.; Nichani, Ashish S.; Rajani, V.



Comparative analysis of the methods for SADT determination.  


The self-accelerating decomposition temperature (SADT) is an important parameter that characterizes thermal safety in the transport of self-reactive substances. A great many articles have been published on various methodological aspects of SADT determination. Nevertheless, several serious problems remain that require further analysis and solution; some of them are considered in this paper. First, the four methods suggested by the United Nations "Recommendations on the Transport of Dangerous Goods" (TDG) are surveyed in order to reveal their features and limitations. The inconsistency between two definitions of SADT is then discussed. One definition is the basis for the US SADT test and the heat accumulation storage test (Dewar test); the other is used when the Adiabatic storage test or the Isothermal storage test is applied. It is shown that this inconsistency may result in different and, in some cases, unsafe estimates of SADT. The applicability of the Dewar test for determining SADT for solids is then considered. It is shown that this test can be applied to solids only in a restricted way, provided that an appropriate scale-up procedure is available. An advanced method based on the theory of the regular cooling mode is proposed, which ensures more reliable results from application of the Dewar test. The last part of the paper demonstrates how a kinetics-based simulation method helps in the evaluation of SADT in those complex but practical cases (in particular, stacks of packagings) where none of the methods recommended by TDG can be used. PMID:16889892

Kossoy, A A; Sheinman, I Ya



Deliberate Practice and Performance in Music, Games, Sports, Education, and Professions: A Meta-Analysis.  


More than 20 years ago, researchers proposed that individual differences in performance in such domains as music, sports, and games largely reflect individual differences in amount of deliberate practice, which was defined as engagement in structured activities created specifically to improve performance in a domain. This view is a frequent topic of popular-science writing, but is it supported by empirical evidence? To answer this question, we conducted a meta-analysis covering all major domains in which deliberate practice has been investigated. We found that deliberate practice explained 26% of the variance in performance for games, 21% for music, 18% for sports, 4% for education, and less than 1% for professions. We conclude that deliberate practice is important, but not as important as has been argued. PMID:24986855

Macnamara, Brooke N; Hambrick, David Z; Oswald, Frederick L



A method for obtaining practical flutter-suppression control laws using results of optimal control theory  

NASA Technical Reports Server (NTRS)

The results of optimal control theory are used to synthesize a feedback filter that forces the filtered frequency response to match a desired optimal frequency response over a finite frequency range. The matching is accomplished by employing a nonlinear programming algorithm to search for the coefficients of the feedback filter that minimize the error between the optimal and the filtered frequency responses. The method is applied to the synthesis of an active flutter-suppression control law for an aeroelastic wind-tunnel model, and the resulting control law is shown to suppress flutter over a wide range of subsonic Mach numbers. This is a promising method for synthesizing practical control laws from the results of optimal control theory.
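The coefficient-search step can be sketched in a few lines: pick a one-pole candidate filter, define the squared mismatch to a desired frequency response over a finite frequency grid, and minimize it. Both responses are invented, and a crude grid search stands in for the nonlinear programming algorithm; nothing here reproduces the actual flutter-suppression law.

```python
# Hypothetical "optimal" response the filter should match: 2/(jw + 3)
def target(w):
    return 2.0 / complex(3.0, w)

def filt(w, k, a):
    return k / complex(a, w)          # candidate one-pole filter k/(jw + a)

freqs = [0.1 * i for i in range(1, 101)]   # finite range: 0.1..10 rad/s

def error(k, a):
    """Sum-of-squares mismatch between candidate and target responses."""
    return sum(abs(filt(w, k, a) - target(w)) ** 2 for w in freqs)

# Derivative-free search over the coefficient grid (stand-in for the NLP)
_, k_opt, a_opt = min(((error(k / 10, a / 10), k / 10, a / 10)
                       for k in range(1, 60) for a in range(1, 60)),
                      key=lambda t: t[0])
```

The search recovers the target's gain and pole (k = 2, a = 3); a practical synthesis replaces the grid with a gradient-based nonlinear program and a higher-order filter structure.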

Newson, J. R.



Assessing Scientific Practices Using Machine-Learning Methods: How Closely Do They Match Clinical Interview Performance?  

NASA Astrophysics Data System (ADS)

The landscape of science education is being transformed by the new Framework for Science Education (National Research Council, A framework for K-12 science education: practices, crosscutting concepts, and core ideas. The National Academies Press, Washington, DC, 2012), which emphasizes the centrality of scientific practices—such as explanation, argumentation, and communication—in science teaching, learning, and assessment. A major challenge facing the field of science education is developing assessment tools that are capable of validly and efficiently evaluating these practices. Our study examined the efficacy of a free, open-source machine-learning tool for evaluating the quality of students' written explanations of the causes of evolutionary change relative to three other approaches: (1) human-scored written explanations, (2) a multiple-choice test, and (3) clinical oral interviews. A large sample of undergraduates (n = 104) exposed to varying amounts of evolution content completed all three assessments: a clinical oral interview, a written open-response assessment, and a multiple-choice test. Rasch analysis was used to compute linear person measures and linear item measures on a single logit scale. We found that the multiple-choice test displayed poor person and item fit (mean square outfit >1.3), while both oral interview measures and computer-generated written response measures exhibited acceptable fit (average mean square outfit for interview: person 0.97, item 0.97; computer: person 1.03, item 1.06). Multiple-choice test measures were more weakly associated with interview measures (r = 0.35) than the computer-scored explanation measures (r = 0.63). 
Overall, Rasch analysis indicated that computer-scored written explanation measures (1) have the strongest correspondence to oral interview measures; (2) are capable of capturing students' normative scientific and naive ideas as accurately as human-scored explanations, and (3) more validly detect understanding than the multiple-choice assessment. These findings demonstrate the great potential of machine-learning tools for assessing key scientific practices highlighted in the new Framework for Science Education.

Beggrow, Elizabeth P.; Ha, Minsu; Nehm, Ross H.; Pearl, Dennis; Boone, William J.



A concise method for mine soils analysis  

SciTech Connect

A large number of abandoned hard-rock mines exist in Colorado and other mountain-west states, many on public property. Public pressure and the resulting policy changes have become a driving force in the reclamation of these sites. Two of the key reclamation issues for these sites are the occurrence of acid-forming materials (AFMs) in mine soils and acid mine drainage (AMD) issuing from mine adits. An AMD treatment-system design project for the Forest Queen mine in Colorado's San Juan mountains raised the need for a simple, usable method for analyzing mine-land soils, both for suitability as a construction material and to determine the AFM content and potential for acid release. The authors have developed a simple, stepwise, go/no-go test for the analysis of mine soils. Samples were collected from a variety of sites in the Silverton, CO area and subjected to three tiers of tests: paste pH, Eh, and a 10% HCl fizz test; then total digestion in HNO{sub 3}/HCl, neutralization potential, exposure to meteoric water, and the toxicity characteristic leaching procedure (TCLP). All elemental analyses were performed with an inductively coupled plasma (ICP) spectrometer. Elimination of samples via the first two testing tiers left two remaining samples, which were subsequently subjected to column and sequential batch tests, with further elemental analysis by ICP. Based on these tests, one sample was chosen as suitable for use as a construction material for the Forest Queen treatment-system basins. Further simplification, and testing on two pairs of independent soil samples, resulted in a final analytical method suitable for general use.

Winkler, S.; Wildeman, T.; Robinson, R.; Herron, J.



Green analytical method development for statin analysis.  


A green analytical chemistry method was developed for pravastatin, fluvastatin, and atorvastatin analysis. An HPLC/DAD method using an ethanol-based mobile phase was studied on octadecyl-grafted silica with various grafting and column parameters such as particle size, core-shell, and monolith formats. Retention, efficiency, and detector linearity were optimized. Even for columns with particle sizes under 2 µm, the benefit of maintaining efficiency over a large range of flow rates was not obtained with the ethanol-based mobile phase, in contrast to an acetonitrile-based one. The strategy of shortening the analysis by increasing the flow rate therefore decreased efficiency with the ethanol-based mobile phase. An ODS-AQ YMC column, 50 mm × 4.6 mm, 3 µm, was selected as the best compromise between analysis time, statin separation, and efficiency. HPLC conditions were 1 mL/min, ethanol/formic acid (pH 2.5, 25 mM) (50:50, v/v), thermostated at 40 °C. To reduce solvent consumption during sample preparation, 0.5 mg/mL of each statin was found to be the highest concentration that respected detector linearity. These conditions were validated for each statin for content determination in highly concentrated hydro-alcoholic solutions. Solubility higher than 100 mg/mL was found for pravastatin and fluvastatin, whereas for atorvastatin calcium salt the maximum concentration was 2 mg/mL in hydro-alcoholic binary mixtures of 35% to 55% ethanol in water. Using atorvastatin instead of its calcium salt improved solubility. Highly concentrated statin solutions offer a potential fluid for Buccal Per-Mucous(®) administration, with the advantage of rapid and easy drug passage. PMID:25582487

Assassi, Amira Louiza; Roy, Claude-Eric; Perovitch, Philippe; Auzerie, Jack; Hamon, Tiphaine; Gaudin, Karen



Analysis of methods. [information systems evolution environment  

NASA Technical Reports Server (NTRS)

Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. 
Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity Relationship, Data Flow Diagrams, and Structure Charts, for inclusion in an integrated development support environment.

Mayer, Richard J. (editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.



A 5-Year Analysis of Peer-Reviewed Journal Article Publications of Pharmacy Practice Faculty Members  

PubMed Central

Objectives. To evaluate scholarship, as represented by peer-reviewed journal articles, among US pharmacy practice faculty members; contribute evidence that may better inform benchmarking by academic pharmacy practice departments; and examine factors that may be related to publication rates. Methods. Journal articles published by all pharmacy practice faculty members between January 1, 2006, and December 31, 2010, were identified. College and school publication rates were compared based on public vs. private status, being part of a health science campus, having a graduate program, and having doctor of pharmacy (PharmD) faculty members funded by the National Institutes of Health (NIH). Results. Pharmacy practice faculty members published 6,101 articles during the 5-year study period, and a pharmacy practice faculty member was the primary author on 2,698 of the articles. Pharmacy practice faculty members published an average of 0.51 articles per year. Pharmacy colleges and schools affiliated with health science campuses, at public institutions, with NIH-funded PharmD faculty members, and with graduate programs had significantly higher total publication rates compared with those that did not have these characteristics (p<0.006). Conclusion. Pharmacy practice faculty members contributed nearly 6,000 unique publications over the 5-year period studied. However, this reflects a rate of less than 1 publication per faculty member per year, suggesting that a limited number of faculty members produced the majority of publications. PMID:23049099

Spivey, Christina; Martin, Jennifer R.; Wyles, Christina; Ehrman, Clara; Schlesselman, Lauren S.



The experience of implementing evidence-based practice change: a qualitative analysis.  


The Oncology Nursing Society (ONS) and ONS Foundation worked together to develop the Institute for Evidence-Based Practice Change (IEBPC) program to facilitate the implementation of evidence-based practice (EBP) change in nursing. This analysis describes the experience of 19 teams of nurses from various healthcare settings who participated in the IEBPC program. Qualitative analysis of verbatim narratives of activities and observations during the process of implementing an EBP project was used to identify key themes in the experience. EBP implementation enabled participants to learn about their own practice and to experience empowerment through the evidence, and it ignited the spirit of inquiry, team work, and multidisciplinary collaboration. Experiences and lessons learned from nurses implementing EBP can be useful to others in planning EBP implementation. PMID:24080054

Irwin, Margaret M; Bergman, Rosalie M; Richards, Rebecca



Communication: Quantum polarized fluctuating charge model: a practical method to include ligand polarizability in biomolecular simulations.  


We present a simple and practical method to include ligand electronic polarization in molecular dynamics (MD) simulation of biomolecular systems. The method involves periodically spawning quantum mechanical (QM) electrostatic potential (ESP) calculations on an extra set of computer processors using molecular coordinate snapshots from a running parallel MD simulation. The QM ESPs are evaluated for the small-molecule ligand in the presence of the electric field induced by the protein, solvent, and ion charges within the MD snapshot. Partial charges on ligand atom centers are fit through the multi-conformer restrained electrostatic potential (RESP) fit method on several successive ESPs. The RESP method was selected since it produces charges consistent with the AMBER/GAFF force-field used in the simulations. The updated charges are introduced back into the running simulation when the next snapshot is saved. The result is a simulation whose ligand partial charges continuously respond in real-time to the short-term mean electrostatic field of the evolving environment without incurring additional wall-clock time. We show that (1) by incorporating the cost of polarization back into the potential energy of the MD simulation, the algorithm conserves energy when run in the microcanonical ensemble and (2) the mean solvation free energies for 15 neutral amino acid side chains calculated with the quantum polarized fluctuating charge method and thermodynamic integration agree better with experiment relative to the Amber fixed charge force-field. PMID:22191857
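The update cycle can be caricatured as follows: every N steps a toy "RESP refit" replaces the ligand charges in response to a stand-in environment field, and a uniform shift keeps the total molecular charge fixed. The linear response model, coefficients, and sinusoidal field are all invented; the real method evaluates QM electrostatic potentials and runs a multi-conformer RESP fit on separate processors.

```python
import math

base_q = [0.4, -0.4, 0.0]    # hypothetical ligand partial charges, e
polar  = [0.05, 0.05, 0.02]  # invented per-atom response coefficients

def resp_update(step):
    """Recompute charges from the instantaneous 'environment field'."""
    field = math.sin(0.01 * step)          # stand-in for the induced field
    new_q = [b + p * field for b, p in zip(base_q, polar)]
    shift = (sum(base_q) - sum(new_q)) / len(new_q)
    return [qi + shift for qi in new_q]    # re-neutralized charges

q = list(base_q)
for step in range(1000):
    if step % 100 == 0:       # spawn a "QM/RESP snapshot" every 100 steps
        q = resp_update(step)
    # ... the MD force evaluation would use charges `q` here ...
```

The re-neutralization step mirrors a basic constraint of any fluctuating-charge scheme: the ligand's net charge must be conserved across refits.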

Kimura, S Roy; Rajamani, Ramkumar; Langley, David R



Analysis using surface wave methods to detect shallow manmade tunnels  

NASA Astrophysics Data System (ADS)

A multi-method seismic surface-wave approach was used to locate and estimate the dimensions of shallow, horizontally oriented cylindrical voids or manmade tunnels. The primary analytical methods employed were Attenuation Analysis of Rayleigh Waves (AARW), Surface Wave Common Offset (SWCO), and Spiking Filter (SF). Surface-wave data were acquired at six study sites using a towed 24-channel land streamer and an elastic-band-accelerated weight-drop seismic source. Each site was underlain by one tunnel, nominally 1 meter in diameter and depth. The acquired surface-wave data were analyzed automatically, and the interpretations were then compared to field measurements to ascertain their accuracy. The purpose of this research is to analyze the field response of Rayleigh waves to the presence of shallow tunnels. The SF technique used the variation of the seismic signal response along a geophone array to determine void presence in the subsurface. The AARW technique was expanded for practical application, as suggested by Nasseri (2006), to indirectly estimate void location using a Normalized Energy Distance (NED) parameter for vertical tunnel-dimension measurements and normalized Cumulative Logarithmic Decrement (CALD) values for horizontal tunnel-dimension measurements. Confidence in tunnel detections is presented as a measure of NED signal strength, while false positives are reduced by AARW through analysis of sub-array data. The development of such estimates is a promising tool for engineers who require quantitative measurements of manmade tunnels in the shallow subsurface.
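As a numerical illustration of the decrement-based idea, the sketch below computes pairwise logarithmic decrements of Rayleigh-wave amplitudes along a geophone array and a normalized cumulative curve in the spirit of the CALD values mentioned above. The amplitude profile and its dip near the hypothetical tunnel are entirely invented; this is not the expanded AARW procedure itself.

```python
import math

# Toy peak amplitudes at 12 geophones; the dip near geophones 6-8
# mimics extra attenuation above a shallow void (values invented).
amps = [1.00, 0.95, 0.90, 0.86, 0.82, 0.60, 0.45, 0.58, 0.78, 0.74, 0.71, 0.68]

def log_decrements(a):
    """Logarithmic decrement between each pair of adjacent geophones."""
    return [math.log(a[i] / a[i + 1]) for i in range(len(a) - 1)]

def normalized_cumulative(decs):
    """Running sum of decrements normalized to the total (CALD-style)."""
    total, run, out = sum(decs), 0.0, []
    for d in decs:
        run += d
        out.append(run / total)
    return out

decs = log_decrements(amps)
anomaly = max(range(len(decs)), key=lambda i: decs[i])  # strongest decay
```

Locating the strongest pairwise decay is the crudest possible void indicator; the AARW workflow adds sub-array analysis and NED-based confidence measures on top of statistics like these.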

Putnam, Niklas Henry


Skinner Meets Piaget on the Reggio Playground: Practical Synthesis of Applied Behavior Analysis and Developmentally Appropriate Practice Orientations  

ERIC Educational Resources Information Center

We focus on integrating developmentally appropriate practices, the project approach of Reggio Emilia, and a behavior analytic model to support a quality preschool environment. While the above practices often are considered incompatible, we have found substantial overlap and room for integration of these perspectives in practical application. With…

Warash, Bobbie; Curtis, Reagan; Hursh, Dan; Tucci, Vicci



A Quantitative Analysis and Natural History of B. F. Skinner's Coauthoring Practices  

PubMed Central

This paper describes and analyzes B. F. Skinner's coauthoring practices. After identifying his 35 coauthored publications and 27 coauthors, we analyze his coauthored works by their form (e.g., journal articles) and kind (e.g., empirical); identify the journals in which he published and their type (e.g., data-type); describe his overall and local rates of publishing with his coauthors (e.g., noting breaks in the latter); and compare his coauthoring practices with his single-authoring practices (e.g., form, kind, journal type) and with those in the scientometric literature (e.g., majority of coauthored publications are empirical). We address these findings in the context of describing the natural history of Skinner's coauthoring practices. Finally, we describe some limitations in our methods and offer suggestions for future research. PMID:22532732

McKerchar, Todd L; Morris, Edward K; Smith, Nathaniel G



A practical comparison of methods for detecting transcription factor binding sites in ChIP-seq experiments  

PubMed Central

Background Chromatin immunoprecipitation coupled with massively parallel sequencing (ChIP-seq) is increasingly being applied to study transcriptional regulation on a genome-wide scale. While numerous algorithms have recently been proposed for analysing the large ChIP-seq datasets, their relative merits and potential limitations remain unclear in practical applications. Results The present study compares the state-of-the-art algorithms for detecting transcription factor binding sites in four diverse ChIP-seq datasets under a variety of practical research settings. First, we demonstrate how the biological conclusions may change dramatically when the different algorithms are applied. The reproducibility across biological replicates is then investigated as an internal validation of the detections. Finally, the predicted binding sites with each method are compared to high-scoring binding motifs as well as binding regions confirmed in independent qPCR experiments. Conclusions In general, our results indicate that the optimal choice of the computational approach depends heavily on the dataset under analysis. In addition to revealing valuable information to the users of this technology about the characteristics of the binding site detection approaches, the systematic evaluation framework provides also a useful reference to the developers of improved algorithms for ChIP-seq data. PMID:20017957
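A tiny example of the replicate-consistency idea: treat each replicate's detected binding sites as genomic intervals and measure the fraction of one replicate's peaks confirmed by the other. The intervals are invented and no actual peak caller is reimplemented; the study's evaluation also uses far richer criteria (binding motifs, independent qPCR confirmation).

```python
def overlaps(a, b):
    """True if half-open genomic intervals a=(start, end) and b overlap."""
    return a[0] < b[1] and b[0] < a[1]

def reproducibility(rep1, rep2):
    """Fraction of rep1 peaks confirmed by at least one rep2 peak --
    a simple stand-in for the replicate-consistency checks compared
    across peak-calling algorithms."""
    hits = sum(any(overlaps(p, q) for q in rep2) for p in rep1)
    return hits / len(rep1)

# Invented peak calls from two biological replicates
rep1 = [(100, 200), (500, 620), (1000, 1050)]
rep2 = [(150, 230), (900, 980)]
score = reproducibility(rep1, rep2)
```

Note the measure is asymmetric (each replicate can confirm a different fraction of the other), which is one reason published comparisons report both directions or a symmetrized statistic.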



Undergraduate physiotherapy students' competencies, attitudes and perceptions after integrated educational pathways in evidence-based practice: a mixed methods study.  


This mixed methods study aimed to explore the perceptions/attitudes, evaluate the knowledge/skills, and investigate the clinical behaviours of undergraduate physiotherapy students exposed to a composite education curriculum on evidence-based practice (EBP). Students' knowledge and skills were assessed before and after integrated learning activities using the Adapted Fresno test, whereas their behaviour in EBP was evaluated by examining their internship documentation. Students' perceptions and attitudes were explored through four focus groups. Sixty-two students agreed to participate in the study. The within-group mean differences (A-Fresno test) were 34.2 (95% CI 24.4 to 43.9) in the first year and 35.1 (95% CI 23.2 to 47.1) in the second year; no statistically significant change was observed in the third year. Seventy-six percent of the second-year and 88% of the third-year students reached the pass score. Internship documentation gave evidence of PICOs and database searches (95-100%) and critical appraisal of internal validity (25-75%) but not of external validity (5-15%). The correct application of these items ranged from 30 to 100%. Qualitative analysis of the focus groups indicated students valued EBP but perceived many barriers, with clinicians being both an obstacle and a model. Key elements for changing students' behaviours seem to be the internship environment and the possibility of continuous practice and feedback. PMID:24766584

Bozzolan, M; Simoni, G; Balboni, M; Fiorini, F; Bombardi, S; Bertin, N; Da Roit, M



Methods for spectral image analysis by exploiting spatial simplicity  


Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.

Keenan, Michael R. (Albuquerque, NM)



Peering inside the Clock: Using Success Case Method to Determine How and Why Practice-Based Educational Interventions Succeed  

ERIC Educational Resources Information Center

Introduction: No educational method or combination of methods will facilitate implementation of clinical practice guidelines in all clinical contexts. To develop an empirical basis for aligning methods to contexts, we need to move beyond "Does it work?" to also ask "What works for whom and under what conditions?" This study employed Success Case…

Olson, Curtis A.; Shershneva, Marianna B.; Brownstein, Michelle Horowitz



Practical Semantic Analysis of Web Sites and Documents  

E-print Network

We make a parallel between programs and Web sites and present some examples of semantic constraints. Keywords: semantics, logic programming, web sites, information system, knowledge management, content management

Paris-Sud XI, Université de


Identifying Evidence-Based Practices in Special Education through High Quality Meta-Analysis  

ERIC Educational Resources Information Center

The purpose of this study was to determine if meta-analysis can be used to enhance efforts to identify evidence-based practices (EBPs). In this study, the quality of included studies acted as the moderating variable. I used the quality indicators for experimental and quasi-experimental research developed by Gersten, Fuchs, Coyne, Greenwood, and…

Friedt, Brian



Abstract: Using System Dynamics Analysis for Evaluating the Sustainability of “Complete Streets” Practices  

EPA Science Inventory

Abstract: Using System Dynamics Analysis for Evaluating the Sustainability of “Complete Streets” Practices Primary Author: Nicholas R. Flanders 109 T.W. Alexander Drive Mail Code: E343-02 Research Triangle Park, NC 27709 919-541-3660 Topic cat...


Hotel companies' environmental policies and practices: a content analysis of their web pages  

Microsoft Academic Search

Purpose – The purpose of this paper is to analyze the environmental management policies and practices of the top 50 hotel companies as disclosed on their corporate web sites. Design/methodology/approach – This study employed content analysis to review the web sites of the top 50 hotel companies as defined herein. Findings – Only 46 per cent of the selected hotel



"Contradictory semiotic analysis": to bridge literacies and activities in Design practice  

E-print Network

"Contradictory semiotic analysis": to bridge literacies and activities in Design practice. … anticipate multiple semiotic literacies when they bridge from a real-life situation to a mediated situation … the atmosphere of a class. To address this issue, they deployed a specific semiotic methodology that we called

Paris-Sud XI, Université de


Homotopy analysis method for quadratic Riccati differential equation  

Microsoft Academic Search

In this paper, the quadratic Riccati differential equation is solved by means of an analytic technique, namely the homotopy analysis method (HAM). Comparisons are made among Adomian's decomposition method (ADM), the homotopy perturbation method (HPM), the homotopy analysis method, and the exact solution. The results reveal that the proposed method is very effective and simple.
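Any approximate solution in such a comparison can be sanity-checked against the closed form. The sketch below integrates a quadratic Riccati equation of the kind commonly used as the benchmark in this literature, y' = 1 + 2y − y² with y(0) = 0, using classical RK4 and compares it to the exact solution; the HAM series itself, which needs symbolic machinery, is not reproduced here.

```python
import math

def f(t, y):
    """RHS of the quadratic Riccati benchmark: y' = 1 + 2y - y^2."""
    return 1.0 + 2.0 * y - y * y

def rk4(y0, t_end, n=1000):
    """Classical 4th-order Runge-Kutta from t = 0 to t_end."""
    h, t, y = t_end / n, 0.0, y0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

def exact(t):
    """Closed-form solution with y(0) = 0."""
    s2 = math.sqrt(2.0)
    return 1.0 + s2 * math.tanh(s2 * t + 0.5 * math.log((s2 - 1) / (s2 + 1)))
```

Truncated HAM, HPM, or ADM series can be checked the same way: evaluate the series at a given t and compare with exact(t).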

Yue Tan; Saeid Abbasbandy



Practice Makes Perfect: Improving Students' Skills in Understanding and Avoiding Plagiarism with a Themed Methods Course  

ERIC Educational Resources Information Center

To address the issue of plagiarism, students in two undergraduate Research Methods and Analysis courses conducted, analyzed, and wrote up original research on the topic of plagiarism. Students in an otherwise identical course completed the same assignments but examined a different research topic. At the start and end of the semester, all students…

Estow, Sarah; Lawrence, Eva K.; Adams, Kathrynn A.



Analysis of an inquiry-oriented inservice program in affecting science teaching practices  

NASA Astrophysics Data System (ADS)

This study was an examination of how science teachers' teaching abilities (content and pedagogical knowledge and skills) were affected by an inquiry-oriented science education professional development program. The study researched the characteristics of an inservice program, Microcosmos, designed to equip teachers with new perspectives on how to stimulate students' learning and to promote a self-reflective approach to implementing instructional practices that improve teachers' and students' roles in the science classroom. The Microcosmos Inservice Program, which focused on the use of microorganisms as a vehicle to teach science in middle and high school grades, was funded by the National Science Foundation and developed by the Microcosmos Project based at the School of Education, Boston University. The teacher-training program's main objective was to show teachers and other educators how the smallest life forms, the microbes, can be a usable and dynamic way to stimulate science interest in students of all ages. It combines and integrates a number of training components that appear to be consistent with the recommendations listed in the major reform initiatives. The goal of the study was to explore whether the program provoked any changes in the pedagogical practices of teachers over time, and whether these changes fostered inquiry-based practices in the classroom. The exploratory analysis used a qualitative methodology that followed a longitudinal design for the collection of data gathered from a sample of 31 participants. The data were collected in two phases. Phase One, the Case History group, involved 5 science teachers over a period of seven years. Phase Two, the Expanded Teacher sample, involved 26 teachers (22 new teachers plus four teachers from Phase One) contacted at two different points in time during the study. 
Multiple data sources allowed for the collection of a varied and rigorous set of data for each individual in the sample. The primary data source was semi-structured interviews. Secondary data sources included on-site visits before and after the training, classroom observations, teachers' self-report protocols and questionnaires, and documents and examples of teacher work developed during the inservice training. The data were examined for evidence of change in: teachers' self-reported content-specific gains; teachers' self-reported and observed changes in their teaching methods and approach to curriculum; and teachers' self-reported and observed changes in classroom practices as a result of the content and the pedagogy acting together and supplementing each other. A major finding of the study confirmed the benefits of inservice activities with an integrated focus on science content and pedagogy for enhancing teachers' approach to instruction. The findings give renewed emphasis to the importance that inquiry-based practices for working with teachers, combined with a specific subject-matter focus, have in designing effective professional development. This combined approach, in some instances, contributed to important gains in the pedagogical content knowledge that teachers needed in order to effectively implement the Microcosmos learning experiences.

Santamaria Makang, Doris


Reforming High School Science for Low-Performing Students Using Inquiry Methods and Communities of Practice  

NASA Astrophysics Data System (ADS)

Some schools fall short of the high demand to increase science scores on state exams because low-performing students enter high school unprepared for high school science. Low-performing students are not successful in high school for many reasons; however, using inquiry methods has improved students' understanding of science concepts. The purpose of this qualitative research study was to investigate teachers' lived experiences with using inquiry methods to motivate low-performing high school science students in an inquiry-based program called Xtreem Science. Fifteen teachers were selected from the Xtreem Science program, a program designed to assist teachers in motivating struggling science students. The research questions involved understanding (a) teachers' experiences in using inquiry methods, (b) challenges teachers face in using inquiry methods, and (c) how teachers describe students' responses to inquiry methods. The data collection and analysis strategy focused on capturing the teachers' feelings, perceptions, and attitudes in their lived experience of teaching with inquiry methods and their experience in motivating struggling students. Analysis of interview responses revealed that teachers had good experiences with inquiry and expressed that inquiry impacted their teaching style and approach to topics, and students felt that using inquiry methods impacted student learning for the better. Inquiry gave low-performing students opportunities to catch up and learn information that moved them to the next level of science courses. Implications for positive social change include providing teachers and school district leaders with information to help improve the performance of low-performing science students.

Bolden, Marsha Gail


New methods for sensitivity analysis of chaotic dynamical systems  

E-print Network

Computational methods for sensitivity analysis are invaluable tools for fluid dynamics research and engineering design. These methods are used in many applications, including aerodynamic shape optimization and adaptive ...

Blonigan, Patrick Joseph



'Understanding' as a Practical Issue in Sexual Health Education for People With Intellectual Disabilities: A Study Using Two Qualitative Methods.  


Objective: Sexual health education is important in addressing the health and social inequalities faced by people with intellectual disabilities. However, provision of health-related advice and education to people with various types and degrees of linguistic and learning difficulties involves addressing complex issues of language and comprehension. This article reports an exploratory study using two qualitative methods to examine the delivery of sexual health education to people with intellectual disabilities. Methods: Four video-recordings of sexual health education sessions were collected. Conversation analysis was used to examine in detail how such education occurs as a series of interactions between educators and learners. Interviews with four educators were carried out and analyzed using thematic analysis. Results: The analysis shows how educators anticipate problems of comprehension and how they respond when there is evidence that a person does not understand the activity or the educational message. This occurs particularly when verbal prompts involve long sentences and abstract concepts. We show a characteristic pattern that arises in these situations, in which both educator and learner jointly produce a superficially correct response. Conclusions: Although interviews allow us some insight into contextual issues, strategy, and aspects of sexual health education that occur outside of the actual teaching sessions, analysis of actual interactions can show us patterns that occur between educators and learners when comprehension is in question. Addressing how sexual health education is delivered in practice and in detail provides valuable lessons about how such education can be improved. PMID:25150539

Finlay, W M L; Rohleder, Poul; Taylor, Natalie; Culfear, Hollie



Analysis of two methods to evaluate antioxidants.  


This exercise is intended to introduce undergraduate biochemistry students to the analysis of antioxidants as a biotechnological tool. In addition, some statistical resources will also be used and discussed. Antioxidants play an important metabolic role, preventing oxidative stress-mediated cell and tissue injury. Knowing the antioxidant content of nutritional components can help make informed decisions about diet design, and increase the commercial value of antioxidant-rich natural products. As a reliable and convenient technique to evaluate the whole spectrum of antioxidants present in biological samples is lacking, the general consensus is to use more than one technique. We have chosen two widely used and inexpensive methods, Trolox-equivalent antioxidant capacity and the ferric reducing antioxidant power assays, to evaluate the antioxidant content of several fruits, and to compare and analyze the correlation between both assays. PMID:22807430
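The core of the comparison described above is a correlation between paired assay results across samples. A minimal sketch of that step, with invented readings (the fruit values below are illustrative, not the study's data):

```python
from statistics import mean, stdev

def pearson_r(xs, ys):
    """Pearson correlation between paired assay measurements."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Hypothetical antioxidant readings for five fruit samples
teac = [1.2, 2.5, 0.8, 3.1, 1.9]   # Trolox-equivalent antioxidant capacity
frap = [1.0, 2.7, 0.9, 3.3, 1.7]   # ferric reducing antioxidant power

r = pearson_r(teac, frap)
```

An r close to 1 indicates that the two assays rank the samples' antioxidant content consistently, which is the kind of agreement the exercise asks students to evaluate.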

Tomasina, Florencia; Carabio, Claudio; Celano, Laura; Thomson, Leonor



Perceptions and Attitudes of Medical Students towards Two Methods of Assessing Practical Anatomy Knowledge  

PubMed Central

Objectives: Traditionally, summative practical examination in anatomy takes the form of ‘spotters’ consisting of a stream of prosections, radiological images and dissections with pins indicating specific structures. Recently, we have started to administer similar examinations online using the quiz facility in Moodle™ (a free, open-source web application for producing modular internet-based courses) in addition to the traditional format. This paper reports on an investigation into students’ perceptions of each assessment environment. Methods: Over a 3-year period, practical assessment in anatomy was conducted either in traditional format or online via learning management software called Moodle™. All students exposed to the two examination formats at the College of Medicine & Health Sciences, Sultan Qaboos University, Oman, were divided into two categories: junior (Year 3) and senior (Year 4). An evaluation of their perception of both examination formats was conducted using a self-administered questionnaire consisting of restricted and free response items. Results: More than half of all students expressed a clear preference for the online environment and believed it was more exam-friendly. This preference was higher amongst senior students. Compared to females, male students preferred the online environment. Senior students were less likely to study on cadavers when the examination was conducted online. Specimen quality, ability to manage time, and seating arrangements were major advantages identified by students who preferred the online format. Conclusion: Computer-based practical examinations in anatomy appeared to be generally popular with our students. The students adopted a different approach to study when the exam was conducted online as compared to the traditional ‘steeplechase’ format. PMID:22087381

Inuwa, Ibrahim M; Taranikanti, Varna; Al-Rawahy, Maimouna; Habbal, Omar



The UBI-QEP method: A practical theoretical approach to understanding chemistry on transition metal surfaces  

NASA Astrophysics Data System (ADS)

In this review we examine the presently available theoretical techniques for determining metal surface reaction energetics. The unity bond index-quadratic exponential potential (UBI-QEP) method, which provides heats of adsorption and reaction activation barriers with a typical accuracy of 1-3 kcal/mol, emerges as the method with the widest applicability for complex and practically important reaction systems. We discuss in detail the theoretical foundations of the analytic UBI-QEP method, which employs the most general two-body interaction potentials. The potential variable, named a bond index, is a general exponential function of the two-center bond distance. The bond indices of interacting bonds are assumed to be conserved at unity (up to the dissociation point), and we cite state-of-the-art ab initio calculations to support this assumption. The UBI-QEP method allows one to calculate the reaction energetics in a straightforward variational way. We summarize the analytic formulas for adsorbate binding energies in various coordination modes and for intrinsic and diffusion activation barriers. We also describe a computer program which makes UBI-QEP calculations fully automated. The normalized bond index-molecular dynamics (NBI-MD) simulation technique, which is an adaptation of the UBI-QEP reactive potential functions to molecular dynamics, is described. Detailed summaries of applications are given which include the Fischer-Tropsch synthesis, oxygen-assisted X-H bond cleavage, hydrogen peroxide, methanol and ammonia syntheses, decomposition and reduction of NO, and SOx chemistry.

Shustorovich, Evgeny; Sellers, Harrell


Visceral fat estimation method by bioelectrical impedance analysis and causal analysis  

NASA Astrophysics Data System (ADS)

It has been clarified that abdominal visceral fat accumulation is closely associated with lifestyle disease and metabolic syndrome. The gold standard in medical fields is visceral fat area measured by an X-ray computed tomography (CT) scan or magnetic resonance imaging. However, these measurements are highly invasive and costly; in particular, a CT scan involves X-ray exposure. This is why medical fields need an instrument for visceral fat measurement that is minimally invasive, easy to use, and low cost. The article proposes a simple and practical method of visceral fat estimation employing bioelectrical impedance analysis and causal analysis. In the method, abdominal shape and the dual impedances of the abdominal surface and the whole body are measured to estimate visceral fat area based on a cause-effect structure. The structure is designed according to the nature of abdominal body composition and fine-tuned by statistical analysis. Experiments were conducted to investigate the proposed model: 180 subjects were recruited and measured by both a CT scan and the proposed method. The acquired model explained the measurement principle well, and the correlation coefficient with the CT scan measurements is 0.88.
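As a loose illustration of the statistical fine-tuning step, the sketch below fits a single-predictor least-squares model to synthetic subjects. The generating rule, coefficients, and noise level are all invented, and the paper's actual model uses a richer cause-effect structure over abdominal shape and dual impedances rather than waist circumference alone.

```python
import random

random.seed(1)

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Synthetic subjects: an invented generating rule stands in for
# CT-measured visceral fat area (VFA, cm^2); waist circumference in cm.
waist = [random.uniform(70, 110) for _ in range(50)]
vfa = [2.0 * w - 80 + random.gauss(0, 8) for w in waist]

a, b = fit_line(waist, vfa)
```

With enough subjects, the fitted slope and intercept recover the generating rule despite the measurement noise, which is the premise behind calibrating the estimator against CT measurements.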

Nakajima, Hiroshi; Tasaki, Hiroshi; Tsuchiya, Naoki; Hamaguchi, Takehiro; Shiga, Toshikazu



In-Service Teacher Training in Japan and Turkey: A Comparative Analysis of Institutions and Practices  

ERIC Educational Resources Information Center

The purpose of this study is to compare policies and practices relating to teacher in-service training in Japan and Turkey. On the basis of the findings of the study, suggestions are made about in-service training activities in Turkey. The research was carried out using qualitative research methods. In-service training activities in the two education…

Bayrakci, Mustafa



Optimum compression to ventilation ratios in CPR under realistic, practical conditions: a physiological and mathematical analysis  

Microsoft Academic Search

Objective: To develop and evaluate a practical formula for the optimum ratio of compressions to ventilations in cardiopulmonary resuscitation (CPR). The optimum value of a variable is that for which a desired result is maximized. Here the desired result is assumed to be either oxygen delivery to peripheral tissues or a combination of oxygen delivery and waste product removal. Method:

Charles F. Babbs; Karl B. Kern



Assessing performance of conservation-based Best Management Practices: Coarse vs. fine-scale analysis  

Technology Transfer Automated Retrieval System (TEKTRAN)

Background/Questions/Methods Animal agriculture in the Spring Creek watershed of central Pennsylvania contributes sediment to the stream and ultimately to the Chesapeake Bay. Best Management Practices (BMPs) such as streambank buffers are intended to intercept sediment moving from heavy-use areas to...


Interpreting the Meaning of Grades: A Descriptive Analysis of Middle School Teachers' Assessment and Grading Practices  

ERIC Educational Resources Information Center

This descriptive, non-experimental, quantitative study was designed to answer the broad question, "What do grades mean?" Core academic subject middle school teachers from one large, suburban school district in Virginia were administered an electronic survey that asked them to report on aspects of their grading practices and assessment methods for…

Grimes, Tameshia Vaden



Confirmatory Factor Analysis on the Professional Suitability Scale for Social Work Practice  

ERIC Educational Resources Information Center

Objective: This article presents a validation study to examine the factor structure of an instrument designed to measure professional suitability for social work practice. Method: Data were collected from registered social workers in a provincial mailed survey. The response rate was 23.2%. After eliminating five cases with multivariate outliers,…

Tam, Dora M. Y.; Twigg, Robert C.; Boey, Kam-Wing; Kwok, Siu-Ming



A Qualitative Analysis of an Advanced Practice Nurse-Directed Transitional Care Model Intervention  

ERIC Educational Resources Information Center

Purpose: The purpose of this study was to describe barriers and facilitators to implementing a transitional care intervention for cognitively impaired older adults and their caregivers, led by advanced practice nurses (APNs). Design and Methods: APNs implemented an evidence-based protocol to optimize transitions from hospital to home. An…

Bradway, Christine; Trotta, Rebecca; Bixby, M. Brian; McPartland, Ellen; Wollman, M. Catherine; Kapustka, Heidi; McCauley, Kathleen; Naylor, Mary D.



Flutter and Divergence Analysis using the Generalized Aeroelastic Analysis Method  

NASA Technical Reports Server (NTRS)

The Generalized Aeroelastic Analysis Method (GAAM) is applied to the analysis of three well-studied checkcases: restrained and unrestrained airfoil models, and a wing model. An eigenvalue iteration procedure is used for converging upon roots of the complex stability matrix. For the airfoil models, exact root loci are given which clearly illustrate the nature of the flutter and divergence instabilities. The singularities involved are enumerated, including an additional pole at the origin for the unrestrained airfoil case and the emergence of an additional pole on the positive real axis at the divergence speed for the restrained airfoil case. Inconsistencies and differences among published aeroelastic root loci and the new, exact results are discussed and resolved. The generalization of a Doublet Lattice Method computer code is described and the code is applied to the calculation of root loci for the wing model for incompressible and for subsonic flow conditions. The error introduced in the reduction of the singular integral equation underlying the unsteady lifting surface theory to a linear algebraic equation is discussed. Acknowledging this inherent error, the solutions of the algebraic equation by GAAM are termed 'exact.' The singularities of the problem are discussed and exponential series approximations used in the evaluation of the kernel function shown to introduce a dense collection of poles and zeroes on the negative real axis. Again, inconsistencies and differences among published aeroelastic root loci and the new 'exact' results are discussed and resolved. In all cases, aeroelastic flutter and divergence speeds and frequencies are in good agreement with published results. The GAAM solution procedure allows complete control over Mach number, velocity, density, and complex frequency. 
Thus all points on the computed root loci can be matched-point, consistent solutions without recourse to complex mode tracking logic or dataset interpolation, as in the k and p-k solution methods.

Edwards, John W.; Wieseman, Carol D.



Integrated force method versus displacement method for finite element analysis  

NASA Technical Reports Server (NTRS)

A novel formulation termed the integrated force method (IFM) has been developed in recent years for analyzing structures. In this method all the internal forces are taken as independent variables, and the system equilibrium equations (EE's) are integrated with the global compatibility conditions (CC's) to form the governing set of equations. In IFM the CC's are obtained from the strain formulation of St. Venant, and no choices of redundant load systems have to be made, in contrast to the standard force method (SFM). This property of IFM allows the generation of the governing equation to be automated straightforwardly, as it is in the popular stiffness method (SM). In this report IFM and SM are compared relative to the structure of their respective equations, their conditioning, required solution methods, overall computational requirements, and convergence properties as these factors influence the accuracy of the results. Overall, this new version of the force method produces more accurate results than the stiffness method for comparable computational cost.

Patnaik, Surya N.; Berke, Laszlo; Gallagher, Richard H.



Recent Advances of Information Entropy Estimation Method for Practical Hydrological Variables  

NASA Astrophysics Data System (ADS)

The concept of Shannon's information entropy has been widely used in hydrology and water resources. With the increasing application of the Bayesian framework and information theory, hydrologists require a robust and accurate method for entropy estimation. Most hydrologists prefer the intuitive bin-counting method to compute entropy, while some more sophisticated methods have also been applied. In this research, we first present the special characteristics of practical hydrological variables, such as (1) the zero effect (e.g. no-rainfall days in a daily precipitation series); (2) the discreteness effect (e.g. the minimum unit that a measuring instrument can report); (3) the choice of optimal bin width; and (4) the skewness of the data. Then we design special techniques to deal with these characteristics. Furthermore, we extend the techniques to 1D, 2D and high-dimensional entropy, Kullback-Leibler divergence, mutual information and transfer entropy. Last but not least, we also present a possible improvement of the Taylor diagram based on entropy and mutual information.
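A hedged sketch of the bin-counting estimator with one of the listed adjustments, the zero effect: zeros are given their own bin so the point mass (e.g. no-rain days) is not smeared into a continuous bin. The bin width and the data are illustrative, not from the study.

```python
from math import log2
from collections import Counter

def binned_entropy(values, bin_width):
    """Shannon entropy (bits) of a sample via fixed-width bin counting.
    Zero values get a dedicated bin so the point mass at zero
    (e.g. no-rainfall days) is not merged into a continuous bin."""
    bins = Counter(0 if v == 0 else int(v // bin_width) + 1 for v in values)
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in bins.values())

# illustrative daily precipitation series (mm), dominated by dry days
rain = [0, 0, 0, 1.2, 0.4, 3.5, 0, 7.1, 0.2, 0]
h = binned_entropy(rain, bin_width=2.0)
```

A constant series has zero entropy, and entropy grows as the mass spreads across more bins; the choice of bin width directly controls the estimate, which is why the abstract singles it out as a characteristic needing special treatment.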

Gong, W.; Yang, D.; Gupta, H. V.; Nearing, G. S.



A practical method for quickly determining electrode positions in high-density EEG studies.  


This report describes a simple and practical method for determining electrode positions in high-density EEG studies. The method reduces the number of electrodes for which an accurate three-dimensional location must be measured, thus minimizing experimental set-up time and the possibility of digitization error. For each electrode cap, a reference data set is first established by placing the cap on a reference head and digitizing the 3-D position of each channel. A set of control channels is pre-selected so that it is adequately distributed over the cap. A simple choice could be the standard 19 channels of the International 10-20 system or their closest substitutes. In a real experiment, only the 3-D positions of these control channels need to be measured, and the position of each of the remaining channels is calculated from the position data of the same channels in the reference data set using a local transformation determined by the nearest three or four pairs of control channels. Six BioSemi ActiveTwo caps of different sizes and channel numbers were used to evaluate the method. Results show that the mean prediction error is about 2 mm and is comparable with the residual uncertainty in direct position measurement using a Polhemus digitizer. PMID:23485737
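The interpolation idea can be sketched as follows. This toy version uses the average displacement of the three nearest control channels as the local transformation; the paper's actual transformation may be more elaborate, and all channel names and coordinates here are invented.

```python
from math import dist  # Euclidean distance, Python 3.8+

def predict_position(channel, reference, measured_controls):
    """Estimate a channel's 3-D position from the measured positions of
    nearby control channels: find the three control channels closest to
    the target in the reference data set, average their displacement
    between reference and measured positions, and apply that offset."""
    nearest = sorted(measured_controls,
                     key=lambda c: dist(reference[c], reference[channel]))[:3]
    offset = [sum(measured_controls[c][i] - reference[c][i] for c in nearest) / 3
              for i in range(3)]
    return tuple(r + o for r, o in zip(reference[channel], offset))

# Toy reference cap (coordinates in cm) with one non-control channel 'X1'
reference = {'Fz': (0, 8, 9), 'Cz': (0, 0, 10), 'Pz': (0, -8, 9),
             'C3': (-7, 0, 8), 'X1': (1, 2, 9)}
# Pretend the cap sits 1 cm to the right of the reference fit on this head
measured = {ch: (x + 1, y, z) for ch, (x, y, z) in reference.items()
            if ch != 'X1'}
pred = predict_position('X1', reference, measured)
```

Because only the control channels are digitized in a real session, the savings in set-up time grow with channel count, which is the report's motivation for high-density caps.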

He, Ping; Estepp, Justin R



Reducing inpatient suicide risk: using human factors analysis to improve observation practices.  


In 1995, the Joint Commission began requiring that hospitals report reviewable sentinel events as a condition of maintaining accreditation. Since then, inpatient suicide has been the second most common sentinel event reported to the Joint Commission. The Joint Commission emphasizes the need for around-the-clock observation for inpatients assessed as at high risk for suicide. However, there is sparse literature on the observation of psychiatric patients and no systematic studies or recommendations for best practices. Medical errors can best be reduced by focusing on systems improvements rather than individual provider mistakes. The author describes how failure modes and effects analysis (FMEA) was used proactively by an inpatient psychiatric treatment team to improve psychiatric observation practices by identifying and correcting potential observation process failures. Collection and implementation of observation risk reduction strategies across health care systems is needed to identify best practices and to reduce inpatient suicides. PMID:19297628

Janofsky, Jeffrey S



Three-dimensional Stress Analysis Using the Boundary Element Method  

NASA Technical Reports Server (NTRS)

The boundary element method is to be extended (as part of the NASA Inelastic Analysis Methods program) to the three-dimensional stress analysis of gas turbine engine hot section components. The analytical basis of the method (as developed in elasticity) is outlined, its numerical implementation is summarized, and the approaches to be followed in extending the method to include inelastic material response are indicated.

Wilson, R. B.; Banerjee, P. K.




Microsoft Academic Search

A general method is introduced for the frequency domain analysis of lossy Inhomogeneous Planar Layers (IPLs). In this method, the IPLs are first subdivided into several thin homogeneous layers. Then the electric and magnetic fields are obtained using a second-order finite difference method. The accuracy of the method is studied through the analysis of some special types of IPLs.

Mohammad Khalaj-Amirhosseini



Linguistically Diverse Students and Special Education: A Mixed Methods Study of Teachers' Attitudes, Coursework, and Practice  

ERIC Educational Resources Information Center

While the number of linguistically diverse students (LDS) grows steadily in the U.S., schools, research and practice to support their education lag behind (Lucas & Grinberg, 2008). Research that describes the attitudes and practices of teachers who serve LDS and how those attitudes and practice intersect with language and special education is…

Greenfield, Renee A.



Monte Carlo methods for design and analysis of radiation detectors  

Microsoft Academic Search

An overview of Monte Carlo as a practical method for designing and analyzing radiation detectors is provided. The emphasis is on detectors for radiation that is either directly or indirectly ionizing. This overview paper reviews some of the fundamental aspects of Monte Carlo, briefly addresses simulation of radiation transport by the Monte Carlo method, discusses the differences between direct and
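As a minimal example of Monte Carlo applied to detector analysis (not taken from the paper), the sketch below estimates the probability that a normally incident photon interacts within a slab detector, sampling free paths from the exponential distribution; the attenuation coefficient and thickness are illustrative.

```python
import random
from math import log, exp

random.seed(42)

def detected_fraction(mu, thickness, n=100_000):
    """Monte Carlo estimate of the fraction of normally incident photons
    interacting within a slab of given thickness (cm); mu is the linear
    attenuation coefficient (1/cm). Free path lengths are sampled from
    the exponential distribution via s = -ln(u)/mu."""
    hits = sum(1 for _ in range(n) if -log(random.random()) / mu < thickness)
    return hits / n

f = detected_fraction(mu=0.5, thickness=2.0)
# analytic answer for comparison: 1 - exp(-mu * t) = 1 - exp(-1)
```

The agreement with the closed-form attenuation law is what makes this a standard sanity check before moving to geometries where no analytic answer exists, which is where Monte Carlo earns its keep in detector design.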

William L. Dunn; J. Kenneth Shultis



Methods for analysis of fluoroquinolones in biological fluids  

Technology Transfer Automated Retrieval System (TEKTRAN)

Methods for analysis of 10 selected fluoroquinolone antibiotics in biological fluids are reviewed. Approaches for sample preparation, detection methods, limits of detection and quantitation and recovery information are provided for both single analyte and multi-analyte fluoroquinolone methods....


Statistical adjustments for brain size in volumetric neuroimaging studies: Some practical implications in methods  

PubMed Central

Volumetric magnetic resonance imaging (MRI) brain data provide a valuable tool for detecting structural differences associated with various neurological and psychiatric disorders. Analysis of such data, however, is not always straightforward, and complications can arise when trying to determine which brain structures are “smaller” or “larger” in light of the high degree of individual variability across the population. Several statistical methods for adjusting for individual differences in overall cranial or brain size have been used in the literature, but critical differences exist between them. Using agreement among those methods as an indication of stronger support of a hypothesis is dangerous given that each requires a different set of assumptions be met. Here we examine the theoretical underpinnings of three of these adjustment methods (proportion, residual, and analysis of covariance) and apply them to a volumetric MRI data set. These three methods used for adjusting for brain size are specific cases of a generalized approach which we propose as a recommended modeling strategy. We assess the level of agreement among methods and provide graphical tools to assist researchers in determining how they differ in the types of relationships they can unmask, and provide a useful method by which researchers may tease out important relationships in volumetric MRI data. We conclude with the recommended procedure involving the use of graphical analyses to help uncover potential relationships the ROI volumes may have with head size and give a generalized modeling strategy by which researchers can make such adjustments that include as special cases the three commonly employed methods mentioned above. PMID:21684724
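Of the three adjustment methods examined, the proportion and residual approaches can be sketched in a few lines; the analysis-of-covariance approach instead enters brain size as a covariate in the group-comparison model itself. A hedged illustration with invented volumes:

```python
from statistics import mean

def proportion_adjust(vols, brains):
    """Proportion method: each ROI volume as a fraction of total brain volume."""
    return [v / b for v, b in zip(vols, brains)]

def residual_adjust(vols, brains):
    """Residual method: regress ROI volume on brain size and keep the
    residuals, which are uncorrelated with brain size by construction."""
    mb, mv = mean(brains), mean(vols)
    slope = (sum((b - mb) * (v - mv) for b, v in zip(brains, vols))
             / sum((b - mb) ** 2 for b in brains))
    return [v - (mv + slope * (b - mb)) for v, b in zip(vols, brains)]

# invented ROI and total brain volumes (cm^3) for four subjects
vols = [10.0, 12.0, 11.0, 14.0]
brains = [1000.0, 1100.0, 1050.0, 1200.0]
props = proportion_adjust(vols, brains)
resids = residual_adjust(vols, brains)
```

The two adjusted quantities answer different questions (relative size versus size beyond what brain volume predicts), which is why the abstract warns against treating agreement between methods as convergent evidence.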

O’Brien, Liam M.; Ziegler, David A.; Deutsch, Curtis K.; Frazier, Jean A.; Herbert, Martha R.; Locascio, Joseph J.



Learning in the Permaculture Community of Practice in England: An Analysis of the Relationship between Core Practices and Boundary Processes  

ERIC Educational Resources Information Center

Purpose: This article utilizes the Communities of Practice (CoP) framework to examine learning processes among a group of permaculture practitioners in England, specifically examining the balance between core practices and boundary processes. Design/methodology/approach: The empirical basis of the article derives from three participatory workshops…

Ingram, Julie; Maye, Damian; Kirwan, James; Curry, Nigel; Kubinakova, Katarina



[Methods and applications of population viability analysis (PVA): a review].  


With the accelerating human consumption of natural resources, the problems associated with endangered species caused by habitat loss and fragmentation have become greater and more urgent than ever. Conceptually associated with the theories of island biogeography, population viability analysis (PVA) has been one of the most important approaches in studying and protecting endangered species, and this methodology has occupied a central place in conservation biology and ecology in the past several decades. PVA has been widely used and proven effective in many cases, but its predictive ability and accuracy are still in question. Also, its range of application needs to expand. To overcome some of the problems, we believe that PVA needs to incorporate some principles and methods from other fields, particularly landscape ecology and sustainability science. Integrating landscape pattern and socioeconomic factors into PVA will make the approach theoretically more comprehensive and practically more useful. Here, we reviewed the history, basic concepts, research methods, and modeling applications and their accuracies of PVA, and proposed future directions for the field. PMID:21548317

Tian, Yu; Wu, Jian-Guo; Kou, Xiao-Jun; Wang, Tian-Ming; Smith, Andrew T; Ge, Jian-Ping



Theoretical Analysis of Heuristic Search Methods for Online POMDPs  

PubMed Central

Planning in partially observable environments remains a challenging problem, despite significant recent advances in offline approximation techniques. A few online methods have also been proposed recently, and proven to be remarkably scalable, but without the theoretical guarantees of their offline counterparts. Thus it seems natural to try to unify offline and online techniques, preserving the theoretical properties of the former, and exploiting the scalability of the latter. In this paper, we provide theoretical guarantees on an anytime algorithm for POMDPs which aims to reduce the error made by approximate offline value iteration algorithms through the use of an efficient online searching procedure. The algorithm uses search heuristics based on an error analysis of lookahead search, to guide the online search towards reachable beliefs with the most potential to reduce error. We provide a general theorem showing that these search heuristics are admissible, and lead to complete and ε-optimal algorithms. This is, to the best of our knowledge, the strongest theoretical result available for online POMDP solution methods. We also provide empirical evidence showing that our approach is also practical, and can find (provably) near-optimal solutions in reasonable time. PMID:21625296

Ross, Stéphane; Pineau, Joelle; Chaib-draa, Brahim



Promoting recovery-oriented practice in mental health services: a quasi-experimental mixed-methods study  

PubMed Central

Background Recovery has become an increasingly prominent concept in mental health policy internationally. However, there is a lack of guidance regarding organisational transformation towards a recovery orientation. This study evaluated the implementation of recovery-orientated practice through training across a system of mental health services. Methods The intervention comprised four full-day workshops and an in-team half-day session on supporting recovery. It was offered to 383 staff in 22 multidisciplinary community and rehabilitation teams providing mental health services across two contiguous regions. A quasi-experimental design was used for evaluation, comparing behavioural intent with staff from a third contiguous region. Behavioural intent was rated by coding points of action on the care plans of a random sample of 700 patients (400 intervention, 300 control), before and three months after the intervention. Action points were coded for (a) focus of action, using predetermined categories of care; and (b) responsibility for action. Qualitative inquiry was used to explore staff understanding of recovery, implementation in services and the wider system, and the perceived impact of the intervention. Semi-structured interviews were conducted with 16 intervention group team leaders post-training and an inductive thematic analysis undertaken. Results A total of 342 (89%) staff received the intervention. Care plans of patients in the intervention group showed significantly more changes, with evidence of change in the content of patients’ care plans (OR 10.94, 95% CI 7.01-17.07) and in the attributed responsibility for the actions detailed (OR 2.95, 95% CI 1.68-5.18). Nine themes emerged from the qualitative analysis, split into two superordinate categories. ‘Recovery, individual and practice’ describes the perception and provision of recovery-orientated care by individuals and at a team level. It includes themes on care provision, the role of hope, language of recovery, ownership and multidisciplinarity. ‘Systemic implementation’ describes organizational implementation and includes themes on hierarchy and role definition, training approaches, measures of recovery and resources. Conclusions Training can provide an important mechanism for instigating change in promoting recovery-orientated practice. However, the challenge of systemically implementing recovery approaches requires further consideration of the conceptual elements of recovery, its measurement, and maximising and demonstrating organizational commitment. PMID:23764121



Method transfer, partial validation, and cross validation: recommendations for best practices and harmonization from the global bioanalysis consortium harmonization team.  


This paper presents the recommendations of the Global Bioanalytical Consortium Harmonization Team on method transfer, partial validation, and cross validation. These aspects of bioanalytical method validation, while important, have received little detailed attention in recent years. The team has attempted to define, separate, and describe these related activities, and present practical guidance in how to apply these techniques. PMID:25190270

Briggs, R J; Nicholson, R; Vazvaei, F; Busch, J; Mabuchi, M; Mahesh, K S; Brudny-Kloeppel, M; Weng, N; Galvinas, P A R; Duchene, P; Hu, Pei; Abbott, R W



Evidence into practice, experimentation and quasi experimentation: are the methods up to the task?  

PubMed Central

OBJECTIVE: Methodological review of evaluations of interventions intended to help health professionals provide more effective and efficient health care, motivated by the current experience of NHS Research and Development in England. Emphasis upon the forms of research appropriate to different stages in the development and evaluation of interventions, the use of experimental and quasi experimental designs, the methods used in systematic reviews and meta analyses. METHOD: A proposed development process is derived from that used in the development of drugs. The strengths and weaknesses of different experimental and quasi experimental designs are derived from published methodological literature and first principles. Examples are drawn from the literature. RESULTS: Like pharmaceuticals, implementation interventions need to go through several stages of development before they are evaluated in designed experiments. Where there are practical reasons that make random allocation impossible in quantitative evaluations, quasi experimental methods may provide useful information, although these studies are open to bias. It is rare for a single study to provide a complete answer to important questions, and systematic reviews of all available studies should be undertaken. Meta analytic techniques go some way towards countering the low power of many existing studies, reduce the risk of bias, and avoid the subjective approaches that may be found in narrative reviews. CONCLUSIONS: The initiative taken by NHS Research and Development in examining methods to promote the uptake of research findings is welcome, but will only prove helpful if careful attention is paid to the different stages of the development process, and different research approaches are used appropriately at different stages.   PMID:9578853
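The inverse-variance pooling at the heart of the meta-analytic techniques mentioned above can be sketched as follows; the study effects and variances are invented for illustration.

```python
def fixed_effect_pool(effects, variances):
    """Fixed-effect meta-analysis by inverse-variance weighting.
    Each study's weight is 1/variance; the pooled estimate is the
    weighted mean and the pooled variance is the reciprocal of the
    total weight."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return pooled, 1.0 / sum(weights)

# three hypothetical studies of the same intervention effect
effects = [0.30, 0.10, 0.25]
variances = [0.04, 0.01, 0.02]
est, var = fixed_effect_pool(effects, variances)
```

The pooled variance is smaller than that of any single study, which is the mechanism by which meta-analysis counters the low power of individual evaluations described in the abstract.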

Freemantle, N.; Wood, J.; Crawford, F.




E-print Network

FORMAL METHODS FOR CRYPTOGRAPHIC PROTOCOL ANALYSIS. The application of formal methods to cryptographic protocol analysis spans over twenty years, and recently has been showing signs of maturity. The application of formal methods to cryptographic protocols began with the analysis of key distribution protocols for communication between two parties […]


The solution of fractional wave equation by using modified trial equation method and homotopy analysis method  

NASA Astrophysics Data System (ADS)

In this study, we applied the Homotopy Analysis method to the nonlinear fractional wave equation. We then compared the analytical solution obtained by the Modified Trial Equation method with the approximate solution obtained by the Homotopy Analysis method. Finally, we investigated the errors by drawing 3D and 2D graphs of the solution of the fractional wave equation for different values of α.

Kocak, Zeynep Fidan; Bulut, Hasan; Yel, Gulnur



Nurses’ self-efficacy and practices relating to weight management of adult patients: a path analysis  

PubMed Central

Background Health professionals play a key role in the prevention and treatment of excess weight and obesity, but many have expressed a lack of confidence in their ability to manage obese patients, with their delivery of weight-management care remaining limited. The specific mechanism underlying inadequate practices in professional weight management remains unclear. The primary purpose of this study was to examine a self-efficacy theory-based model in understanding Registered Nurses’ (RNs) professional performance relating to weight management. Methods A self-report questionnaire was developed based upon the hypothesized model and administered to a convenience sample of 588 RNs. Data were collected regarding socio-demographic variables, psychosocial variables (attitudes towards obese people, professional role identity, teamwork beliefs, perceived skills, perceived barriers and self-efficacy) and professional weight management practices. Structural equation modeling was conducted to identify correlations between the above variables and to test the goodness of fit of the proposed model. Results The survey response rate was 71.4% (n = 420). The respondents reported a moderate level of weight management practices. Self-efficacy directly and positively predicted the weight management practices of the RNs (β = 0.36, p < […]) […] practices. The final model constructed in this study demonstrated a good fit to the data [χ²(14) = 13.90, p = 0.46; GFI = 0.99; AGFI = 0.98; NNFI = 1.00; CFI = 1.00; RMSEA = 0.00; AIC = 57.90], accounting for 38.4% and 43.2% of the variance in weight management practices and self-efficacy, respectively. Conclusions Self-efficacy theory appears to be useful in understanding the weight management practices of RNs. Interventions targeting the enhancement of self-efficacy may be effective in promoting RNs’ professional performance in managing overweight and obese patients. PMID:24304903
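The reported fit statistics can be cross-checked from the abstract's own numbers; for instance, the RMSEA point estimate follows directly from the model chi-square, its degrees of freedom, and the sample size. A minimal sketch using the reported values (χ²(14) = 13.90, n = 420):

```python
import math

def rmsea(chi2, df, n):
    """RMSEA point estimate from the model chi-square, its degrees of
    freedom, and the sample size; the estimate floors at zero when the
    chi-square is below its degrees of freedom."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Values reported in the abstract: chi2(14) = 13.90 with n = 420.
fit = rmsea(13.90, 14, 420)   # chi2 < df here, so the estimate is exactly 0.0
```

Because the reported chi-square (13.90) is smaller than its 14 degrees of freedom, the estimate floors at zero, which is consistent with the RMSEA = 0.00 quoted in the abstract.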



Randomized Comparison of 3 Methods to Screen for Domestic Violence in Family Practice  

PubMed Central

PURPOSE We undertook a study to compare 3 ways of administering brief domestic violence screening questionnaires: self-administered questionnaire, medical staff interview, and physician interview. METHODS We conducted a randomized trial of 3 screening protocols for domestic violence in 4 urban family medicine practices with mostly minority patients. We randomly assigned 523 female patients, aged 18 years or older and currently involved with a partner, to 1 of 3 screening protocols. Each included 2 brief screening tools: HITS and WAST-Short. Outcome measures were domestic violence disclosure, patient and clinician comfort with the screening, and time spent screening. RESULTS Overall prevalence of domestic violence was 14%. Most patients (93.4%) and clinicians (84.5%) were comfortable with the screening questions and method of administering them. Average time spent screening was 4.4 minutes. Disclosure rates, patient and clinician comfort with screening, and time spent screening were similar among the 3 protocols. In addition, WAST-Short was validated in this sample of minority women by comparison with HITS and with the 8-item WAST. CONCLUSIONS Domestic violence is common, and we found that most patients and clinicians are comfortable with domestic violence screening in urban family medicine settings. Patient self-administered domestic violence screening is as effective as clinician interview in terms of disclosure, comfort, and time spent screening. PMID:17893385

Chen, Ping-Hsin; Rovi, Sue; Washington, Judy; Jacobs, Abbie; Vega, Marielos; Pan, Ko-Yu; Johnson, Mark S.



Practical methods for using vegetation patterns to estimate avalanche frequency and magnitude  

NASA Astrophysics Data System (ADS)

Practitioners working in avalanche terrain may never witness an extreme event, but understanding extreme events is important for categorizing avalanches that occur within a given season. Historical records of avalanche incidents and direct observations are the most reliable evidence of avalanche activity, but patterns in vegetation can be used to further quantify and map the frequency and magnitude of past events. We surveyed published literature to synthesize approaches for using vegetation sampling to characterize avalanche terrain, and developed examples to identify the benefits and caveats of using different practical field methods to estimate avalanche frequency and magnitude. Powerful avalanches can deposit massive piles of snow, rocks, and woody debris in runout zones. Large avalanches (relative to the path) can cut fresh trimlines, widening their tracks by uprooting, stripping, and breaking trees. Discs and cores can be collected from downed trees to detect signals of past avalanche disturbance recorded in woody plant tissue. Signals of disturbance events recorded in tree rings can include direct impact scars from the moving snow and wind blast, development of reaction wood in response to tilting, and abrupt variation in the relative width of annual growth rings. The relative ages of trees in avalanche paths and the surrounding landscape can be an indicator of the area impacted by past avalanches. Repeat photography can also be useful to track changes in vegetation over time. For Colorado, and perhaps elsewhere, several vegetation ecology methods can be used in combination to accurately characterize local avalanche frequency and magnitude.

Simonson, S.; Fassnacht, S. R.



Practical method for appearance match between soft copy and hard copy  

NASA Astrophysics Data System (ADS)

CRT monitors are often used as soft proofing devices for hard copy image output. However, what the user sees on the monitor does not match the output, even if the monitor and the output device are calibrated with CIE/XYZ or CIE/Lab. This is especially obvious when the correlated color temperature (CCT) of the CRT monitor's white point differs significantly from the ambient light. In a typical office environment, one uses a computer graphics monitor with a CCT of 9300 K in a room lit by white fluorescent light of 4150 K CCT. In such a case, the human visual system is partially adapted to the CRT monitor's white point and partially to the ambient light. Visual experiments were performed on the effect of the ambient lighting. A practical method for soft copy color reproduction that matches the hard copy image in appearance is presented in this paper. The method is fundamentally based on a simple von Kries adaptation model and takes into account the human visual system's partial adaptation and contrast matching.
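The von Kries model the abstract builds on is a diagonal gain in cone-response space. A minimal sketch follows; the XYZ-to-LMS matrix (Hunt-Pointer-Estevez) and the linear blend used for partial adaptation are illustrative assumptions, not Katoh's exact matrix or fitted adaptation ratio.

```python
import numpy as np

# An XYZ-to-cone-response matrix (Hunt-Pointer-Estevez); whether the paper
# used exactly this matrix is an assumption for illustration.
XYZ_TO_LMS = np.array([[ 0.4002, 0.7076, -0.0808],
                       [-0.2263, 1.1653,  0.0457],
                       [ 0.0,    0.0,     0.9182]])

def von_kries_adapt(xyz, white_src, white_dst):
    """Simple von Kries transform: scale each cone response by the ratio of
    the destination and source adapting whites, then return to XYZ."""
    gain = (XYZ_TO_LMS @ white_dst) / (XYZ_TO_LMS @ white_src)
    return np.linalg.solve(XYZ_TO_LMS, gain * (XYZ_TO_LMS @ xyz))

def effective_white(monitor_white, ambient_white, adaptation_ratio):
    """Partial adaptation: blend the monitor and ambient whites. The linear
    blend and the ratio parameter are illustrative assumptions."""
    return adaptation_ratio * monitor_white + (1.0 - adaptation_ratio) * ambient_white
```

Under full adaptation the transform maps the source white exactly onto the destination white; partial adaptation replaces the monitor white with a blend of monitor and ambient whites before applying the same transform.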

Katoh, Naoya



Multi-Spacecraft Turbulence Analysis Methods  

NASA Astrophysics Data System (ADS)

Turbulence is ubiquitous in space plasmas, from the solar wind to supernova remnants, and on scales from the electron gyroradius to interstellar separations. Turbulence is responsible for transporting energy across space and between scales and plays a key role in plasma heating, particle acceleration and thermalisation downstream of shocks. Just as with other plasma processes such as shocks or reconnection, turbulence results in complex, structured and time-varying behaviour which is hard to measure with a single spacecraft. However, turbulence is a particularly hard phenomenon to study because it is usually broadband in nature: it covers many scales simultaneously. One must therefore use techniques to extract information on multiple scales in order to quantify plasma turbulence and its effects. The Cluster orbit takes the spacecraft through turbulent regions with a range of characteristics: the solar wind, magnetosheath, cusp and magnetosphere. In each, the nature of the turbulence (strongly driven or fully evolved; dominated by kinetic effects or largely on fluid scales), as well as characteristics of the medium (thermalised or not; high or low plasma β; sub- or super-Alfvénic) mean that particular techniques are better suited to the analysis of Cluster data in different locations. In this chapter, we consider a range of methods and how they are best applied to these different regions. Perhaps the most studied turbulent space plasma environment is the solar wind, see Bruno and Carbone [2005]; Goldstein et al. [2005] for recent reviews. This is the case for a number of reasons: it is scientifically important for cosmic ray and solar energetic particle scattering and propagation, for example. 
However, perhaps the most significant motivations for studying solar wind turbulence are pragmatic: large volumes of high quality measurements are available; the stability of the solar wind on the scales of hours makes it possible to identify statistically stationary intervals to analyse; and, most important of all, the solar wind speed, V_SW, is much higher than the local MHD wave speeds. This means that a spacecraft time series is essentially a "snapshot" spatial sample of the plasma along the flow direction, so we can consider measurements at a set of times t_i to be at a set of locations in the plasma given by x_i = V_SW t_i. This approximation, known as Taylor's hypothesis, greatly simplifies the analysis of the data. In contrast, in the magnetosheath the flow speed is lower than the wave speed and therefore temporal changes at the spacecraft are due to a complex combination of the plasma moving over the spacecraft and the turbulent fluctuations propagating in the plasma frame. This is also the case for ion and electron kinetic scale turbulence in the solar wind and dramatically complicates the analysis of the data. As a result, the application of multi-spacecraft techniques such as k filtering to Cluster data (see Chapter 5), which make it possible to disentangle the effects of flow and wave propagation, have probably resulted in the greatest increase in our understanding of magnetosheath turbulence rather than of the solar wind. We can therefore summarise the key advantages for plasma turbulence analysis of multi-spacecraft data sets such as those from Cluster, compared to single spacecraft data. Multiple sampling points allow us to measure how the turbulence varies in many directions, and on a range of scales, simultaneously, enabling the study of anisotropy in ways that have not previously been possible. 
They also allow us to distinguish between the motion of fluctuations in the plasma and motion of the plasma itself, enabling the study of turbulence in highly disturbed environments such as the magnetosheath. A number of authors have studied turbulence with Cluster data, using different techniques, the choice of which is motivated by the characteristics of the plasma environment in which they are interested. The complexity of both the Cluster data and the problem of turbulence meant that progress early in the mission was rat
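The Taylor's hypothesis mapping described above is a one-line change of variables. A minimal sketch, with an illustrative solar wind speed of 400 km/s (the specific numbers are assumptions, not values from the chapter):

```python
import numpy as np

V_SW = 400.0   # solar wind speed in km/s (illustrative value)

def taylor_positions(times_s, v_sw=V_SW):
    """Taylor's hypothesis: because V_SW far exceeds the local MHD wave
    speeds, a sample taken at time t_i corresponds to plasma-frame
    position x_i = V_SW * t_i along the flow direction."""
    return v_sw * np.asarray(times_s, float)

def freq_to_wavenumber(f_hz, v_sw=V_SW):
    """The same mapping in Fourier space: a spacecraft-frame frequency f
    corresponds to a spatial wavenumber k = 2*pi*f / V_SW (rad/km)."""
    return 2.0 * np.pi * f_hz / v_sw

# One hour of 1 s samples becomes a 1.44 million km spatial cut of the plasma.
x = taylor_positions(np.arange(0.0, 3600.0))
```

This is exactly why the approximation fails in the magnetosheath: when the flow speed no longer dominates the wave speeds, a time series mixes advection with wave propagation and the simple x = V t mapping breaks down.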

Horbury, Tim S.; Osman, Kareem T.


Vitamin D Status of Clinical Practice Populations at Higher Latitudes: Analysis and Applications  

PubMed Central

Background: Inadequate levels of vitamin D (VTD) throughout the life cycle from the fetal stage to adulthood have been correlated with elevated risk for assorted health afflictions. The purpose of this study was to ascertain VTD status and associated determinants in three clinical practice populations living in Edmonton, Alberta, Canada - a locale with latitude of 53°30’N, where sun exposure from October through March is often inadequate to generate sufficient vitamin D. Methods: To determine VTD status, 1,433 patients from three independent medical offices in Edmonton had levels drawn for 25(OH)D as part of their medical assessment between Jun 2001 and Mar 2007. The relationship between demographic data and lifestyle parameters with VTD status was explored. 25(OH)D levels were categorized as follows: (1) Deficient: <40 nmol/L; (2) Insufficient (moderate to mild): 40 to <80 nmol/L; and (3) Adequate: 80–250 nmol/L. Any cases <25 nmol/L were subcategorized as severely deficient for purposes of further analysis. Results: 240 (16.75% of the total sample) of 1,433 patients were found to be VTD ‘deficient’ of which 48 (3.35% of the overall sample) had levels consistent with severe deficiency. 738 (51.5% of the overall sample) had ‘insufficiency’ (moderate to mild) while only 31.75% had ‘adequate’ 25(OH)D levels. The overall mean for 25(OH) D was 68.3 with SD=28.95. VTD status was significantly linked with demographic and lifestyle parameters including skin tone, fish consumption, milk intake, sun exposure, tanning bed use and nutritional supplementation. Conclusion: A high prevalence of hypovitaminosis-D was found in three clinical practice populations living in Edmonton. In view of the potential health sequelae associated with widespread VTD inadequacy, strategies to facilitate translation of emerging epidemiological information into clinical intervention need to be considered in order to address this public health issue. 
A suggested VTD supplemental intake level is presented for consideration. PMID:19440275
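The 25(OH)D categories used in the study translate directly into code. A minimal sketch using the abstract's cut-offs; the label for values above 250 nmol/L is an assumption, since the abstract only defines "adequate" up to 250 nmol/L.

```python
def categorize_25ohd(level_nmol_l):
    """Classify a serum 25(OH)D level (nmol/L) using the cut-offs from the
    abstract: <25 severely deficient, <40 deficient, <80 insufficient,
    80-250 adequate. The label above 250 is an assumed placeholder."""
    if level_nmol_l < 25:
        return "severely deficient"
    if level_nmol_l < 40:
        return "deficient"
    if level_nmol_l < 80:
        return "insufficient"
    if level_nmol_l <= 250:
        return "adequate"
    return "above reference range"
```

Applied to the cohort's mean of 68.3 nmol/L, this classifier returns "insufficient", matching the abstract's finding that insufficiency was the most common category.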

Genuis, Stephen J.; Schwalfenberg, Gerry K.; Hiltz, Michelle N.; Vaselenak, Sharon A.



Method Development for Analysis of Aspirin Tablets.  

ERIC Educational Resources Information Center

Develops a lab experiment for introductory instrumental analysis that requires interference studies and optimizing of conditions. Notes the analysis of the aspirin is by visible spectrophotometric assay. Gives experimental details and discussion. (MVL)

Street, Kenneth W., Jr.



Analysis methods for Atmospheric Cerenkov Telescopes  

E-print Network

Three different analysis techniques for atmospheric Cherenkov imaging systems are presented. The classical Hillas-parameter-based technique is shown to be robust and efficient, but more elaborate techniques can improve the sensitivity of the analysis. A comparison of the different analysis techniques shows that they use different information for gamma-hadron separation, and that it is possible to combine their qualities.
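The classical Hillas parameterisation reduces a camera image to amplitude-weighted moments. A minimal sketch of the two central parameters, length and width, from second moments (pixel coordinates and amplitudes here are invented; real analyses add many more parameters and cleaning steps):

```python
import numpy as np

def hillas_length_width(x, y, amp):
    """Hillas-style second moments of a Cherenkov camera image: the RMS
    spread of the amplitude-weighted pixel positions along the image's
    major axis (length) and minor axis (width)."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    w = np.asarray(amp, float)
    cov = np.cov(np.vstack([x, y]), aweights=w, bias=True)
    # Clip tiny negative eigenvalues from floating-point noise before sqrt.
    eig = np.sort(np.maximum(np.linalg.eigvalsh(cov), 0.0))
    return np.sqrt(eig[1]), np.sqrt(eig[0])   # (length, width)

# A perfectly linear image has zero width.
length, width = hillas_length_width([-1.0, 0.0, 1.0], [0.0, 0.0, 0.0], [1.0, 1.0, 1.0])
```

Gamma-ray images tend to be narrow ellipses while hadronic showers are broader and more irregular, which is why these moment ratios carry separation power.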

Mathieu de Naurois



Reduction methods for MEMS nonlinear dynamic analysis  

Microsoft Academic Search

Practical MEMS applications feature non-linear effects that are important to simulate realistically. This typically involves large dynamic non-linear finite element (FE) models, and therefore efficient model reduction techniques are of great need. Proper Orthogonal Decomposition (POD) is a well-known technique for the effective order reduction of large dynamic (non-linear) systems. POD does not require any knowledge of the system at hand […]
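POD as described reduces to a singular value decomposition of a snapshot matrix. A minimal sketch, with synthetic snapshot data and an illustrative energy threshold (both assumptions, not values from the paper):

```python
import numpy as np

def pod_basis(snapshots, energy=0.999):
    """Proper Orthogonal Decomposition via SVD of a snapshot matrix
    (rows = degrees of freedom, columns = time instants). Keep enough
    left singular vectors to capture the requested fraction of energy."""
    u, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cum = np.cumsum(s ** 2) / np.sum(s ** 2)
    r = int(np.searchsorted(cum, energy)) + 1
    return u[:, :r]

# Synthetic snapshots: a single spatial shape modulated in time (rank 1),
# so one POD mode captures all the energy.
rng = np.random.default_rng(0)
snaps = np.outer(np.sin(np.linspace(0, np.pi, 50)), rng.standard_normal(200))
phi = pod_basis(snaps)
q = phi.T @ snaps          # reduced coordinates (r x 200)
```

The reduced model then evolves the few coordinates q instead of all FE degrees of freedom; this is the "no knowledge of the system" property the abstract highlights, since the basis comes purely from observed snapshots.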

Paolo Tiso; Daniel J. Rixen


Women's Access and Provider Practices for the Case Management of Malaria during Pregnancy: A Systematic Review and Meta-Analysis  

PubMed Central

Background WHO recommends prompt diagnosis and quinine plus clindamycin for treatment of uncomplicated malaria in the first trimester and artemisinin-based combination therapies in subsequent trimesters. We undertook a systematic review of women's access to and healthcare provider adherence to WHO case management policy for malaria in pregnant women. Methods and Findings We searched the Malaria in Pregnancy Library, the Global Health Database, and the International Network for the Rational Use of Drugs Bibliography from 1 January 2006 to 3 April 2014, without language restriction. Data were appraised for quality and content. Frequencies of women's and healthcare providers' practices were explored using narrative synthesis and random effect meta-analysis. Barriers to women's access and providers' adherence to policy were explored by content analysis using NVivo. Determinants of women's access and providers' case management practices were extracted and compared across studies. We did not perform a meta-ethnography. Thirty-seven studies were included, conducted in Africa (30), Asia (4), Yemen (1), and Brazil (2). One- to three-quarters of women reported malaria episodes during pregnancy, of whom >85% sought treatment. Barriers to access among women included poor knowledge of drug safety, prohibitive costs, and self-treatment practices, used by 5%–40% of women. Determinants of women's treatment-seeking behaviour were education and previous experience of miscarriage and antenatal care. Healthcare provider reliance on clinical diagnosis and poor adherence to treatment policy, especially in the first trimester versus other trimesters (28%, 95% CI 14%–47%, versus 72%, 95% CI 39%–91%, p = 0.02), was consistently reported. Prescribing practices were driven by concerns over side effects and drug safety, patient preference, drug availability, and cost. Determinants of provider practices were access to training and facility type (public versus private). 
Findings were limited by the availability, quality, scope, and methodological inconsistencies of the included studies. Conclusions A systematic assessment of the extent of substandard case management practices of malaria in pregnancy is required, as well as quality improvement interventions that reach all providers administering antimalarial drugs in the community. Pregnant women need access to information on which anti-malarial drugs are safe to use at different stages of pregnancy. Please see later in the article for the Editors' Summary PMID:25093720

Hill, Jenny; D'Mello-Guyett, Lauren; Hoyt, Jenna; van Eijk, Anna M.; ter Kuile, Feiko O.; Webster, Jayne



Analysis of diffraction characteristics of photopolymers by using the FDTD method  

NASA Astrophysics Data System (ADS)

In holographic memories, photopolymer is a hopeful material as a recording medium. To use a photopolymer for holographic memories as practical recording media, it is necessary to clarify the design condition of recording/reproduction characteristics. The coupled-wave analysis (CWA) and the rigorous coupled-wave analysis (RCWA) are widespread methods to analyze diffraction characteristics of volume holographic gratings. However, holographic grating is more complex than simple grating that is presumed in CWA and RCWA. In this study, we analyzed the index change of photopolymer based on a diffusion model and clarified the diffraction characteristics by using the finite-difference time-domain (FDTD) method.
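The FDTD method used for the grating analysis can be illustrated in its simplest form. The 1D free-space sketch below shows only the leapfrog field updates, not the photopolymer grating geometry or the diffusion-based index model from the paper; grid sizes, the source, and the normalized units are all illustrative assumptions.

```python
import numpy as np

def fdtd_1d(nx=200, nt=300, src=50):
    """Minimal 1D FDTD leapfrog update for Ez/Hy in free space, in
    normalized units with the magic time step c*dt = dx. Boundaries are
    simple PEC walls (Ez pinned to zero), so pulses reflect there."""
    ez = np.zeros(nx)
    hy = np.zeros(nx - 1)
    for n in range(nt):
        hy += np.diff(ez)                               # H update from curl of E
        ez[1:-1] += np.diff(hy)                         # E update from curl of H
        ez[src] += np.exp(-((n - 30.0) / 10.0) ** 2)    # soft Gaussian source
    return ez

fields = fdtd_1d()
```

A grating analysis replaces the uniform medium with a spatially varying permittivity in the E-update and records transmitted and diffracted field amplitudes; that is the part where FDTD goes beyond what coupled-wave analysis assumes about the grating profile.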

Shimada, K.; Yoshida, S.; Yoshida, N.; Yamamoto, M.




E-print Network

MONTE CARLO ANALYSIS: ESTIMATING GPP WITH THE CANOPY CONDUCTANCE METHOD. 1. Overview. A novel method […] We performed a Monte Carlo Analysis to investigate the power of our statistical approach: i.e., what […] and Assumptions. The Monte Carlo Analysis was performed as follows: Natural variation. The only study to date […]

DeLucia, Evan H.


Numerical analysis of an aeroacoustic field using Complex Variable Method  

E-print Network

Numerical analysis of an aeroacoustic field using Complex Variable Method (E. Gaudard, R. […]). An advantage of this method is that the sensitivity analysis is performed during the numerical simulation by investigating […] and characterization of acoustic sources in flows. Analysis of aeroacoustic noise generation and propagation often […]

Boyer, Edmond



EPA Science Inventory

Three different methods of analysis of panels were compared using asthma panel data from a 1970-1971 study done by EPA in Riverhead, New York. The methods were (1) regression analysis using raw attack rates; (2) regression analysis using the ratio of observed attacks to expected ...


Comparison of homotopy analysis method and homotopy perturbation method  

E-print Network

The homotopy analysis method (HAM) proposed by Liao in 1992 and the homotopy perturbation method (HPM) proposed by He in 1998 are compared. It is shown that the HPM is a special case of the HAM when the convergence-control parameter ħ = -1. However, the HPM solution is divergent for all x […] (Figure: hollow symbols, 15th-order approximation given by the HPM; filled symbols, 15th-order approximation given by the HAM.)

Jeffrey, David


Methods of Phylogenetic Analysis: New Improvements on Old Methods.  

E-print Network

[…] morphological characteristics to molecular sequences (reviewed in Mount, 2001). The result is a tree composed […] These methods fall into two classifications: (1) those that use an algorithm to directly build a tree through a series of defined steps, and (2) those that define an optimality criterion and use an algorithm to evaluate potential trees based on this criterion (Swofford et al., 1996). The first class […]



EPA Science Inventory

A comprehensive annotated bibliography of analytical methods for 67 of the chemicals on the Environmental Protection Agency's Hazardous Substances List is presented. Literature references were selected and abstracts of analytical methods were compiled to facilitate rapid and accu...


Implementing a Virtual Community of Practice for Family Physician Training: A Mixed-Methods Case Study  

PubMed Central

Background GP training in Australia can be professionally isolating, with trainees spread across large geographic areas, leading to problems with rural workforce retention. Virtual communities of practice (VCoPs) may provide a way of improving knowledge sharing and thus reducing professional isolation. Objective The goal of our study was to review the usefulness of a 7-step framework for implementing a VCoP for general practitioner (GP) training and then to evaluate the usefulness of the resulting VCoP in facilitating knowledge sharing and reducing professional isolation. Methods The case was set in an Australian general practice training region involving 55 first-term trainees (GPT1s), from January to July 2012. ConnectGPR was a secure, online community site that included standard community options such as discussion forums, blogs, newsletter broadcasts, webchats, and photo sharing. A mixed-methods case study methodology was used. Results are presented and interpreted for each step of the VCoP 7-step framework and then in terms of the outcomes of knowledge sharing and overcoming isolation. Results Step 1, Facilitation: Regular, personal facilitation by a group of GP trainers with a co-ordinating facilitator was an important factor in the success of ConnectGPR. Step 2, Champion and Support: Leadership and stakeholder engagement were vital. Further benefits are possible if the site is recognized as contributing to training time. Step 3, Clear Goals: Clear goals of facilitating knowledge sharing and improving connectedness helped to keep the site discussions focused. Step 4, A Broad Church: The ConnectGPR community was too narrow, focusing only on first-term trainees (GPT1s). Ideally there should be more involvement of senior trainees, trainers, and specialists. Step 5, A Supportive Environment: Facilitators maintained community standards and encouraged participation. 
Step 6, Measurement Benchmarking and Feedback: Site activity was primarily driven by centrally generated newsletter feedback. Viewing comments by other participants helped users benchmark their own knowledge, particularly around applying guidelines. Step 7, Technology and Community: All the community tools were useful, but chat was limited and users suggested webinars in future. A larger user base and more training may also be helpful. Time is a common barrier. Trust can be built online, which may have benefit for trainees that cannot attend face-to-face workshops. Knowledge sharing and isolation outcomes: 28/34 (82%) of the eligible GPT1s enrolled on ConnectGPR. Trainees shared knowledge through online chat, forums, and shared photos. In terms of knowledge needs, GPT1s rated their need for cardiovascular knowledge more highly than supervisors. Isolation was a common theme among interview respondents, and ConnectGPR users felt more supported in their general practice (13/14, 92.9%). Conclusions The 7-step framework for implementation of an online community was useful. Overcoming isolation and improving connectedness through an online knowledge sharing community shows promise in GP training. Time and technology are barriers that may be overcome by training, technology, and valuable content. In a VCoP, trust can be built online. This has implications for course delivery, particularly in regional areas. VCoPs may also have a specific role assisting overseas trained doctors to interpret their medical knowledge in a new context. PMID:24622292

Jones, Sandra C; Caton, Tim; Iverson, Don; Bennett, Sue; Robinson, Laura



An analysis of the rainfall time structure by box counting—some practical implications  

NASA Astrophysics Data System (ADS)

The scale-invariant behavior of the rainfall time structure was investigated by applying the box counting method to rainfall time series. Two years of minute observations, 90 years of daily observations and 170 years of monthly observations were analyzed and the results were interpreted and related to physical properties of the rainfall process. This paper discusses the question of whether an hypothesis of temporal scale invariance is valid for rainfall and the possibilities of using it in practical hydrology.
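The box counting procedure applied to the rainfall series can be sketched directly: cover the record with boxes of increasing size, count the boxes containing rain, and estimate the fractal dimension from the log-log slope. The synthetic record below is an illustration, not the paper's data.

```python
import numpy as np

def box_counting_dimension(occupied, box_sizes):
    """Box counting on a binary time series (True = rain observed in that
    time step). N(s) is the number of boxes of size s containing at least
    one rainy step; the fractal dimension is the slope of log N(s)
    against log(1/s)."""
    t = np.asarray(occupied, dtype=bool)
    counts = []
    for s in box_sizes:
        n_boxes = len(t) // s
        boxes = t[: n_boxes * s].reshape(n_boxes, s)
        counts.append(int(boxes.any(axis=1).sum()))
    slope = np.polyfit(np.log(1.0 / np.asarray(box_sizes, float)),
                       np.log(counts), 1)[0]
    return slope, counts

# A record that rains in every time step fills every box, so the dimension is 1.
dim, counts = box_counting_dimension(np.ones(1024, dtype=bool), [1, 2, 4, 8, 16])
```

Real rainfall records give dimensions below 1, reflecting the clustered, intermittent occupation of time that the scale-invariance hypothesis describes.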

Olsson, Jonas; Niemczynowicz, Janusz; Berndtsson, Ronny; Larson, Magnus



Strategies and Practices in Off-Label Marketing of Pharmaceuticals: A Retrospective Analysis of Whistleblower Complaints  

PubMed Central

Background Despite regulatory restrictions, off-label marketing of pharmaceutical products has been common in the US. However, the scope of off-label marketing remains poorly characterized. We developed a typology for the strategies and practices that constitute off-label marketing. Methods and Findings We obtained unsealed whistleblower complaints against pharmaceutical companies filed in US federal fraud cases that contained allegations of off-label marketing (January 1996–October 2010) and conducted structured reviews of them. We coded and analyzed the strategic goals of each off-label marketing scheme and the practices used to achieve those goals, as reported by the whistleblowers. We identified 41 complaints arising from 18 unique cases for our analytic sample (leading to US$7.9 billion in recoveries). The off-label marketing schemes described in the complaints had three non–mutually exclusive goals: expansions to unapproved diseases (35/41, 85%), unapproved disease subtypes (22/41, 54%), and unapproved drug doses (14/41, 34%). Manufacturers were alleged to have pursued these goals using four non–mutually exclusive types of marketing practices: prescriber-related (41/41, 100%), business-related (37/41, 90%), payer-related (23/41, 56%), and consumer-related (18/41, 44%). Prescriber-related practices, the centerpiece of company strategies, included self-serving presentations of the literature (31/41, 76%), free samples (8/41, 20%), direct financial incentives to physicians (35/41, 85%), and teaching (22/41, 54%) and research activities (8/41, 20%). Conclusions Off-label marketing practices appear to extend to many areas of the health care system. Unfortunately, the most common alleged off-label marketing practices also appear to be the most difficult to control through external regulatory approaches. Please see later in the article for the Editors' Summary PMID:21483716

Kesselheim, Aaron S.; Mello, Michelle M.; Studdert, David M.



Statistical methods for dealing with publication bias in meta-analysis.  


Publication bias is an inevitable problem in systematic reviews and meta-analyses, and one of the main threats to their validity. Although several statistical methods have been developed to detect and adjust for publication bias since the early 1980s, some of them are not well known and are not being used properly in either the statistical or the clinical literature. In this paper, we provide a critical and extensive discussion of methods for dealing with publication bias, including statistical principles, implementation, and software, as well as the advantages and limitations of these methods. We illustrate a practical application of these methods in a meta-analysis of continuous support for women during childbirth. Copyright © 2014 John Wiley & Sons, Ltd. PMID:25363575
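Among the detection methods of the kind this paper discusses, regression-based funnel-plot asymmetry tests are a standard choice; a minimal sketch of Egger's test follows (whether this exact variant is covered in the paper is an assumption, and the study data below are invented).

```python
import numpy as np

def egger_test(effects, std_errors):
    """Egger's regression asymmetry test: regress each study's standard
    normal deviate (effect / SE) on its precision (1 / SE). An intercept
    far from zero signals funnel-plot asymmetry, one marker of possible
    publication bias."""
    se = np.asarray(std_errors, float)
    y = np.asarray(effects, float) / se
    x = 1.0 / se
    slope, intercept = np.polyfit(x, y, 1)
    return intercept, slope

# Perfectly symmetric studies (same true effect at every precision)
# give a zero intercept.
intercept, slope = egger_test([0.5] * 5, [0.1, 0.2, 0.3, 0.4, 0.5])
```

A full analysis would add a significance test on the intercept and complementary methods (e.g. trim-and-fill) rather than relying on a single statistic.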

Jin, Zhi-Chao; Zhou, Xiao-Hua; He, Jia



A practical method for quantification of phosphorus- and glycogen-accumulating organism populations in activated sludge systems.  


Enhanced biological phosphorus removal (EBPR) from wastewater relies on the enrichment of activated sludge with phosphorus-accumulating organisms (PAOs). The presence and proliferation of glycogen-accumulating organisms (GAOs), which compete for substrate with PAOs, may be detrimental for EBPR systems, leading to deterioration and, in extreme cases, failure of the process. Therefore, from both process evaluation and modeling perspectives, the estimation of PAO and GAO populations in activated sludge systems is a relevant issue. A simple method for the quantification of PAO and GAO population fractions in activated sludge systems is presented in this paper. To develop such a method, the activity observed in anaerobic batch tests executed with different PAO/GAO ratios, by mixing highly enriched PAO and GAO cultures, was studied. Strong correlations between PAO/GAO population ratios and biomass activity were observed (R2 > 0.97). This served as a basis for the proposal of a simple and practical method to quantify the PAO and GAO populations in activated sludge systems, based on commonly measured and reliable analytical parameters (i.e., mixed liquor suspended solids, acetate, and orthophosphate) without requiring molecular techniques. This method relies on the estimation of the total active biomass population under anaerobic conditions (PAO plus GAO populations), by measuring the maximum acetate uptake rate in the presence of excess acetate. Later, the PAO and GAO populations present in the activated sludge system can be estimated, by taking into account the PAO/GAO ratio calculated on the basis of the anaerobic phosphorus release-to-acetate consumed ratio. The proposed method was evaluated using activated sludge from municipal wastewater treatment plants. 
The results of the quantification performed following the proposed method were compared with direct population estimations carried out with fluorescence in situ hybridization analysis (determining Candidatus Accumulibacter phosphatis as PAO and Candidatus Competibacter phosphatis as GAO). The method proved potentially suitable for estimating the PAO and GAO fractions of the total PAO-GAO biomass. It could be used not only to evaluate the performance of EBPR systems, but also to calibrate activated sludge mathematical models that account for PAO-GAO coexistence. PMID:18198694
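The core of the method, splitting the anaerobically active biomass into PAO and GAO fractions from the anaerobic P-released to acetate-consumed ratio, can be sketched as a linear interpolation between pure-culture endpoints. The endpoint ratios below are illustrative assumptions, not the calibrated values from the paper.

```python
def pao_fraction(p_per_hac, pao_endpoint=0.50, gao_endpoint=0.0):
    """Estimate the PAO share of the anaerobically active (PAO + GAO)
    biomass by linearly interpolating the measured anaerobic P-released
    to acetate-consumed ratio between pure-culture endpoint ratios
    (P-mol/C-mol). Endpoint values here are illustrative assumptions."""
    frac = (p_per_hac - gao_endpoint) / (pao_endpoint - gao_endpoint)
    return min(1.0, max(0.0, frac))   # clamp to a physically meaningful fraction
```

Combined with the maximum acetate uptake rate, which fixes the total active PAO-plus-GAO biomass, this fraction yields the two populations from routine measurements (MLSS, acetate, orthophosphate) without molecular techniques, which is the practical appeal the abstract emphasizes.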

López-Vázquez, Carlos M; Hooijmans, Christine M; Brdjanovic, Damir; Gijzen, Huub J; van Loosdrecht, Mark C M



Implementation of infection control best practice in intensive care units throughout Europe: a mixed-method evaluation study  

PubMed Central

Background The implementation of evidence-based infection control practices is essential, yet challenging for healthcare institutions worldwide. Although it is acknowledged that implementation success varies with contextual factors, little is known regarding the most critical specific conditions within the complex cultural milieu of varying economic, political, and healthcare systems. Given the increasing reliance on unified global schemes to improve patient safety and healthcare effectiveness, research on this topic is needed and timely. The ‘InDepth’ work package of the European FP7 Prevention of Hospital Infections by Intervention and Training (PROHIBIT) consortium aims to assess barriers and facilitators to the successful implementation of catheter-related bloodstream infection (CRBSI) prevention in intensive care units (ICU) across several European countries. Methods We use a qualitative case study approach in the ICUs of six purposefully selected acute care hospitals among the 15 participants in the PROHIBIT CRBSI intervention study. As sensitizing schemes we apply the theory of diffusion of innovation, published implementation frameworks, sensemaking, and new institutionalism. We conduct interviews with hospital health providers/agents at different organizational levels, carry out ethnographic observations, and collect rich artifacts and photographs during two rounds of on-site visits, once before and once one year into the intervention. Data analysis is based on grounded theory. Given the challenge of different languages and cultures, we enlist the help of local interpreters, allot two days for site visits, and perform triangulation across multiple data sources. Qualitative measures of implementation success will consider the longitudinal interaction between the initiative and the institutional context. 
Quantitative outcomes on catheter-related bloodstream infections and performance indicators from another work package of the consortium will produce a final mixed-methods report. Conclusion A mixed-methods study of this scale with longitudinal follow-up is unique in the field of infection control. It highlights the ‘Why’ and ‘How’ of best practice implementation, revealing key factors that determine success of a uniform intervention in the context of several varying cultural, economic, political, and medical systems across Europe. These new insights will guide future implementation of more tailored and hence more successful infection control programs. Trial registration Trial number: PROHIBIT-241928 (FP7 reference number) PMID:23421909



Dietary Diversity and Meal Frequency Practices among Infant and Young Children Aged 6-23 Months in Ethiopia: A Secondary Analysis of Ethiopian Demographic and Health Survey 2011.  


Background. Appropriate complementary feeding practice is essential for the growth and development of children. This study aimed to assess the dietary diversity and meal frequency practices of infants and young children in Ethiopia. Methods. Data collected in the Ethiopian Demographic and Health Survey (EDHS) from December 2010 to June 2011 were used for this study. The data were extracted, arranged, recoded, and analyzed using SPSS version 17. A total of 2836 children aged 6-23 months were included in the final analysis. Both bivariate and multivariate analyses were done to identify predictors of feeding practices. Result. The proportions of children with adequate dietary diversity score and meal frequency were 10.8% and 44.7%, respectively. Children born to the richest households showed a better dietary diversity score (OR = 0.256). The number of children under five years of age was an important predictor of dietary diversity (OR = 0.690). Mothers who had exposure to media were more likely to give adequate meal frequency to their children (OR = 0.707). Conclusion. Dietary diversity and meal frequency practices were inadequate in Ethiopia. Wealth quintile, exposure to media, and number of children affected feeding practices. Improving economic status, promoting the habit of eating together, and increasing exposure to media are important to improve infant feeding practices in Ethiopia. PMID:24455218

Aemro, Melkam; Mesele, Molla; Birhanu, Zelalem; Atenafu, Azeb



Extraction of brewer's yeasts using different methods of cell disruption for practical biodiesel production.  


Methods for the preparation of fatty acids from brewer's yeast and their use in the production of biofuels and in different branches of industry are described. Isolation of fatty acids from cell lipids includes cell disintegration (e.g., with liquid nitrogen, KOH, NaOH, petroleum ether, nitrogenous basic compounds, etc.) and subsequent processing of the extracted lipids, including fatty acid analysis and computation of biodiesel properties such as viscosity, density, cloud point, and cetane number. Methyl esters obtained from brewer's waste yeast are well suited for the production of biodiesel. All 49 samples (7 breweries and 7 methods) meet the requirements for biodiesel quality in both the composition of fatty acids and the properties of the biofuel required by the US and EU standards. PMID:25394535
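As an illustration of the kind of property computation this abstract mentions, biodiesel cetane number is often estimated as a composition-weighted average over the fatty acid methyl ester (FAME) profile. The component values and profile below are hypothetical, not taken from the paper; a minimal sketch:

```python
# Hypothetical pure-component cetane numbers for common fatty acid methyl
# esters (illustrative values only, not from the paper):
CETANE = {"C16:0": 74.0, "C18:0": 81.0, "C18:1": 57.0, "C18:2": 38.0}

def blend_cetane_number(mass_fractions):
    """Mass-fraction-weighted cetane number of a FAME blend."""
    return sum(frac * CETANE[fame] for fame, frac in mass_fractions.items())

# A yeast-lipid-like FAME profile (fractions sum to 1; hypothetical):
profile = {"C16:0": 0.20, "C18:0": 0.10, "C18:1": 0.50, "C18:2": 0.20}
cn = blend_cetane_number(profile)
```

The same weighted-average pattern extends to estimates of density and viscosity; cloud point is usually modeled separately because it is dominated by the saturated fraction.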

Rezanka, Tomáš; Matoulková, Dagmar; Kolouchová, Irena; Masák, Jan; Viden, Ivan; Sigler, Karel



An experimental study of practical computerized scatter correction methods for prototype digital breast tomosynthesis  

NASA Astrophysics Data System (ADS)

Digital breast tomosynthesis (DBT) is a technique developed to overcome the limitations of conventional digital mammography by reconstructing slices through the breast from projections acquired at different angles. In developing and optimizing DBT, x-ray scatter reduction remains a significant challenge due to projection geometry and radiation dose limitations. The most common approach to scatter reduction is the beam-stop-array (BSA) algorithm, although this method raises the concern of additional exposure to acquire the scatter distribution. The compressed breast is roughly symmetric, and the scatter profiles from projections acquired at axially opposite angles are approximately mirror images of each other. The purpose of this study was to apply the BSA algorithm while acquiring only two scans with a beam stop array, which estimates the scatter distribution with minimal additional exposure. The results of scatter correction with angular interpolation were comparable to those of scatter correction with scatter distributions measured at each angle, and the exposure increase was less than 13%. This study demonstrated the influence of scatter correction by the BSA algorithm with minimal exposure, which indicates its practical applicability in clinical situations.

Kim, Y.; Kim, H.; Park, H.; Choi, J.; Choi, Y.



In practice, the theory is different: a processual analysis of breastfeeding in northeast Brazil.  


'Na prática, a teoria é outra' (in practice, the theory is different) is an old Brazilian saying. This phrase summarizes well the general practice of breastfeeding in Brazil: 'Breast is best' is central in pregnant women's future-oriented 'theory' of how their infant should be fed. In the subsequent weeks after delivery, however, in the daily practicalities of feeding their infant, this theory is, to a large extent, abandoned. The present study is based on a sample of 300 mothers in the city of Aracaju in the Northeast of Brazil. Through interviews, the differences and similarities between knowledge and practice with respect to infant feeding were established. An explanation of these differences is developed on the basis of a processual analysis of the qualitative and quantitative results of the interview data. Nearly all mothers were knowledgeable of the need to breastfeed, and nearly all mothers had initiated breastfeeding. However, only a minority was exclusively breastfeeding at the time of the interview. A distinction is made between a breastfeeding process and a de-breastfeeding process. The data suggest that mothers, in general, start the de-breastfeeding process with the positive intention of ameliorating the infant's situation without realizing the negative processual consequences that most likely end in cessation of breastfeeding. The study supports the view that health policy should underline the processual character of both breastfeeding and de-breastfeeding when promoting the importance of exclusive breastfeeding. PMID:17070973

Scavenius, Michael; van Hulsel, Lonneke; Meijer, Julia; Wendte, Hans; Gurgel, Ricardo




Microsoft Academic Search

The prevalence of complex acoustic structures in mammalian vocalisations can make it difficult to quantify frequency characteristics. We describe two methods developed for the frequency analysis of a complex swift fox Vulpes velox vocalisation, the barking sequence: (1) autocorrelation function analysis and (2) instantaneous frequency analysis. The autocorrelation function analysis results in an energy density spectrum of the signal's averaged




Managing visitor impacts in parks: A multi-method study of the effectiveness of alternative management practices  

USGS Publications Warehouse

How can recreation use be managed to control associated environmental impacts? What management practices are most effective and why? This study explored these and related questions through a series of experimental "treatments" and associated "controls" at the summit of Cadillac Mountain in Acadia National Park, a heavily used and environmentally fragile area. The treatments included five management practices designed to keep visitors on maintained trails, and these practices ranged from "indirect" (information/education) to "direct" (a fence bordering the trail). Research methods included unobtrusive observation of visitors to determine the percentage of visitors who walked off-trail and a follow-up visitor survey to explore why management practices did or didn't work. All of the management practices reduced the percentage of visitors who walked off-trail. More aggressive applications of indirect practices were more effective than less aggressive applications, and the direct management practice of fencing was the most effective of all. None of the indirect management practices reduced walking off-trail to a degree that is likely to control damage to soil and vegetation at the study site. Study findings suggest that an integrated suite of direct and indirect management practices be implemented on Cadillac Mountain (and other, similar sites) that includes a) a regulation requiring visitors to stay on the maintained trail, b) enforcement of this regulation as needed, c) unobtrusive fencing along the margins of the trail, d) redesign of the trail to extend it, widen it in key places, and provide short spur trails to key "photo points", and e) an aggressive information/education program to inform visitors of the regulation to stay on the trail and the reasons for it. These recommendations are a manifestation of what may be an emerging principle of park and outdoor recreation management: intensive use requires intensive management.

Park, L.O.; Marion, J.L.; Manning, R.E.; Lawson, S.R.; Jacobi, C.



Multiscale Methods for Nuclear Reactor Analysis  

NASA Astrophysics Data System (ADS)

The ability to accurately predict local pin powers in nuclear reactors is necessary to understand the mechanisms that cause fuel pin failure during steady state and transient operation. In the research presented here, methods are developed to improve the local solution using high order methods with boundary conditions from a low order global solution. Several different core configurations were tested to determine the improvement in the local pin powers compared to the standard techniques, which use diffusion theory and pin power reconstruction (PPR). Two different multiscale methods were developed and analyzed: the post-refinement multiscale method and the embedded multiscale method. The post-refinement multiscale methods use the global solution to determine boundary conditions for the local solution. The local solution is solved using either a fixed boundary source or an albedo boundary condition; this solution is "post-refinement" and thus has no impact on the global solution. The embedded multiscale method allows the local solver to change the global solution to provide an improved global and local solution. The post-refinement multiscale method is assessed using three core designs. When the local solution has more energy groups, the fixed source method has some difficulties near the interface; however, the albedo method works well for all cases. In order to remedy the issue with boundary condition errors for the fixed source method, a buffer region is used to act as a filter, which decreases the sensitivity of the solution to the boundary condition. Both the albedo and fixed source methods benefit from the use of a buffer region. Unlike the post-refinement method, the embedded multiscale method alters the global solution. The ability to change the global solution allows for refinement in areas where the errors in the few group nodal diffusion are typically large. 
The embedded method is shown to improve the global solution when it is applied to a MOX/LEU assembly interface, the fuel/reflector interface, and assemblies where control rods are inserted. The embedded method also allows for multiple solution levels to be applied in a single calculation. The addition of intermediate levels to the solution improves the accuracy of the method. Both multiscale methods considered here have benefits and drawbacks, but both can provide improvements over the current PPR methodology.

Collins, Benjamin S.


Predatory vs. Dialogic Ethics: Constructing an Illusion or Ethical Practice as the Core of Research Methods  

ERIC Educational Resources Information Center

The ethical conduct of research is addressed from two perspectives, as a regulatory enterprise that creates an illusion of ethical practice and as a philosophical concern for equity and the imposition of power within the conceptualization and practice of research itself. The authors discuss various contemporary positions that influence…

Cannella, Gaile S.; Lincoln, Yvonna S.



Crossing the classroom-clinical practice divide in palliative care by using quality improvement methods  

Microsoft Academic Search

Palliative care has come of age. It is an established specialty with standards of practice and mechanisms to deliver them. Nonetheless, a gap continues to exist between the standards to which palliative care aspires and those that are achieved in practice. Education dissemination has ...

Linda Emanuel



Talking the talk: a discourse analysis of mental health nurses talking about their practice.  


Mental health nursing exists as a discipline in the UK within the wider contemporary health care establishment. Throughout its history it has attempted to define itself in ways that differentiate mental health nursing practice from other health care professions and fields of nursing. However, it is not surprising in this climate of contemporary healthcare for individual professional identities to become 'lost' in the melange of interdisciplinary practice. This research presents a discourse analysis of individual mental health nurses' rhetorical constructions of their professional role(s) as they emerge in their talk with each other in focus group discussions. In particular, the focus in this paper is their discursive repertoires related to the historical legacy of mental health nursing and how this sits with what they consider to be a 'custodial and controlling' element of their role. The particular discourse analytic approach adopted in this study illustrates how individuals use language in a particular way to make justifications and explanations of mental health nursing identities. This analytic approach is ensconced within the domain of social psychology and lies at the interface of ethnomethodology and conversation analysis. It is concerned with structural units of discourse, beyond the level of the sentence, that emerge as the nurse participants engage in talking about their practice (Potter and Wetherell, 1987, p. 53). PMID:15468606

Leishman, June L



Comparison and analysis of three different methods for the paraboloid testing  

NASA Astrophysics Data System (ADS)

The paraboloid is a special asphere widely used in optical systems. This paper focuses on practical error analysis and synthesis in paraboloid testing. A paraboloid from ZYGO is tested using three different methods: computer-generated hologram (CGH), small ball, and multi-zone stitching metrology. The three methods are realized in a finely controlled lab utilizing a ZYGO Fizeau interferometer, and the test results are compared and analyzed. Error synthesis is also performed, and the uncertainty of the tests is less than 4.5 nm.
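The error synthesis this abstract mentions is commonly done by combining independent uncertainty contributors in quadrature (root-sum-square). The budget values below are hypothetical, for illustration only:

```python
import math

def combined_uncertainty(components_nm):
    """Root-sum-square combination of independent error sources (nm)."""
    return math.sqrt(sum(u * u for u in components_nm))

# Hypothetical uncertainty budget (nm) for an interferometric surface test,
# e.g. reference surface, alignment, environment/noise, retrace:
budget = [2.5, 2.0, 1.5, 2.0]
total = combined_uncertainty(budget)   # about 4.06 nm
```

Correlated contributors would instead be summed linearly before quadrature, so the independence assumption should be checked for each term in a real budget.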

Miao, Er-long




EPA Science Inventory

Generation of accurate ambient air VOC pollutant measurement data as a base for regulatory decisions is critical. Numerous methods and procedures for sampling and analysis are available from a variety of sources. Air methods available through the Environmental Protection Agency are con...


Shear Lag in Box Beams Methods of Analysis and Experimental Investigations  

NASA Technical Reports Server (NTRS)

The bending stresses in the covers of box beams or wide-flange beams differ appreciably from the stresses predicted by the ordinary bending theory on account of shear deformation of the flanges. The problem of predicting these differences has become known as the shear-lag problem. The first part of this paper deals with methods of shear-lag analysis suitable for practical use. The second part of the paper describes strain-gage tests made by the NACA to verify the theory. Three tests published by other investigators are also analyzed by the proposed method. The third part of the paper gives numerical examples illustrating the methods of analysis. An appendix gives comparisons with other methods, particularly with the method of Ebner and Koller.

Kuhn, Paul; Chiarito, Patrick T



Common cause analysis : a review and extension of existing methods  

E-print Network

The quantitative common cause analysis code, MOBB, is extended to include uncertainties arising from modelling uncertainties and data uncertainties. Two methods, Monte Carlo simulation and the Method-of-Moments are used ...

Heising, Carolyn D.



Adaptive Nodal Transport Methods for Reactor Transient Analysis  

SciTech Connect

Develop methods for adaptively treating the angular, spatial, and time dependence of the neutron flux in reactor transient analysis. These methods were demonstrated in the DOE transport nodal code VARIANT and the US NRC spatial kinetics code, PARCS.

Thomas Downar; E. Lewis



Analysis and evaluation of planned and delivered dose distributions: practical concerns with γ- and χ-evaluations  

NASA Astrophysics Data System (ADS)

One component of clinical treatment validation, for example in the commissioning of new radiotherapy techniques or in patient specific quality assurance, is the evaluation and verification of planned and delivered dose distributions. Gamma and related tests (such as the chi evaluation) have become standard clinical tools for such work. Both functions provide quantitative comparisons between dose distributions, combining dose difference and distance to agreement criteria. However, there are some practical considerations in their utilization that can compromise the integrity of the tests, and these are occasionally overlooked especially when the tests are too readily adopted from commercial software. In this paper we review the evaluation tools and describe some practical concerns. The intent is to provide users with some guidance so that their use of these evaluations will provide valid rapid analysis and visualization of the agreement between planned and delivered dose distributions.
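As a sketch of the gamma evaluation this abstract describes, the 1-D implementation below combines a dose-difference criterion with a distance-to-agreement criterion; a point passes when its gamma value is at most 1. The criteria (3%, 3 mm) and the test profile are illustrative assumptions:

```python
import numpy as np

def gamma_index(ref_dose, eval_dose, positions, dose_crit=0.03, dist_crit=3.0):
    """Simple 1-D global gamma evaluation.

    For each reference point, search all evaluated points for the minimum of
    the combined dose-difference / distance-to-agreement metric. dose_crit is
    a fraction of the reference maximum (global normalization); dist_crit is
    in the same units as `positions` (e.g., mm).
    """
    ref_dose = np.asarray(ref_dose, dtype=float)
    eval_dose = np.asarray(eval_dose, dtype=float)
    positions = np.asarray(positions, dtype=float)
    dose_norm = dose_crit * ref_dose.max()

    gamma = np.empty_like(ref_dose)
    for i, (r_pos, r_dose) in enumerate(zip(positions, ref_dose)):
        dd = (eval_dose - r_dose) / dose_norm   # dose-difference term
        dta = (positions - r_pos) / dist_crit   # distance-to-agreement term
        gamma[i] = np.sqrt(dd**2 + dta**2).min()
    return gamma

# Identical distributions should pass everywhere (gamma == 0):
x = np.linspace(0.0, 30.0, 61)          # positions in mm
d = np.exp(-((x - 15.0) / 6.0) ** 2)    # illustrative dose profile
g = gamma_index(d, d, x)
pass_rate = np.mean(g <= 1.0)
```

One practical concern the paper raises is visible even here: the result depends on the sampling density of `positions`, so coarse grids can understate disagreement between the distributions.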

Schreiner, L. J.; Holmes, O.; Salomons, G.



Improved permeability prediction using multivariate analysis methods  

E-print Network

Predicting rock permeability from well logs in uncored wells is an important task in reservoir characterization. Due to the high costs of coring and laboratory analysis, typically cores are acquired in only a few wells. Since most wells are logged...

Xie, Jiang



"Does RE Work?" An Analysis of the Aims, Practices and Models of Effectiveness of Religious Education in the UK  

ERIC Educational Resources Information Center

Possibly the largest qualitative study in RE policy and practice in many years, the AHRC/ESRC Religion and Society project "Does RE work? An analysis of the aims, practices and models of effectiveness in religious education in the UK", headed by the University of Glasgow, seeks to map the complex processes of curriculum formation as experienced in…

Lundie, David



Meta-analysis methods for risk differences.  


The difference between two proportions, referred to as a risk difference, is a useful measure of effect size in studies where the response variable is dichotomous. Confidence interval methods based on a varying coefficient model are proposed for combining and comparing risk differences from multi-study between-subjects or within-subjects designs. The proposed methods are new alternatives to the popular constant coefficient and random coefficient methods. The proposed varying coefficient methods do not require the constant coefficient assumption of effect size homogeneity, nor do they require the random coefficient assumption that the risk differences from the selected studies represent a random sample from a normally distributed superpopulation of risk differences. The proposed varying coefficient methods are shown to have excellent finite-sample performance characteristics under realistic conditions. PMID:23962020
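For context, the conventional constant-coefficient (inverse-variance) pooling to which the paper proposes alternatives can be sketched as follows; the study counts below are hypothetical:

```python
import math

def risk_difference(events, n):
    """Risk difference and its Wald variance for one two-group study."""
    p1, p2 = events[0] / n[0], events[1] / n[1]
    rd = p1 - p2
    var = p1 * (1 - p1) / n[0] + p2 * (1 - p2) / n[1]
    return rd, var

def pooled_risk_difference(studies, z=1.96):
    """Inverse-variance (constant-coefficient) pooled RD with 95% CI."""
    rds, weights = [], []
    for events, n in studies:
        rd, var = risk_difference(events, n)
        rds.append(rd)
        weights.append(1.0 / var)   # weight = inverse of the Wald variance
    w_sum = sum(weights)
    pooled = sum(w * rd for w, rd in zip(weights, rds)) / w_sum
    se = math.sqrt(1.0 / w_sum)
    return pooled, (pooled - z * se, pooled + z * se)

# Hypothetical studies: (events in group 1, events in group 2), (n1, n2)
studies = [((30, 20), (100, 100)),
           ((45, 30), (150, 150))]
rd, ci = pooled_risk_difference(studies)
```

This sketch assumes effect-size homogeneity across studies, which is exactly the assumption the proposed varying-coefficient methods avoid.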

Bonett, Douglas G; Price, Robert M



Knowledge, attitude and practice of natural family planning methods in a population with poor utilisation of modern contraceptives.  


Sub-Saharan Africa has one of the highest fertility rates in the world, which is further promoted by the low utilisation of modern contraceptive methods. Yet, many communities claim to have traditional methods of family planning that pre-date the introduction of modern contraceptives, implying that contraception is a culturally acceptable norm. It was therefore postulated that the study population would have a high level of awareness and practice of natural methods of family planning. We aimed to obtain an insight into the extent and correctness of knowledge about natural family planning methods, and their practice as a guide to the general acceptance of contraception as a concept. Pre-tested structured questionnaires were administered to women of childbearing age in households properly numbered for primary healthcare activities. The level of awareness of natural family planning methods was significantly lower than that for modern methods of contraception. The awareness rate for the rhythm method, lactational amenorrhoea method and coitus interruptus was 50.7%, 42.1% and 36.1%, respectively. For all three natural family planning methods, there is a steady decline between awareness, correct description of the method and utilisation, a difference that was statistically significant in all cases. The sociodemographic factors of the responders had varying influence on utilisation of all three natural family planning methods studied. Rural dwellers practised the lactational amenorrhoea method significantly more often than urban dwellers. Significantly more Muslims than Christians with four children or more practised coitus interruptus or the rhythm method, while use of the lactational amenorrhoea method increased significantly with the number of living children in both religious groups. There is a relatively low level of awareness of natural family planning methods in the study population, poor utilisation and wrong use of methods. 
Therefore, improving the correct level of information on natural family planning methods is likely to improve the use of both natural family planning and modern contraceptive methods. PMID:17000506

Audu, B M; Yahya, S J; Bassi, A



The CITRA Research-Practice Consensus-Workshop Model: Exploring a New Method of Research Translation in Aging  

ERIC Educational Resources Information Center

Purpose: On the basis of the experience of an extensive community-based research partnership in New York City, we developed an innovative process for bridging the gap between aging-related research and practice, using a consensus-workshop model. Design and Methods: We adapted the traditional scientific consensus-workshop model to include…

Sabir, Myra; Breckman, Risa; Meador, Rhoda; Wethington, Elaine; Reid, Carrington; Pillemer, Karl



Practical preparation of potentially anesthetic fluorinated ethyl methyl ethers by means of bromine trifluoride and other methods  

E-print Network

Practical preparation of potentially anesthetic fluorinated ethyl methyl ethers by means of bromine trifluoride ... of his 85th birthday. Abstract: Synthetic methods, especially those that use bromine trifluoride ... Elsevier Science S.A. All rights reserved. Keywords: Fluorinated ethyl methyl ethers; Bromine trifluoride

Hudlicky, Tomas


Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy  

ERIC Educational Resources Information Center

E-learning has become a significant aspect of training and education in the worldwide information economy as an attempt to create and facilitate a competent global work force. "Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy" provides eclectic accounts of case studies in…

Olaniran, Bolanle A., Ed.



Spectrophotometric method for analysis of metformin hydrochloride.  


A simple and sensitive spectrophotometric method has been developed and validated for the estimation of metformin hydrochloride in bulk and in tablet formulation. The primary amino group of metformin hydrochloride reacts with ninhydrin in alkaline medium to form a violet colour chromogen, which is determined spectrophotometrically at 570 nm. It obeyed Beer's law in the range of 8-18 µg/ml. Percentage recovery of the drug for the proposed method ranged from 97-100%, indicating no interference of the tablet excipients. The proposed method was found to be accurate and precise for routine estimation of metformin hydrochloride in bulk and from tablet dosage forms. PMID:20177473
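The Beer's-law calibration underlying such a method amounts to a linear fit of absorbance against concentration within the validated range; the calibration points below are hypothetical, not the paper's data:

```python
import numpy as np

# Hypothetical calibration standards within the reported linear range:
conc = np.array([8.0, 10.0, 12.0, 14.0, 16.0, 18.0])        # µg/ml
absorbance = np.array([0.24, 0.30, 0.36, 0.42, 0.48, 0.54])  # A at 570 nm

# Beer's law: A = slope * c + intercept; fit by linear least squares
slope, intercept = np.polyfit(conc, absorbance, 1)

def concentration(a):
    """Back-calculate concentration (µg/ml) from measured absorbance."""
    return (a - intercept) / slope

unknown = concentration(0.39)   # a sample absorbance inside the range
```

Back-calculation is only valid within the fitted range (here 8-18 µg/ml); samples outside it would need dilution or a re-validated curve.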

Mubeen, G; Noor, Khalikha



Development of quantitative duplex real-time PCR method for screening analysis of genetically modified maize.  


A duplex real-time PCR method was developed for quantitative screening analysis of GM maize. The duplex real-time PCR simultaneously detected two GM-specific segments, namely the cauliflower mosaic virus (CaMV) 35S promoter (P35S) segment and an event-specific segment for GA21 maize which does not contain P35S. Calibration was performed with a plasmid calibrant specially designed for the duplex PCR. The result of an in-house evaluation suggested that the analytical precision of the developed method was almost equivalent to those of simplex real-time PCR methods, which have been adopted as ISO standard methods for the analysis of GMOs in foodstuffs and have also been employed for the analysis of GMOs in Japan. In addition, this method will reduce both the cost and time requirement of routine GMO analysis by half. The high analytical performance demonstrated in the current study would be useful for the quantitative screening analysis of GM maize. We believe the developed method will be useful for practical screening analysis of GM maize, although interlaboratory collaborative studies should be conducted to confirm this. PMID:19602858
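Quantification against a plasmid calibrant of the kind this abstract describes typically proceeds via a standard curve relating Ct to log10 copy number; the dilution series below is hypothetical, not from the study:

```python
import numpy as np

# Hypothetical standard curve from serial dilutions of a plasmid calibrant:
log_copies = np.array([5.0, 4.0, 3.0, 2.0])   # log10 copy number
ct = np.array([18.0, 21.3, 24.6, 27.9])       # measured Ct values

# Ct is linear in log10(copies): Ct = slope * log10(N) + intercept
slope, intercept = np.polyfit(log_copies, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0   # ~1.0 means ~100% per cycle

def copies_from_ct(ct_value):
    """Estimate target copy number from a measured Ct."""
    return 10 ** ((ct_value - intercept) / slope)

n = copies_from_ct(23.0)
```

In a duplex assay like the one described, GM content could then be expressed as the ratio of the GM-specific target (e.g., the P35S segment) to a reference target quantified from the same calibrant.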

Oguchi, Taichi; Onishi, Mari; Minegishi, Yasutaka; Kurosawa, Yasunori; Kasahara, Masaki; Akiyama, Hiroshi; Teshima, Reiko; Futo, Satoshi; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi



Emergency Response Training Practices for People With DisabilitiesAnalysis of Some Current Practices and Recommendations for Future Training Programs  

Microsoft Academic Search

Each year thousands of people are potentially affected by the types of emergency preparedness and response training plans practiced in their communities. Between 1998 and 2002, 3,000 counties in the United States declared disasters that have included floods, tornadoes, hurricanes, winter storms, thunderstorms, fires, ice storms, and earthquakes. Emergency preparedness for all people, including people with disabilities, may involve natural

Jennifer L. Rowland; Glen W. White; Michael H. Fox; Catherine Rooney



A practical surface panel method to predict velocity distribution around a three-dimensional hydrofoil including boundary layer effects  

Microsoft Academic Search

A practical, low order and potential-based surface panel method is presented to predict the flow around a three-dimensional rectangular foil section including the effect of boundary layer. The method is based on a boundary-integral formulation, known as the “Morino formulation” and the boundary layer effect is taken into account through a complementary thin boundary layer model. The numerical approach used

A. C. Takinaci; M. Atlar; E. Korkut



Using smart mobile devices in social-network-based health education practice: a learning behavior analysis.  


Virtual communities provide numerous resources, immediate feedback, and information sharing, enabling people to rapidly acquire information and knowledge and supporting diverse applications that facilitate interpersonal interactions, communication, and sharing. Moreover, incorporating highly mobile and convenient devices into practice-based courses can be advantageous in learning situations. Therefore, in this study, a tablet PC and Google+ were introduced to a health education practice course to elucidate satisfaction with the learning module and conditions and to analyze the sequence and frequency of learning behaviors during the social-network-based learning process. According to the analytical results, social networks can improve interaction among peers and between educators and students, particularly when these networks are used to search for data, post articles, engage in discussions, and communicate. In addition, most nursing students and nursing educators expressed a positive attitude and satisfaction toward these innovative teaching methods, and looked forward to continuing the use of this learning approach. PMID:24568697

Wu, Ting-Ting