Note: This page contains sample records for the topic practical analysis method from Science.gov.
While these samples are representative of the content of Science.gov,
they are not comprehensive nor are they the most current set.
We encourage you to perform a real-time search of Science.gov
to obtain the most current and comprehensive results.
Last update: August 15, 2014.
1

Modal analysis of practical quartz resonators using finite element method.  

PubMed

Finite element simulation has been employed as a promising method in the analysis of piezoelectric active devices. It is desirable to obtain results consistent with experiments for real devices. In this paper, precise geometry models of quartz resonators were generated from direct measurements of practical devices and meshed. Realistic quartz blanks, electrodes, and mountings were included in the models. An application-specific 3-D finite element simulator was developed to perform the modal analysis of realistic crystal resonators. Using the model and simulator, real devices were simulated. Simulation results, including modal analysis and temperature properties, were shown to be consistent with experimental measurements. The dependencies of device performance on practical factors were studied. PMID:20178895

Yang, Liu; Vitchev, Nikolay; Yu, Zhiping

2010-01-01

2

A Practical Method of Policy Analysis by Simulating Policy Options  

ERIC Educational Resources Information Center

This article focuses on a method of policy analysis that has evolved from the previous articles in this issue. The first section, "Toward a Theory of Educational Production," identifies concepts from science and achievement production to be incorporated into this policy analysis method. Building on Kuhn's (1970) discussion regarding paradigms, the…

Phelps, James L.

2011-01-01

3

A Practical Method of Policy Analysis by Estimating Effect Size  

ERIC Educational Resources Information Center

The previous articles on class size and other productivity research paint a complex and confusing picture of the relationship between policy variables and student achievement. Missing is a conceptual scheme capable of combining the seemingly unrelated research and dissimilar estimates of effect size into a unified structure for policy analysis and…

Phelps, James L.

2011-01-01

4

A topography analysis incorporated optimization method for the selection and placement of best management practices.  

PubMed

Best Management Practices (BMPs) are one of the most effective methods to control nonpoint source (NPS) pollution at a watershed scale. In this paper, a topography analysis incorporated optimization method (TAIOM) is proposed, which integrates topography analysis with cost-effective optimization. The surface status, slope and the type of land use were evaluated as inputs for the optimization engine. A genetic algorithm program was coded to obtain the final optimization. The TAIOM was validated in conjunction with the Soil and Water Assessment Tool (SWAT) in the Yulin watershed in Southwestern China. The results showed that the TAIOM was more cost-effective than traditional optimization methods. The distribution of selected BMPs throughout landscapes comprising relatively flat plains and gentle slopes suggests the need for a more operationally effective scheme, such as the TAIOM, to determine the practicability of BMPs before widespread adoption. The TAIOM developed in this study can easily be extended to other watersheds to help decision makers control NPS pollution. PMID:23349917

Shen, Zhenyao; Chen, Lei; Xu, Liang

2013-01-01
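The abstract above notes that a genetic algorithm program was coded to drive the cost-effective optimization, but gives no details of the objective function or site data. As an illustration only, a minimal genetic algorithm for selecting BMP placements under a budget constraint might look like the sketch below; the per-site costs, load reductions, budget, and penalty weight are all hypothetical, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical inputs (NOT from the paper): unit cost and pollutant-load
# reduction for placing a BMP at each of 20 candidate sites, plus a budget cap.
n_sites = 20
cost = rng.uniform(1.0, 10.0, n_sites)
reduction = rng.uniform(0.5, 8.0, n_sites)
budget = 40.0

def fitness(chrom):
    # Total load reduction, with a linear penalty for exceeding the budget.
    c = (chrom * cost).sum()
    r = (chrom * reduction).sum()
    return r if c <= budget else r - 10.0 * (c - budget)

def ga(pop_size=40, gens=100, p_mut=0.05):
    pop = rng.integers(0, 2, (pop_size, n_sites))
    for _ in range(gens):
        fit = np.array([fitness(c) for c in pop])
        elite = pop[fit.argmax()].copy()
        # Binary tournament selection.
        idx = rng.integers(0, pop_size, (pop_size, 2))
        winners = np.where(fit[idx[:, 0]] > fit[idx[:, 1]], idx[:, 0], idx[:, 1])
        parents = pop[winners]
        # Uniform crossover between paired parents, then bit-flip mutation.
        mask = rng.integers(0, 2, (pop_size, n_sites)).astype(bool)
        children = np.where(mask, parents, parents[::-1])
        children ^= rng.random((pop_size, n_sites)) < p_mut
        children[0] = elite  # elitism: never lose the best solution found
        pop = children
    fit = np.array([fitness(c) for c in pop])
    return pop[fit.argmax()], fit.max()

best, best_fit = ga()
```

In a real application such as the paper's, the fitness evaluation would instead run the watershed model (SWAT) for each candidate layout, which is where incorporating topography analysis to narrow the search space pays off.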

5

Practical method for radioactivity distribution analysis in small-animal PET cancer studies  

PubMed Central

We present a practical method for radioactivity distribution analysis in small-animal tumors and organs using positron emission tomography imaging with a calibrated source of known activity and size in the field of view. We reconstruct the imaged mouse together with the source under the same conditions, using an iterative method, Maximum Likelihood Expectation-Maximization with System Modeling, capable of delivering high resolution images. Corrections for the ratios of geometrical efficiencies, radioisotope decay in time and photon attenuation are included in the algorithm. We demonstrate reconstruction results for the amount of radioactivity within the scanned mouse in a sample study of osteolytic and osteoblastic bone metastasis from prostate cancer xenografts. Data acquisition was performed on a small-animal PET system that was tested with different radioactive sources, phantoms and animals to achieve high sensitivity and spatial resolution. Our method uses high resolution images to determine the volume of an organ or tumor and the amount of its radioactivity, offering savings in time and effort and eliminating the need to sacrifice animals. The method has utility for prognosis and quantitative analysis in small-animal cancer studies, and will enhance the assessment of characteristics of tumor growth, identifying metastases, and potentially determining the effectiveness of cancer treatment. This technique could also prove useful for organ radioactivity dosimetry studies.

Slavine, Nikolai V.; Antich, Peter P.

2008-01-01

6

Development of a practical performance aberration retrieval method from spot intensity images using inverse analysis  

Microsoft Academic Search

A “practically applicable” spot-image-based aberration retrieval method was developed. The method was systemized with the following techniques. 1) The real part and imaginary part of the spatial spectrum of the focal plane were expanded with the finite lower Zernike polynomials, respectively, and the method was reduced to the nonlinear least squares problem which calculates these coefficients. This technique

Masashi Ueshima; Kenji Amaya; Kosei Kataoka

2009-01-01

7

Methods and practices used in incident analysis in the Finnish nuclear power industry.  

PubMed

According to the Finnish Nuclear Energy Act, it is the licensee's responsibility to ensure the safe use of nuclear energy. The Radiation and Nuclear Safety Authority (STUK) is the regulatory body responsible for the state supervision of the safe use of nuclear power in Finland. One essential prerequisite for the safe and reliable operation of nuclear power plants is that lessons are learned from operational experience. It is the utility's prime responsibility to assess operational events and implement appropriate corrective actions. STUK controls licensees' operational experience feedback arrangements and implementation as part of its inspection activities. In addition, in Finland the regulatory body performs its own assessment of the operational experience. Review and investigation of operational events is part of the regulatory oversight of operational safety. Review of operational events is done by STUK at three levels. The first step is a general review of all operational events, transients and reactor scram reports, which the licensees submit to STUK for information. The second level of activity involves clarifying events on site and entering event-specific data into STUK's event register database. This is done for events that meet the set criteria requiring the operator to submit a special report to STUK for approval. The safety significance of operational events is determined using probabilistic safety assessment (PSA) techniques. The risk significance of events and the number of safety-significant events are tracked by STUK indicators. The final step in operational event assessment performed by STUK is to assign its own investigation team to events deemed to have special importance, especially when the licensee's organisation has not operated as planned. STUK launches its own detailed investigation about once a year on average.
An analysis and evaluation of the event investigation methods applied at STUK and at the two Finnish nuclear power plant operators, Teollisuuden Voima Oy (TVO) and Fortum Power and Heat Oy (Fortum), was carried out by the Technical Research Centre (VTT) at the request of STUK at the end of the 1990s. The study aimed to provide a broad overview of, and suggestions for improving, the whole organisational framework supporting event investigation practices at the regulatory body and at the utilities. The main objective of the research was to evaluate the adequacy and reliability of event investigation analysis methods and practices in the Finnish nuclear power industry and, based on the results, to develop them further. The results and suggestions of the research are reviewed in the paper, and the corrective actions implemented in event investigation and operating experience procedures both at STUK and at the utilities are discussed as well. STUK has developed its own procedure for the risk-informed analysis of nuclear power plant events. The PSA-based event analysis method is used to assess the safety significance and importance measures associated with the unavailability of components and systems subject to Technical Specifications. Insights from recently performed PSA-based analyses are also briefly discussed in the paper. PMID:15231350

Suksi, Seija

2004-07-26

8

A practical, high-resolution, microcomputer-based method for the analysis of relaxation data exhibiting multicomponent exponential decays.  

PubMed

We have developed and tested a practical, rapid, high-resolution, microcomputer-based method for the analysis of multicomponent exponential decays. The analysis utilizes the Fourier deconvolution technique and includes methods to reduce noise both in the input data and in the results. The developed method is particularly well suited for analysing decays consisting of a wide range of decay times. The method resolves two exponential decays differing by a factor of two when the input data are mathematically generated and without noise, and resolves two decays differing by a factor of three when 2% Gaussian noise is present in the same data. The method lends itself to routine analysis of any relaxation process consisting of exponential decays, including biomedical applications such as enzyme kinetics, circulatory transport functions, pharmacokinetics, plasma exchange therapy, and analysis of compartmental models for any process. PMID:3980124

Mikkelsen, A; Stokke, B T; Elgsaeter, A

1985-01-01
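The record above resolves multicomponent exponential decays by Fourier deconvolution; that algorithm is not reproduced here. For comparison, a different standard technique for the same problem, Prony's method, recovers decay rates from uniformly sampled data via linear prediction and fits in a few lines. The two-component test signal below is illustrative, not data from the paper, and the noiseless case is the easy regime.

```python
import numpy as np

def prony_decay_rates(y, dt, p):
    """Estimate p decay rates from uniform samples of sum_i a_i * exp(-rate_i * t)."""
    # Linear prediction: y[k] = c1*y[k-1] + ... + cp*y[k-p]
    N = len(y)
    A = np.column_stack([y[p - 1 - j : N - 1 - j] for j in range(p)])
    c, *_ = np.linalg.lstsq(A, y[p:], rcond=None)
    # The exponentials z_i = exp(-rate_i * dt) are the roots of the
    # characteristic polynomial z^p - c1*z^(p-1) - ... - cp.
    z = np.roots(np.concatenate(([1.0], -c)))
    return np.sort(-np.log(z.real) / dt)

dt = 0.01
t = np.arange(0, 5, dt)
y = 2.0 * np.exp(-1.0 * t) + 1.0 * np.exp(-3.0 * t)  # rates differ by a factor of 3
rates = prony_decay_rates(y, dt, p=2)
```

With noise present, resolvability degrades sharply, which is exactly the regime the abstract quantifies (a factor of two resolvable without noise, a factor of three with 2% Gaussian noise) and the regime its noise-reduction steps target.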

9

Comparing Different Methods for Implementing Parallel Analysis: A Practical Index of Accuracy.  

ERIC Educational Resources Information Center

Accuracy was compared for three methods of implementing parallel analysis with mean eigenvalues (regression, interpolation, and computation with three samples of random data). The accuracy of parallel analysis with 95th percentile eigenvalues (through regression and interpolation) was also considered. No evidence of differential accuracy emerged…

Cota, Albert A.; And Others

1993-01-01
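Parallel analysis, as compared in the study above, retains components whose observed eigenvalues exceed reference eigenvalues obtained from random data. A minimal numpy sketch of the "computation with samples of random data" variant follows, using mean eigenvalues by default and the 95th percentile on request; the simulated two-factor dataset is illustrative, not the study's data.

```python
import numpy as np

def parallel_analysis(data, n_sims=100, percentile=None, seed=0):
    """Count components whose eigenvalues exceed those of random data."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    sims = np.empty((n_sims, p))
    for i in range(n_sims):
        r = rng.standard_normal((n, p))
        sims[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
    # Mean reference eigenvalues by default; e.g. percentile=95 for the
    # 95th-percentile criterion also considered in the article.
    ref = sims.mean(axis=0) if percentile is None else np.percentile(sims, percentile, axis=0)
    return int((obs > ref).sum())

# Simulated data with two underlying factors loading on five variables each.
rng = np.random.default_rng(42)
n = 300
f = rng.standard_normal((n, 2))
data = np.hstack([f[:, :1] + 0.3 * rng.standard_normal((n, 5)),
                  f[:, 1:] + 0.3 * rng.standard_normal((n, 5))])
n_factors = parallel_analysis(data)
```

The regression and interpolation variants compared in the article replace the simulation loop with approximations of the reference eigenvalues; the retention rule itself is unchanged.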

10

ISO 14 001 at the farm level: analysis of five methods for evaluating the environmental impact of agricultural practices.  

PubMed

Faced with society's increasing expectations, the Common Agricultural Policy (CAP) review considers environmental management to be an ever more critical criterion in the allocation of farm subsidies. With the goal of evaluating the environmental friendliness of farm practices, France's agricultural research and extension services have built a range of agricultural/environmental diagnostic tools over recent years. The objective of the present paper is to compare the five tools most frequently used in France: IDEA, DIAGE, DIALECTE, DIALOGUE and INDIGO. All the tools have the same purpose: evaluation of the impact of farm practices on the environment via indicators and monitoring of farm management practices. When tested on a sample of large-scale farms in Picardie, the five tools sometimes produced completely different results: for a given farm, the supposedly most significant environmental impacts depend on the tool used. These results lead to differing environmental management plans and raise the question of the methods' pertinence. An analysis grid of diagnostic tools, aimed at specifying their field of validity, limits and relevance, was drawn up. The resulting comparative analysis makes it possible to define each tool's domain of validity and suggests lines of thought for developing more relevant tools for (i) evaluating a farm's environmental performance and (ii) helping farmers to develop a plan for improving practices within the framework of an environmental management system. PMID:17084504

Galan, M B; Peschard, D; Boizard, H

2007-02-01

11

Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory--Determination of Trihalomethane Formation Potential, Method Validation, and Quality-Control Practices

USGS Publications Warehouse

An analytical method for the determination of the trihalomethane formation potential of water samples has been developed. The trihalomethane formation potential is measured by dosing samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine, and then analyzing the resulting trihalomethanes by purge and trap/gas chromatography equipped with an electron capture detector. Detailed explanations of the method and quality-control practices are provided. Method validation experiments showed that the trihalomethane formation potential varies as a function of time between sample collection and analysis, residual-free chlorine concentration, method of sample dilution, and the concentration of bromide in the sample.

Crepeau, Kathryn L.; Fram, Miranda S.; Bush, Noel

2004-01-01

12

Treatise on analytical chemistry. Part I. Theory and practice. Volume 7. Section H. Optical methods of analysis  

SciTech Connect

This book is one in a series of volumes and deals only with optical methods of analysis. This revision of the first edition of the volume, which presented a concise, critical, comprehensive, and systematic treatment of all aspects of classical and modern analytical chemistry, updates the techniques of analytical chemistry with experts in various specific fields as coeditors. Each chapter in the book illustrates how analytical chemistry draws on the fundamentals of chemistry as well as on those of other sciences. Discussion of practical applications of techniques is limited to fundamentals and to the analytical interpretation of the results obtained. The various optical methods are covered in separate chapters entitled: Optical Methods; Emission and Absorption of Radiant Energy; Fundamentals of Spectrophotometry; Spectroscopic Apparatus and Measurements; Luminescence Spectrometry (Fluorimetry and Phosphorimetry); Infrared Spectroscopy; Emission Spectroscopy; Flame Emission Spectroscopy; and Atomic Absorption Spectroscopy. (BLM)

Elving, P.J. (ed.)

1981-01-01

13

Empowering Discourse: Discourse Analysis as Method and Practice in the Sociology Classroom  

ERIC Educational Resources Information Center

Collaborative learning and critical pedagogy are widely recognized as "empowering" pedagogies for higher education. Yet, the practical implementation of both has a mixed record. The question, then, is: How could collaborative and critical pedagogies be empowered themselves? This paper makes a primarily theoretical case for discourse…

Hjelm, Titus

2013-01-01

14

APA's Learning Objectives for Research Methods and Statistics in Practice: A Multimethod Analysis  

ERIC Educational Resources Information Center

Research methods and statistics courses constitute a core undergraduate psychology requirement. We analyzed course syllabi and faculty self-reported coverage of both research methods and statistics course learning objectives to assess the concordance with APA's learning objectives (American Psychological Association, 2007). We obtained a sample of…

Tomcho, Thomas J.; Rice, Diana; Foels, Rob; Folmsbee, Leah; Vladescu, Jason; Lissman, Rachel; Matulewicz, Ryan; Bopp, Kara

2009-01-01

15

Practical Thermal Evaluation Methods For HAC Fire Analysis In Type B Radioactive Material (RAM) Packages

SciTech Connect

Title 10 of the United States Code of Federal Regulations Part 71 for the Nuclear Regulatory Commission (10 CFR Part 71.73) requires that Type B radioactive material (RAM) packages satisfy certain Hypothetical Accident Conditions (HAC) thermal design requirements to ensure package safety during accidental fire conditions. Compliance with thermal design requirements can be met by prototype tests, by analyses only, or by a combination of tests and analyses. Normally, it is impractical to meet all the HAC requirements using tests alone, and purely analytical methods are too complex due to the multi-physics, non-linear nature of the fire event. Therefore, a combination of tests and thermal analysis methods using commercial heat transfer software is used to meet the necessary design requirements. The authors, along with their colleagues at Savannah River National Laboratory in Aiken, SC, USA, have successfully used this 'tests and analyses' approach in the design and certification of several United States DOE/NNSA certified packages, e.g. 9975, 9977, 9978, 9979, H1700, and the Bulk Tritium Shipping Package (BTSP). This paper describes these methods, and it is hoped that RAM Type B package designers and analysts can use them for their applications.

Abramczyk, Glenn; Hensel, Stephen J; Gupta, Narendra K.

2013-03-28

16

A Critical Analysis of SocINDEX and Sociological Abstracts Using an Evaluation Method for the Practicing Bibliographer  

ERIC Educational Resources Information Center

This study provides a database evaluation method for the practicing bibliographer that is more than a brief review yet less than a controlled experiment. The author establishes evaluation criteria in the context of the bibliographic instruction provided to meet the research requirements of undergraduate sociology majors at Queens College, City…

Mellone, James T.

2010-01-01

17

Qualitative Approaches to Mixed Methods Practice  

ERIC Educational Resources Information Center

This article discusses how methodological practices can shape and limit how mixed methods is practiced and makes visible the current methodological assumptions embedded in mixed methods practice that can shut down a range of social inquiry. The article argues that there is a "methodological orthodoxy" in how mixed methods is practiced that…

Hesse-Biber, Sharlene

2010-01-01

18

Insight into Evaluation Practice: A Content Analysis of Designs and Methods Used in Evaluation Studies Published in North American Evaluation-Focused Journals  

ERIC Educational Resources Information Center

To describe the recent practice of evaluation, specifically method and design choices, the authors performed a content analysis on 117 evaluation studies published in eight North American evaluation-focused journals for a 3-year period (2004-2006). The authors chose this time span because it follows the scientifically based research (SBR)…

Christie, Christina A.; Fleischer, Dreolin Nesbitt

2010-01-01

19

Recommended practices in global sensitivity analysis  

Microsoft Academic Search

Practices for global sensitivity analysis of model output are described in a recent textbook (Saltelli et al., 2007). These include (i) variance-based techniques for general use in modelling, (ii) the elementary effect method for factor screening in factor-rich models and (iii) Monte Carlo filtering. In the present work we try to put the practices into the context of their

Andrea Saltelli; Daniele Vidoni; Massimiliano Mascherini

20

Thermal resistance analysis by induced transient (TRAIT) method for power electronic devices thermal characterization. II. Practice and experiments  

Microsoft Academic Search

For Part I see ibid., vol. 13, no. 6, p. 1208-19 (1998). The TRAIT method for thermal characterization of electronic devices, whose theory was presented in Part I for one-dimensional (1-D) structures, was applied here to systems having heat fluxes with three-dimensional (3-D) dependence in order to demonstrate that the spatial resolution of the thermal resistance analysis is still qualitatively maintained in this type

Paolo Emilio Bagnoli; Claudio Casarosa; Enrico Dallago; Marco Nardoni

1998-01-01

21

Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory-- Determination of Dissolved Organic Carbon in Water by High Temperature Catalytic Oxidation, Method Validation, and Quality-Control Practices  

USGS Publications Warehouse

An analytical method has been developed for the determination of dissolved organic carbon concentration in water samples. This method includes the results of the tests used to validate the method and the quality-control practices used for dissolved organic carbon analysis. Prior to analysis, water samples are filtered to remove suspended particulate matter. A Shimadzu TOC-5000A Total Organic Carbon Analyzer in the nonpurgeable organic carbon mode is used to analyze the samples by high temperature catalytic oxidation. The analysis usually is completed within 48 hours of sample collection. The laboratory reporting level is 0.22 milligrams per liter.

Bird, Susan M.; Fram, Miranda S.; Crepeau, Kathryn L.

2003-01-01

22

Economic Analysis Applied to the Practice of Dentistry.  

National Technical Information Service (NTIS)

A general methodology for the management of a dental practice in general and, particularly, for one using Expanded Function Dental Auxiliaries is presented. The focus of this paper is on the economic analysis of practice management and presents two method...

A. Reisman; H. Emmons; E. Green; R. Occhionero; K. Dadachanji

1976-01-01

23

Analysis of release kinetics of ocular therapeutics from drug releasing contact lenses: Best methods and practices to advance the field.  

PubMed

Several methods have been proposed to achieve an extended and controlled release of ocular therapeutics via contact lenses; however, the experimental conditions used to study the drug release vary greatly and significantly influence the release kinetics. In this paper, we examine variations in the release conditions and their effect on the release of both hydrophilic and hydrophobic drugs (ketotifen fumarate, diclofenac sodium, timolol maleate and dexamethasone) from conventional hydrogel and silicone hydrogel lenses. Drug release was studied under different conditions, varying volume, mixing rates, and temperature. Volume had the biggest effect on the release profile, which ironically is the least consistent variable throughout the literature. When a small volume (2-30 mL) was used with no forced mixing and solvent exchange every 24 h, equilibrium was reached well before each solvent exchange, significantly damping the drug release rate and artificially extending the release duration, leading to false conclusions. Using a large volume (200-400 mL) with a 30 rpm mixing rate and no solvent exchange, the release rate and total mass released were significantly increased. In general, release performed in small volumes with no forced mixing exhibited cumulative mass release amounts 3-12 times less than the cumulative release amounts in large volumes with mixing. Increases in mixing rate and temperature resulted in relatively small increases of 1.4 and 1.2 times, respectively, in fractional mass released. These results strongly demonstrate the necessity of proper and thorough analysis of release data to assure that equilibrium is not affecting release kinetics. This is paramount for comparison of various controlled drug release methods for therapeutic contact lenses, validation of the potential of lenses as an efficient and effective means of drug delivery, and increasing the likelihood of only the most promising methods reaching in vivo studies. PMID:24894544

Tieppo, Arianna; Boggs, Aarika C; Pourjavad, Payam; Byrne, Mark E

2014-08-01

24

ISO 14 001 at the farm level: Analysis of five methods for evaluating the environmental impact of agricultural practices  

Microsoft Academic Search

Faced with society's increasing expectations, the Common Agricultural Policy (CAP) review considers environmental management to be an ever more critical criterion in the allocation of farm subsidies. With the goal of evaluating the environmental friendliness of farm practices, France's agricultural research and extension services have built a range of agricultural/environmental diagnostic tools over recent years. The objective of the present paper

M. B. Galan; D. Peschard; H. Boizard

2007-01-01

25

Building environmental assessment methods: assessing construction practices  

Microsoft Academic Search

This paper focuses on the environmental issues associated with the building construction process and the way in which they are currently represented in building environmental assessment methods. The primary goal is to identify the practical and methodological reasons for their scant inclusion and to offer arguments to redress this situation. Despite the difficulties of assessing management practices, their inclusion within

Raymond J. Cole

2000-01-01

26

Cyclic spectral analysis in practice  

NASA Astrophysics Data System (ADS)

This paper addresses the spectral analysis of cyclostationary (CS) signals from a generic point of view, with the aim to provide the practical conditions of success in a wide range of applications, such as in mechanical vibrations and acoustics. Specifically, it points out the similarities, differences and potential pitfalls associated with cyclic spectral analysis as opposed to classical spectral analysis. It is shown that non-parametric cyclic spectral estimators can all be derived from a general quadratic form, which yields as particular cases "cyclic" versions of the smoothed, averaged, and multitaper periodograms. The performance of these estimators is investigated in detail on the basis of their frequency resolution, cyclic leakage, systematic and stochastic estimation errors. The results are then extended to more advanced spectral quantities such as the cyclic coherence function and the Wigner-Ville spectrum of CS signals. In particular an optimal estimator of the Wigner-Ville spectrum is found, with remarkable properties. Several examples of cyclic spectral analyses, with an emphasis on mechanical systems, are finally presented in order to illustrate the value of such a general treatment for practical applications.

Antoni, Jérôme

2007-02-01
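The quadratic-form estimators discussed above include a "cyclic" version of the averaged (Welch-type) periodogram: average X_k(f + α/2)·X_k*(f − α/2) over segments, which can be implemented by frequency-shifting the signal by ±α/2 before segmenting. The sketch below makes simplifying assumptions not fixed by the abstract (non-overlapping Hann-windowed segments, no zero-padding).

```python
import numpy as np

def averaged_cyclic_periodogram(x, alpha, fs=1.0, nperseg=256):
    """Estimate the cyclic spectrum S_x(alpha, f) by segment averaging."""
    n = len(x)
    t = np.arange(n) / fs
    # Frequency-shift by +alpha/2 and -alpha/2 using absolute time, which
    # keeps the inter-segment phase coherent: the average adds up at true
    # cyclic frequencies and averages out elsewhere.
    xp = x * np.exp(-1j * np.pi * alpha * t)
    xm = x * np.exp(+1j * np.pi * alpha * t)
    win = np.hanning(nperseg)
    scale = fs * (win ** 2).sum()
    nseg = n // nperseg
    acc = np.zeros(nperseg, dtype=complex)
    for k in range(nseg):
        s = slice(k * nperseg, (k + 1) * nperseg)
        acc += np.fft.fft(win * xp[s]) * np.conj(np.fft.fft(win * xm[s]))
    return np.fft.fftfreq(nperseg, 1 / fs), acc / (nseg * scale)
```

At α = 0 this reduces to the ordinary averaged periodogram, which is real and non-negative; at a true cyclic frequency, such as the modulation frequency of amplitude-modulated noise, the estimate stands well above the stochastic estimation floor the paper analyzes.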

27

Practical usability methods in website design  

Microsoft Academic Search

This tutorial presents a practical approach to applying usability methods to website design. Website projects are usually done on tight schedules, with limited resources, and without a well-defined approach for achieving usability. For many developers it's easy to dismiss usability methods as an unnecessary overhead cost. We demonstrate how usability methods can be integrated efficiently and effectively into each stage

Darren Gergle; Tom Brinck; Scott Wood

1999-01-01

28

Breastfeeding practices: does method of delivery matter?  

PubMed

The objective of this study was to assess the relationship between method of delivery and breastfeeding. Using data (2005-2006) from the longitudinal Infant Feeding Practices Study II (n = 3,026), we assessed the relationship between delivery method (spontaneous vaginal, induced vaginal, emergency cesarean, and planned cesarean) and breastfeeding: initiation, any breastfeeding at 4 weeks, any breastfeeding at 6 months, and overall duration. We used SAS software to analyze the data using multivariable analyses adjusting for several confounders, including selected demographic characteristics and participants' pre-delivery breastfeeding intentions and attitude, and used event-history analysis to estimate breastfeeding duration by delivery method. We found no significant association between delivery method and breastfeeding initiation. In the fully adjusted models examining breastfeeding duration to 4 weeks, with the spontaneous vaginal delivery group as the reference, those with induced vaginal deliveries were significantly less likely to breastfeed [adjusted odds ratio (AOR) = 0.53; 95% CI = 0.38-0.71], and no significant relationship was observed for those who had planned or emergency cesarean deliveries. Again, compared with the spontaneous vaginal delivery group, those with induced vaginal [AOR = 0.60; 95% CI = 0.47-0.78] and emergency cesarean [AOR = 0.68; 95% CI = 0.48-0.95] deliveries were significantly less likely to breastfeed at 6 months. Median breastfeeding duration was 45.2 weeks among women with spontaneous vaginal, 38.7 weeks among planned cesarean, 25.8 weeks among induced vaginal and 21.5 weeks among emergency cesarean deliveries. While no significant association was observed between delivery method and breastfeeding initiation, breastfeeding duration varied substantially with method of delivery, perhaps indicating a need for additional support for women with assisted deliveries. PMID:22926268

Ahluwalia, Indu B; Li, Ruowei; Morrow, Brian

2012-12-01

29

A Practical Method for Watermarking Java Programs  

Microsoft Academic Search

Java programs distributed through Internet are now suffering from program theft. It is because Java programs can be easily decomposed into reusable class files and even decompiled into source code by program users. In this paper we propose a practical method that discourages program theft by embedding Java programs with a digital watermark. Embedding a program developer's copyright notation as

Akito Monden; Hajimu Iida; Ken-ichi Matsumoto; Koji Torii; Katsuro Inoue

2000-01-01

30

A Method for Optimizing Waste Management and Disposal Practices Using a Group-Based Uncertainty Model for the Analysis of Characterization Data - 13191  

SciTech Connect

It is a universal requirement for characterization of radioactive waste, that the consignor shall calculate and report a Total Measurement Uncertainty (TMU) value associated with each of the measured quantities such as nuclide activity. For Non-destructive Assay systems, the TMU analysis is typically performed on an individual container basis. However, in many cases, the waste consignor treats, transports, stores and disposes of containers in groups for example by over-packing smaller containers into a larger container or emplacing containers into groups for final disposal. The current standard practice for container-group data analysis is usually to treat each container as independent and uncorrelated and use a simple summation / averaging method (or in some cases summation of TMU in quadrature) to define the overall characteristics and associated uncertainty in the container group. In reality, many groups of containers are assayed on the same system, so there will be a large degree of co-dependence in the individual uncertainty elements. Many uncertainty terms may be significantly reduced when addressing issues such as the source position and variability in matrix contents over large populations. The systematic terms encompass both inherently 'two-directional' random effects (e.g. variation of source position) and other terms that are 'one-directional' i.e. designed to account for potential sources of bias. An analysis has been performed with population groups of a variety of non-destructive assay platforms in order to define a quantitative mechanism for waste consignors to determine overall TMU for batches of containers that have been assayed on the same system. (authors)

Simpson, A.; Clapham, M.; Lucero, R.; West, J. [Pajarito Scientific Corporation, 2976 Rodeo Park Drive East, Santa Fe, NM 87505 (United States)]

2013-07-01

31

Evaluation of agricultural best-management practices in the Conestoga River headwaters, Pennsylvania; methods of data collection and analysis and description of study areas  

USGS Publications Warehouse

The U.S. Geological Survey is conducting a water-quality study as part of the nationally implemented Rural Clean Water Program in the headwaters of the Conestoga River, Pennsylvania. The study, which began in 1982, was designed to determine the effect of agricultural best-management practices on surface- and groundwater quality. The study was concentrated in four areas within the intensively farmed, carbonate rock terrane located predominantly in Lancaster County, Pennsylvania. These areas were divided into three monitoring components: (1) a Regional study area (188 sq mi); (2) a Small Watershed study area (5.82 sq mi); and (3) two field-site study areas, Field-Site 1 (22.1 acres) and Field-Site 2 (47.5 acres). The types of water-quality data and the methods of data collection and analysis are presented. The monitoring strategy and descriptions of the study areas are discussed. The locations and descriptions of all data-collection sites in the four study areas are provided. (USGS)

Chichester, Douglas C.

1988-01-01

32

A practical application of wavelet moment method on the quantitative analysis of Shuanghuanglian oral liquid based on three-dimensional fingerprint spectra.  

PubMed

Overlapping and shifted peaks and noise signals appear frequently in high performance liquid chromatography (HPLC) experiments. A practical application of the wavelet moment method to the quantitative analysis of the main active components in Shuanghuanglian oral liquid samples was presented, based on HPLC determination coupled with a photodiode array detector (PAD). The wavelet moments were calculated from the divided regions in the grayscale images of three-dimensional (3D) HPLC-PAD fingerprint spectra according to the target peak(s), and then used to establish linear models. The correlation coefficients (R) were more than 0.9980 within the test ranges. The intra- and inter-day variations were less than 1.13% and 1.10%, respectively. The recovery ranged from 96.2% to 102.7%. The overall LODs and LOQs were less than 0.2 µg/mL and 0.7 µg/mL, respectively. Our study indicated that the wavelet moment approach could resolve overlapping and shifted peaks and noise signals in the chromatographic determination owing to its multi-resolution and inherent invariance properties. Thus the analytical time was shortened, and the obtained results were reliable and accurate. PMID:24913368

Chen, Jing; Li, Bao Qiong; Zhai, Hong Lin; Lü, Wen Juan; Zhang, Xiao Yun

2014-07-25

33

The Sherlock Holmes method in clinical practice.  

PubMed

This article lists the integral elements of the Sherlock Holmes method, which is based on the intelligent collection of information through detailed observation, careful listening and thorough examination. The information thus obtained is analyzed to develop the main and alternative hypotheses, which are shaped during the deductive process until the key leading to the solution is revealed. The Holmes investigative method applied to clinical practice highlights the advisability of having physicians reason through and seek out the causes of the disease with the data obtained from acute observation, a detailed review of the medical history and careful physical examination. PMID:24457141

Sopeña, B

2014-04-01

34

Visionlearning: Research Methods: The Practice of Science  

NSDL National Science Digital Library

This instructional module introduces four types of research methods: experimentation, description, comparison, and modeling. It was developed to help learners understand that the classic definition of the "scientific method" does not capture the dynamic nature of science investigation. As learners explore each methodology, they develop an understanding of why scientists use multiple methods to gather data and develop hypotheses. It is appropriate for introductory physics courses and for teachers seeking content support in research practices. Editor's Note: Secondary students often cling to the notion that scientific research follows a stock, standard "scientific method". They may be unaware of the differences between experimental research, correlative studies, observation, and computer-based modeling research. In this resource, they can glimpse each methodology in the context of a real study done by respected scientists. This resource is part of Visionlearning, an award-winning set of classroom-tested modules for science education.

Carpi, Anthony; Egger, Anne

35

Practical Considerations for Using Exploratory Factor Analysis in Educational Research  

ERIC Educational Resources Information Center

The uses and methodology of factor analysis are widely debated and discussed, especially the issues of rotational use, methods of confirmatory factor analysis, and adequate sample size. The variety of perspectives and often conflicting opinions can lead to confusion among researchers about best practices for using factor analysis. The focus of the…

Beavers, Amy S.; Lounsbury, John W.; Richards, Jennifer K.; Huck, Schuyler W.; Skolits, Gary J.; Esquivel, Shelley L.

2013-01-01

36

Practical method for balancing airplane moments  

NASA Technical Reports Server (NTRS)

The present contribution is the sequel to a paper written by Messrs. R. Fuchs, L. Hopf, and H. Hamburger, and proposes to show that the methods therein contained can be practically utilized in computations. Furthermore, the calculations leading up to the diagram of moments for three airplanes, whose performance in war service gave reason for complaint, are analyzed. Finally, it is shown what conclusions can be drawn from the diagram of moments with regard to the defects in these planes and what steps may be taken to remedy them.

Hamburger, H

1924-01-01

37

Product Line Analysis: A Practical Introduction.  

National Technical Information Service (NTIS)

Product line analysis applies established modeling techniques to engineer the requirements for a product line of software-intensive systems. This report provides a practical introduction to product line requirements modeling. It describes product line ana...

G. Chastek, P. Donohoe, K. C. Kang, S. Thiel

2001-01-01

38

Practical applications of spectral analysis to hydrologic time series  

Microsoft Academic Search

An Erratum has been published for this article in Hydrological Processes 17(4) 2003, 883. Fourier-transform-based spectral analysis and filtering techniques, although potentially very useful, have seen little practical application in hydrology. We provide an overview of the Fourier transform and spectral analysis and present examples of how these methods may be applied to practical hydrologic problems: determination of the frequency content
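As a rough illustration of the kind of spectral analysis this abstract describes, the sketch below computes a naive discrete-Fourier-transform power spectrum of a synthetic periodic series and picks out its dominant frequency. Everything here is invented for illustration; nothing is drawn from the article itself:

```python
import cmath
import math

def power_spectrum(x):
    """Naive DFT power spectrum, O(n^2); adequate for short records."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(n // 2 + 1)]

# Synthetic "seasonal" series: one cycle every 8 samples, 32 samples total
series = [math.sin(2 * math.pi * t / 8) for t in range(32)]
spec = power_spectrum(series)
dominant = max(range(1, len(spec)), key=spec.__getitem__)
# dominant == 4: four cycles per 32 samples, i.e. a period of 8 samples
```

For real hydrologic records one would use an FFT (O(n log n)) and attend to detrending, windowing, and unevenly spaced data, which is part of what makes practical application nontrivial.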

Sean W. Fleming; A. Marsh Lavenue; Alaa H. Aly; Alison Adams

2002-01-01

39

Nanodispersed powders: Synthesis methods and practical applications  

Microsoft Academic Search

A comparative analysis of the methods and technologies used to produce nanopowders is performed. The fundamentals of both the traditional technology for chemically synthesizing inorganic nanocrystalline particles of metal oxides by the crystallization of precursors (amorphous metal hydroxides obtained by chemical coprecipitation from salts) and the most promising methods for producing nanopowders and materials on their basis (including the dispersion

P. A. Storozhenko; Sh. L. Guseinov; S. I. Malashin

2009-01-01

40

Analysis of Overseas Shipping Practices.  

National Technical Information Service (NTIS)

This analysis addresses the issue of whether shipping freight to the Military Traffic Management Command's (MTMC) Container Stuffing Activities (CSAs) would be a more cost effective way for the Defense Logistics Agency (DLA) to containerize cargo for surf...

M. Kleinhenz

1994-01-01

41

Practical Analysis of Nutritional Data  

NSDL National Science Digital Library

This online textbook, created by faculty members at Tulane University, provides information on the statistical analysis of nutritional data. Techniques covered include data cleaning, descriptive statistics, histograms, graphics, scatterplots, outlier identification, regression and correlation, confounding, and interactions. Each chapter includes exercises with real data and self-tests to be used with SPSS. Additionally, the site contains information on using SPSS for statistical testing, the basics of Epi Info, and the basics of Stat Analysis. The program requires Windows 95 or later.

2009-03-12

42

Commonality Analysis: A Practical Example.  

ERIC Educational Resources Information Center

Commonality analysis was used to look for school effects in gains in reading test scores for 877 fourth to sixth grade children in Elementary Secondary Education Act Title I remedial reading programs. The four groups of predictor variables that were investigated were background, mental ability, parental involvement, and school program. Commonality…

DeVito, Pasquale J.

43

Systemic accident analysis: examining the gap between research and practice.  

PubMed

The systems approach is arguably the dominant concept within accident analysis research. Viewing accidents as a result of uncontrolled system interactions, it forms the theoretical basis of various systemic accident analysis (SAA) models and methods. Despite the proposed benefits of SAA, such as an improved description of accident causation, evidence within the scientific literature suggests that these techniques are not being used in practice and that a research-practice gap exists. The aim of this study was to explore the issues stemming from research and practice which could hinder the awareness, adoption and usage of SAA. To achieve this, semi-structured interviews were conducted with 42 safety experts from ten countries and a variety of industries, including rail, aviation and maritime. This study suggests that the research-practice gap should be closed and efforts to bridge the gap should focus on ensuring that systemic methods meet the needs of practitioners and improving the communication of SAA research. PMID:23542136

Underwood, Peter; Waterson, Patrick

2013-06-01

44

An Online Forum As a Qualitative Research Method: Practical Issues  

PubMed Central

Background Despite positive aspects of online forums as a qualitative research method, very little is known about practical issues involved in using online forums for data collection, especially for a qualitative research project. Objectives The purpose of this paper is to describe the practical issues that the researchers encountered in implementing an online forum as a qualitative component of a larger study on cancer pain experience. Method Throughout the study process, the research staff recorded issues ranging from minor technical problems to serious ethical dilemmas as they arose and wrote memos about them. The memos and written records of discussions were reviewed and analyzed using the content analysis suggested by Weber. Results Two practical issues related to credibility were identified: a high response and retention rate and automatic transcripts. An issue related to dependability was the participants' easy forgetfulness. The issues related to confirmability were difficulties in theoretical saturation and unstandardized computer and Internet jargon. A security issue related to hacking attempts was noted as well. Discussion The analysis of these issues suggests several implications for future researchers who want to use online forums as a qualitative data collection method.

Im, Eun-Ok; Chee, Wonshik

2008-01-01

45

A collection of research reporting, theoretical analysis, and practical applications in science education: Examining qualitative research methods, action research, educator-researcher partnerships, and constructivist learning theory  

NASA Astrophysics Data System (ADS)

Educator-researcher partnerships are increasingly being used to improve the teaching of science. Chapter 1 provides a summary of the literature concerning partnerships, and examines the justification of qualitative methods in studying these relationships. It also justifies the use of Participatory Action Research (PAR). Empirically-based studies of educator-researcher partnership relationships are rare despite investments in their implementation by the National Science Foundation (NSF) and others. Chapter 2 describes a qualitative research project in which participants in an NSF GK-12 fellowship program were studied using informal observations, focus groups, personal interviews, and journals to identify and characterize the cultural factors that influenced the relationships between the educators and researchers. These factors were organized into ten critical axes encompassing a range of attitudes, behaviors, or values defined by two stereotypical extremes. These axes were: (1) Task Dictates Context vs. Context Dictates Task; (2) Introspection vs. Extroversion; (3) Internal vs. External Source of Success; (4) Prior Planning vs. Implementation Flexibility; (5) Flexible vs. Rigid Time Sense; (6) Focused Time vs. Multi-tasking; (7) Specific Details vs. General Ideas; (8) Critical Feedback vs. Encouragement; (9) Short Procedural vs. Long Content Repetition; and (10) Methods vs. Outcomes are Well Defined. Another ten important stereotypical characteristics, which did not fit the structure of an axis, were identified and characterized. The educator stereotypes were: (1) Rapport/Empathy; (2) Like Kids; (3) People Management; (4) Communication Skills; and (5) Entertaining. The researcher stereotypes were: (1) Community Collaboration; (2) Focus Intensity; (3) Persistent; (4) Pattern Seekers; and (5) Curiosity/Skeptical. Chapter 3 summarizes the research presented in chapter 2 into a practical guide for participants and administrators of educator-researcher partnerships. 
Understanding how to identify and evaluate constructivist lessons is the first step in promoting and improving constructivism in teaching. Chapter 4 summarizes a theoretically-generated series of practical criteria that define constructivism: (1) Eliciting Prior Knowledge, (2) Creating Cognitive Dissonance, (3) Application of New Knowledge with Feedback, and (4) Reflection on Learning, or Metacognition. These criteria can be used by any practitioner to evaluate the level of constructivism used in a given lesson or activity.

Hartle, R. Todd

46

PRACTICAL STEREOLOGICAL METHODS FOR MORPHOMETRIC CYTOLOGY  

Microsoft Academic Search

Stereological principles provide efficient and reliable tools for the determination of quantitative parameters of tissue structure on sections. Some principles which allow the estimation of volumetric ratios, surface areas, surface-to-volume ratios, thicknesses of tissue or cell sheets, and the number of structures are reviewed and presented in general form; means for their practical application in electron microscopy are outlined.

Ewald R. Weibel; S. Kistler; Walter F. Scherle

2009-01-01

47

Science Teaching Methods: A Rationale for Practices  

ERIC Educational Resources Information Center

This article is a version of the talk given by Jonathan Osborne as the Association for Science Education (ASE) invited lecturer at the National Science Teachers' Association Annual Convention in San Francisco, USA, in April 2011. The article provides an explanatory justification for teaching about the practices of science in school science that…

Osborne, Jonathan

2011-01-01

48

PRACTICAL STEREOLOGICAL METHODS FOR MORPHOMETRIC CYTOLOGY  

PubMed Central

Stereological principles provide efficient and reliable tools for the determination of quantitative parameters of tissue structure on sections. Some principles which allow the estimation of volumetric ratios, surface areas, surface-to-volume ratios, thicknesses of tissue or cell sheets, and the number of structures are reviewed and presented in general form; means for their practical application in electron microscopy are outlined. The systematic and statistical errors involved in such measurements are discussed.
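One of the stereological tools the abstract refers to, the point-counting estimate of a volumetric ratio (the Delesse principle, V_V ≈ P_P), is simple enough to sketch. The function and the counts below are hypothetical illustrations, not values from the paper:

```python
import math

def volume_fraction(points_on_phase, total_points):
    """Point-counting estimate of a volumetric ratio (Delesse principle):
    the volume fraction V_V of a structure is estimated by the fraction
    P_P of test-grid points that hit it, with a binomial standard error."""
    p = points_on_phase / total_points
    se = math.sqrt(p * (1 - p) / total_points)
    return p, se

# 120 of 400 grid points fall on the structure of interest (hypothetical):
vv, se = volume_fraction(120, 400)
# vv = 0.30; se ≈ 0.023, i.e. V_V ≈ 30% ± 2.3% (1 SE)
```

The statistical-error discussion in the paper covers exactly this kind of sampling uncertainty, plus the systematic errors that point counting alone does not capture.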

Weibel, Ewald R.; Kistler, Gonzague S.; Scherle, Walter F.

1966-01-01

49

The Construction of Practical General Linear Methods  

Microsoft Academic Search

Using the property of 'inherent Runge-Kutta stability', it is possible to construct diagonally implicit general linear methods with stability regions exactly the same as for Runge-Kutta methods. In addition to A-stable methods found in this way, it is also possible to construct explicit methods with stability regions identical to those of explicit Runge-Kutta methods. The use of doubly companion matrices
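For context on the stability regions being matched here, a short sketch of the linear stability function of the classical explicit RK4 method on the scalar test equation y' = λy. This is standard textbook material, not material from the article:

```python
def rk4_stability(z):
    """Linear stability function of the classical explicit RK4 method:
    one RK4 step applied to y' = lam*y multiplies y by R(z), z = h*lam."""
    return 1 + z + z**2 / 2 + z**3 / 6 + z**4 / 24

# A step size is linearly stable when |R(h*lam)| <= 1:
inside = abs(rk4_stability(-1.0)) <= 1    # True:  z = -1 is inside the region
outside = abs(rk4_stability(-3.0)) <= 1   # False: z = -3 is outside
```

A general linear method with 'inherent Runge-Kutta stability' is constructed so that its stability matrix reduces to a scalar function of this same form.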

J. C. Butcher; W. M. Wright

2003-01-01

50

Method of analysis and quality-assurance practices by the U. S. Geological Survey Organic Geochemistry Research Group; determination of four selected mosquito insecticides and a synergist in water using liquid-liquid extraction and gas chrom  

USGS Publications Warehouse

A method of analysis and quality-assurance practices were developed for the determination of four mosquito insecticides (malathion, methoprene, phenothrin, and resmethrin) and one synergist (piperonyl butoxide) in water. The analytical method uses liquid-liquid extraction (LLE) and gas chromatography/mass spectrometry (GC/MS). Good precision and accuracy were demonstrated in reagent water, urban surface water, and ground water. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 50 nanograms per liter ranged from 68 to 171 percent, with standard deviations in concentrations of 27 nanograms per liter or less. The method detection limit for all compounds was 5.9 nanograms per liter or less for 247-milliliter samples. This method is valuable for acquiring information about the fate and transport of these mosquito insecticides and one synergist in water.
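The recovery and method-detection-limit figures quoted in abstracts like this one follow standard calculations, sketched below in the style of the EPA procedure (MDL = Student's t at n-1 degrees of freedom, 99% confidence, times the replicate standard deviation). All replicate values here are hypothetical, not data from the study:

```python
import statistics

def percent_recovery(measured_mean, spiked):
    """Spike recovery expressed as a percentage of the true concentration."""
    return 100.0 * measured_mean / spiked

def method_detection_limit(replicates, t_99):
    """EPA-style MDL: Student's t (n-1 df, 99% one-sided) times the
    standard deviation of replicate low-level spikes. The caller must
    supply the t value matching the replicate count."""
    return t_99 * statistics.stdev(replicates)

# Seven hypothetical replicate spikes at 10 ng/L; t(6 df, 99%) = 3.143
reps = [9.1, 10.4, 9.8, 10.9, 9.5, 10.2, 9.9]
mdl = method_detection_limit(reps, t_99=3.143)       # ≈ 1.9 ng/L
rec = percent_recovery(statistics.mean(reps), 10.0)  # ≈ 99.7 %
```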

Zimmerman, L. R.; Strahan, A. P.; Thurman, E. M.

2001-01-01

51

The 5-Step Method: Principles and Practice  

ERIC Educational Resources Information Center

This article includes a description of the 5-Step Method. First, the origins and theoretical basis of the method are briefly described. This is followed by a discussion of the general principles that guide the delivery of the method. Each step is then described in more detail, including the content and focus of each of the five steps that include:…

Copello, Alex; Templeton, Lorna; Orford, Jim; Velleman, Richard

2010-01-01

52

Qualitative data analysis: conceptual and practical considerations.  

PubMed

Qualitative inquiry requires that collected data is organised in a meaningful way, and this is referred to as data analysis. Through analytic processes, researchers turn what can be voluminous data into understandable and insightful analysis. This paper sets out the different approaches that qualitative researchers can use to make sense of their data including thematic analysis, narrative analysis, discourse analysis and semiotic analysis and discusses the ways that qualitative researchers can analyse their data. I first discuss salient issues in performing qualitative data analysis, and then proceed to provide some suggestions on different methods of data analysis in qualitative research. Finally, I provide some discussion on the use of computer-assisted data analysis. PMID:19642962

Liamputtong, Pranee

2009-08-01

53

Method of analysis at the U.S. Geological Survey California Water Science Center, Sacramento Laboratory - determination of haloacetic acid formation potential, method validation, and quality-control practices  

USGS Publications Warehouse

An analytical method for the determination of haloacetic acid formation potential of water samples has been developed by the U.S. Geological Survey California Water Science Center Sacramento Laboratory. The haloacetic acid formation potential is measured by dosing water samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine. The haloacetic acids formed are bromochloroacetic acid, bromodichloroacetic acid, dibromochloroacetic acid, dibromoacetic acid, dichloroacetic acid, monobromoacetic acid, monochloroacetic acid, tribromoacetic acid, and trichloroacetic acid. They are extracted, methylated, and then analyzed using a gas chromatograph equipped with an electron capture detector. Method validation experiments were performed to determine the method accuracy, precision, and detection limit for each of the compounds. Method detection limits for these nine haloacetic acids ranged from 0.11 to 0.45 microgram per liter. Quality-control practices include the use of blanks, quality-control samples, calibration verification standards, surrogate recovery, internal standard, matrix spikes, and duplicates.

Zazzi, Barbara C.; Crepeau, Kathryn L.; Fram, Miranda S.; Bergamaschi, Brian A.

2005-01-01

54

Practice-Near and Practice-Distant Methods in Human Services Research  

ERIC Educational Resources Information Center

This article discusses practice-near research in human services, a cluster of methodologies that may include thick description, intensive reflexivity, and the study of emotional and relational processes. Such methods aim to get as near as possible to experiences at the relational interface between institutions and the practice field.…

Froggett, Lynn; Briggs, Stephen

2012-01-01

55

Practical reconstruction method for bioluminescence tomography  

Microsoft Academic Search

Bioluminescence tomography (BLT) is used to localize and quantify bioluminescent sources in a small living animal. By advancing bioluminescent imaging to a tomographic framework, it helps to diagnose diseases, monitor therapies and facilitate drug development. In this paper, we establish a direct linear relationship between measured surface photon density and an unknown bioluminescence source distribution by using a finite-element method

Wenxiang Cong; Ge Wang; Durairaj Kumar; Yi Liu; Ming Jiang; Lihong V. Wang; Eric A. Hoffman; Geoffrey McLennan; Paul B. McCray; Joseph Zabner; Alexander Cong

2005-01-01

56

Stochastic Methods for Practical Global Optimization  

Microsoft Academic Search

Engineering design problems often involve global optimization of functions that are supplied as ‘black box’ functions. These functions may be nonconvex, nondifferentiable and even discontinuous. In addition, the decision variables may be a combination of discrete and continuous variables. The functions are usually computationally expensive, and may involve finite element methods. An engineering example of this type of problem is
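As a minimal sketch of a stochastic method for the black-box objectives described above, pure random search is shown below on a standard nonconvex test function, which stands in for an expensive engineering simulation. The function names and settings are illustrative assumptions:

```python
import math
import random

def pure_random_search(f, bounds, n_samples=5000, seed=0):
    """Pure random search: sample uniformly inside the box `bounds` and
    keep the best point seen. Only black-box evaluations of f are needed,
    so nonconvex, nondifferentiable or discontinuous objectives are fine."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_samples):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

def rastrigin(x):
    """Standard multimodal test function; global minimum 0 at the origin."""
    return sum(xi * xi - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

x_best, f_best = pure_random_search(rastrigin, [(-5.12, 5.12)] * 2)
# f_best approaches (but rarely hits) the global minimum of 0
```

More sophisticated stochastic methods (simulated annealing, adaptive random search) bias the sampling toward promising regions, trading simplicity for fewer expensive evaluations.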

Zelda B. Zabinsky

1998-01-01

57

Optimizing Distributed Practice: Theoretical Analysis and Practical Implications  

ERIC Educational Resources Information Center

More than a century of research shows that increasing the gap between study episodes using the same material can enhance retention, yet little is known about how this so-called distributed practice effect unfolds over nontrivial periods. In two three-session laboratory studies, we examined the effects of gap on retention of foreign vocabulary,…

Cepeda, Nicholas J.; Coburn, Noriko; Rohrer, Doug; Wixted, John T.; Mozer, Michael C.; Pashler, Harold

2009-01-01

58

Multi-criteria decision analysis: Limitations, pitfalls, and practical difficulties  

SciTech Connect

The 2002 Winter Olympics women's figure skating competition is used as a case study to illustrate some of the limitations, pitfalls, and practical difficulties of Multi-Criteria Decision Analysis (MCDA). The paper compares several widely used models for synthesizing the multiple attributes into a single aggregate value. The various MCDA models can provide conflicting rankings of the alternatives for a common set of information even under states of certainty. Analysts involved in MCDA need to deal with the following challenging tasks: (1) selecting an appropriate analysis method, and (2) properly interpreting the results. An additional trap is the availability of software tools that implement specific MCDA models that can beguile the user with quantitative scores. These conclusions are independent of the decision domain and they should help foster better MCDA practices in many fields including systems engineering trade studies.
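The abstract's central point, that different MCDA aggregation models can rank the same alternatives differently on identical inputs, is easy to reproduce. The scores below are invented for illustration, not taken from the figure-skating case study:

```python
import math

def weighted_sum(scores, weights):
    """Additive aggregation (weighted-sum model)."""
    return sum(s * w for s, w in zip(scores, weights))

def weighted_product(scores, weights):
    """Multiplicative aggregation (weighted-product model)."""
    return math.prod(s ** w for s, w in zip(scores, weights))

weights = [0.5, 0.5]      # two equally weighted criteria
skater_a = [0.9, 0.1]     # strong on one criterion, weak on the other
skater_b = [0.45, 0.45]   # balanced on both

a_beats_b_additive = weighted_sum(skater_a, weights) > weighted_sum(skater_b, weights)
a_beats_b_multiplicative = (weighted_product(skater_a, weights)
                            > weighted_product(skater_b, weights))
# The additive model ranks A first; the multiplicative model ranks B first.
```

The multiplicative model penalizes unbalanced performance, so the choice of aggregation rule is itself a value judgment, which is exactly the interpretive trap the paper warns about.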

Kujawski, Edouard

2003-02-01

59

Deepen the GIS spatial analysis theory studying through the gradual process of practice  

NASA Astrophysics Data System (ADS)

Spatial analysis is a core topic of the basic GIS theory course. In this paper, the importance of practice teaching for the study of GIS spatial analysis theory, and a method for implementing it, are discussed in the context of the practice teaching arrangement for spatial analysis in the course "GIS theory and practice", based on the basic principles of procedural teaching theory and its teaching model. In addition, a concrete, gradual practice process is described in four aspects. In this way, the study of GIS spatial analysis theory can be deepened and the cultivation of students' comprehensive ability in Geography Science can be strengthened.

Yi, Y. G.; Liu, H. P.; Liu, X. P.

2014-04-01

60

Surface analysis methods  

SciTech Connect

This workshop proposes a strategy for improving measurements of dry deposition so that the uncertainty in dry deposition is no longer the dominant uncertainty regarding model predictions of the fate and effects of dry deposition. This near term need of National Acid Precipitation Assessment Program provided guidance for the activities of the group during the workshop. The objectives of the panel on surface analysis methods were as follows: (1) Critique various surface analysis approaches for their capabilities in routine monitoring and research; and determine their limitations, with respect to addressable chemical species, accuracy, and results expected by 1990; (2) develop a program for the remainder of NAPAP to address needs in dry deposition measurements, and an extension of this program beyond NAPAP in the general area of atmosphere/canopy exchange processes.

Lindberg, S.E.; Davidson, C.I.; Bondietti, E.A.; Graustein, W.C.; Livingston, R.; Lovett, G.; Peters, J.

1986-01-01

61

Constructivist Methods of Marital and Family Therapy: A Practical Precis.  

ERIC Educational Resources Information Center

Provides practical review of selected methods of counseling from constructivist orientation: use of repertory grid technique, metaphorical constructions, family transaction procedure, system "bow ties," and various enactment procedures. Provides examples of methods in counseling contexts and further references to additional illustrations and…

Neimeyer, Greg J.; Neimeyer, Robert A.

1994-01-01

62

A Practical Method of Constructing Quantum Combinational Logic Circuits  

Microsoft Academic Search

We describe a practical method of constructing quantum combinational logic circuits with basic quantum logic gates such as NOT and general $n$-bit Toffoli gates. This method is useful to find the quantum circuits for evaluating logic functions in the form most appropriate for implementation on a given quantum computer. The rules to get the most efficient circuit are utilized best
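A small sketch of how a generalized n-control Toffoli gate acts on computational basis states, the classical reversible-logic view that underlies circuits built from NOT and Toffoli gates. This is a generic illustration; the article's circuit-construction rules are not reproduced here:

```python
def toffoli(bits, controls, target):
    """Apply a generalized (n-control) Toffoli gate to a tuple of
    classical basis-state bits: flip `target` iff every control is 1.
    On basis states the unitary acts as exactly this permutation."""
    bits = list(bits)
    if all(bits[c] for c in controls):
        bits[target] ^= 1
    return tuple(bits)

# Standard CCNOT truth table: the target (bit 2) flips only for input 11x
assert toffoli((1, 1, 0), controls=[0, 1], target=2) == (1, 1, 1)
assert toffoli((1, 0, 0), controls=[0, 1], target=2) == (1, 0, 0)

# A NOT gate is the zero-control special case:
assert toffoli((0,), controls=[], target=0) == (1,)
```

Because Toffoli gates are universal for reversible classical logic, any Boolean function can be evaluated by composing such permutations, which is the starting point for the construction method the abstract describes.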

Jae-Seung Lee; Yongwook Chung; Jaehyun Kim; Soonchil Lee

1999-01-01

63

A collection of research reporting, theoretical analysis, and practical applications in science education: Examining qualitative research methods, action research, educator-researcher partnerships, and constructivist learning theory  

Microsoft Academic Search

Educator-researcher partnerships are increasingly being used to improve the teaching of science. Chapter 1 provides a summary of the literature concerning partnerships, and examines the justification of qualitative methods in studying these relationships. It also justifies the use of Participatory Action Research (PAR). Empirically-based studies of educator-researcher partnership relationships are rare despite investments in their implementation by the National Science

R. Todd Hartle

2007-01-01

64

Aircraft accidents : method of analysis  

NASA Technical Reports Server (NTRS)

This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

1929-01-01

65

Methods of analysis and quality-assurance practices of the U.S. Geological Survey organic laboratory, Sacramento, California; determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry  

USGS Publications Warehouse

Analytical method and quality-assurance practices were developed for a study of the fate and transport of pesticides in the Sacramento-San Joaquin Delta and the Sacramento and San Joaquin Rivers. Water samples were filtered to remove suspended particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide, and the pesticides were eluted with three 2-milliliter aliquots of hexane:diethyl ether (1:1). The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for analytes determined from 1,500-milliliter samples ranged from 0.006 to 0.047 microgram per liter. Recoveries ranged from 47 to 89 percent for 12 pesticides in organic-free, Sacramento River, and San Joaquin River water samples fortified at 0.05 and 0.26 microgram per liter. The method was modified to improve the pesticide recovery by reducing the sample volume to 1,000 milliliters. Internal standards were added to improve quantitative precision and accuracy. The analysis also was expanded to include a total of 21 pesticides. The method detection limits for 1,000-milliliter samples ranged from 0.022 to 0.129 microgram per liter. Recoveries ranged from 38 to 128 percent for 21 pesticides in organic-free, Sacramento River, and San Joaquin River water samples fortified at 0.10 and 0.75 microgram per liter.

Crepeau, Kathryn L.; Domagalski, Joseph L.; Kuivila, Kathryn M.

1994-01-01

66

Practicing the practice: Learning to guide elementary science discussions in a practice-oriented science methods course  

NASA Astrophysics Data System (ADS)

University methods courses are often criticized for telling pre-service teachers, or interns, about the theories behind teaching instead of preparing them to actually enact teaching. Shifting teacher education to be more "practice-oriented," or to focus more explicitly on the work of teaching, is a current trend for re-designing the way we prepare teachers. This dissertation addresses the current need for research that unpacks the shift to more practice-oriented approaches by studying the content and pedagogical approaches in a practice-oriented, master's-level elementary science methods course (n=42 interns). The course focused on preparing interns to guide science classroom discussions. Qualitative data, such as video records of course activities and interns' written reflections, were collected across eight course sessions. Codes were applied at the sentence and paragraph level and then grouped into themes. Five content themes were identified: foregrounding student ideas and questions, steering discussion toward intended learning goals, supporting students to do the cognitive work, enacting the teacher role of facilitator, and creating a classroom culture for science discussions. Three pedagogical approach themes were identified. First, the teacher educators created images of science discussions by modeling and showing videos of this practice. They also provided focused teaching experiences by helping interns practice the interactive aspects of teaching both in the methods classroom and with smaller groups of elementary students in schools. Finally, they structured the planning and debriefing phases of teaching so interns could learn from their teaching experiences and prepare well for future experiences. The findings were analyzed through the lens of Grossman and colleagues' framework for teaching practice (2009) to reveal how the pedagogical approaches decomposed, represented, and approximated practice throughout course activities. 
Also, the teacher educators' purposeful use of both pedagogies of investigation (to study teaching) and pedagogies of enactment (to practice enacting teaching) was uncovered. This work provides insights for the design of courses that prepare interns to translate theories about teaching into the interactive work teachers actually do. Also, it contributes to building a common language for talking about the content of practice-oriented courses and for comparing the affordances and limitations of pedagogical approaches across teacher education settings.

Shah, Ashima Mathur

67

Social Network Analysis as an Analytic Tool for Interaction Patterns in Primary Care Practices  

PubMed Central

PURPOSE Social network analysis (SNA) provides a way of quantitatively analyzing relationships among people or other information-processing agents. Using 2 practices as illustrations, we describe how SNA can be used to characterize and compare communication patterns in primary care practices. METHODS Based on data from ethnographic field notes, we constructed matrices identifying how practice members interact when practice-level decisions are made. SNA software (UCINet and KrackPlot) calculates quantitative measures of network structure including density, centralization, hierarchy and clustering coefficient. The software also generates a visual representation of networks through network diagrams. RESULTS The 2 examples show clear distinctions between practices for all the SNA measures. Potential uses of these measures for analysis of primary care practices are described. CONCLUSIONS SNA can be useful for quantitative analysis of interaction patterns that can distinguish differences among primary care practices.
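As a rough illustration of the measures the abstract names, the sketch below computes network density and Freeman degree centralization for a small, made-up practice communication network. The study used UCINet and KrackPlot; networkx stands in here, and the practice members and ties are hypothetical.

```python
import networkx as nx

# Hypothetical communication network for a small primary care practice
# (edges: who talks with whom when practice-level decisions are made).
G = nx.Graph()
G.add_edges_from([
    ("lead_physician", "office_manager"),
    ("lead_physician", "nurse"),
    ("lead_physician", "physician_2"),
    ("office_manager", "receptionist"),
])

n = G.number_of_nodes()
density = nx.density(G)  # actual edges / possible edges

# Freeman degree centralization: how dominated the network is by its
# most connected member (1.0 for a perfect star graph).
degrees = dict(G.degree())
max_deg = max(degrees.values())
centralization = sum(max_deg - d for d in degrees.values()) / ((n - 1) * (n - 2))

print(density, centralization)
```

A denser, less centralized network suggests decision-making communication spread across more practice members rather than flowing through one hub.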

Scott, John; Tallia, Alfred; Crosson, Jesse C.; Orzano, A. John; Stroebel, Christine; DiCicco-Bloom, Barbara; O'Malley, Dena; Shaw, Eric; Crabtree, Benjamin

2005-01-01

68

Practical challenges in the method of controlled Lagrangians  

NASA Astrophysics Data System (ADS)

The method of controlled Lagrangians is an energy shaping control technique for underactuated Lagrangian systems. Energy shaping control design methods are appealing as they retain the underlying nonlinear dynamics and can provide stability results that hold over a larger domain than can be obtained using linear design and analysis. The objective of this dissertation is to identify the control challenges in applying the method of controlled Lagrangians to practical engineering problems and to suggest ways to enhance the closed-loop performance of the controller. This dissertation describes a procedure for incorporating artificial gyroscopic forces in the method of controlled Lagrangians. Allowing these energy-conserving forces in the closed-loop system provides greater freedom in tuning closed-loop system performance and expands the class of eligible systems. In energy shaping control methods, physical dissipation terms that are neglected in the control design may enter the system in a way that can compromise stability. This is well illustrated through the "ball on a beam" example. The effect of physical dissipation on the closed-loop dynamics is studied in detail and conditions for stability in the presence of natural damping are discussed. The control technique is applied to the classic "inverted pendulum on a cart" system. A nonlinear controller is developed which asymptotically stabilizes the inverted equilibrium at a specific cart position for the conservative dynamic model. The region of attraction contains all states for which the pendulum is elevated above the horizontal plane. Conditions for asymptotic stability in the presence of linear damping are developed. The nonlinear controller is validated through experiments. Experimental cart damping is best modeled using static and Coulomb friction. Experiments show that static and Coulomb friction degrade the closed-loop performance and induce limit cycles.
A Lyapunov-based switching controller is proposed and successfully implemented to suppress the limit cycle oscillations. The Lyapunov-based controller switches between the energy shaping nonlinear controller, for states away from the equilibrium, and a well-tuned linear controller, for states close to the equilibrium. The method of controlled Lagrangians is applied to vehicle systems with internal moving point mass actuators. Applications of moving mass actuators include certain spacecraft, atmospheric re-entry vehicles, and underwater vehicles. Control design using moving mass actuators is challenging; the system is often underactuated and multibody dynamic models are higher dimensional. We consider two examples to illustrate the application of controlled Lagrangian formulation. The first example is a spinning disk, a simplified, planar version of a spacecraft spin stabilization problem. The second example is a planar, streamlined underwater vehicle.
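The switching idea described above can be sketched in a few lines: apply the nonlinear energy-shaping law far from the equilibrium and a well-tuned linear law close to it. This is a toy illustration only, not the dissertation's controller; the gains, the stand-in control laws, and the switching radius are all made-up values.

```python
import numpy as np

K_lin = np.array([10.0, 2.0])          # hypothetical linear feedback gains

def u_linear(x):
    # well-tuned linear state feedback, valid near the equilibrium
    return -K_lin @ x

def u_energy_shaping(x):
    # stand-in for the nonlinear energy-shaping control law
    return -5.0 * np.tanh(x[0]) - 1.0 * x[1]

def switching_control(x, r_switch=0.3):
    # Lyapunov-style switch on the distance to the equilibrium x = 0:
    # linear law inside the ball, nonlinear law outside it
    if np.linalg.norm(x) < r_switch:
        return u_linear(x)
    return u_energy_shaping(x)

print(switching_control(np.array([0.1, 0.0])))   # linear law active
print(switching_control(np.array([2.0, 0.0])))   # nonlinear law active
```

In practice the switch is tied to a Lyapunov function level set rather than a plain norm, so stability is preserved across the handoff; the norm threshold here only conveys the structure.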

Chevva, Konda Reddy

69

A Practical Eye State Recognition Based Driver Fatigue Detection Method  

Microsoft Academic Search

Driving fatigue detection is a key technique in vehicle active safety. In this paper, a practical driver fatigue detection algorithm is proposed. It employs sequential detection and temporal tracking to detect the human face, combining the strengths of the Adaboost and mean-shift algorithms, and a morphologic filter method is given to localize the pair of eyes in the detected face area.

Huan Wang; Yong Cheng; Qiong Wang; Mingwu Ren; Chunxia Zhao; Jingyu Yang

2009-01-01

70

A Practical Method for Cable Failure Rate Modeling  

Microsoft Academic Search

As underground cables continue to increase in age, most utilities are experiencing an increase in underground cable failures. Since cable replacement programs are expensive, it is important to understand the impact that age and other cable characteristics have on cable failure rates. This paper presents a practical method to model cable failure rates categorized by cable features. It addresses the

Yujia Zhou; Richard E. Brown

2006-01-01

71

Practical methods for minimizing embedding impact in steganography  

NASA Astrophysics Data System (ADS)

In this paper, we propose a general framework and practical coding methods for constructing steganographic schemes that minimize the statistical impact of embedding. By associating a cost of an embedding change with every element of the cover, we first derive bounds on the minimum theoretically achievable embedding impact and then propose a framework to achieve it in practice. The method is based on syndrome codes with low-density generator matrices (LDGM). The problem of optimally encoding a message (e.g., with the smallest embedding impact) requires a binary quantizer that performs near the rate-distortion bound. We implement this quantizer using LDGM codes with a survey propagation message-passing algorithm. Since LDGM codes are guaranteed to achieve the rate-distortion bound, the proposed methods are guaranteed to achieve the minimal embedding impact (maximal embedding efficiency). We provide detailed technical description of the method for practitioners and demonstrate its performance on matrix embedding.
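The matrix-embedding principle the abstract builds on can be shown with the smallest classical instance: the binary [7,4] Hamming code hides 3 message bits in 7 cover bits while changing at most one bit. The paper's LDGM/survey-propagation construction scales this idea to large covers; the sketch below shows only the principle.

```python
import numpy as np

# Parity-check matrix H: column j (1-based) is the binary expansion of j,
# so the syndrome of a single bit flip at position j equals binary(j).
H = np.array([[(j >> i) & 1 for j in range(1, 8)] for i in range(3)])

def embed(cover, message):
    """Return stego bits s with (H @ s) % 2 == message, flipping <= 1 bit."""
    stego = cover.copy()
    diff = ((H @ cover) % 2) ^ message       # syndrome change we need
    j = sum(int(b) << i for i, b in enumerate(diff))
    if j:                                     # j = 1-based index to flip
        stego[j - 1] ^= 1
    return stego

def extract(stego):
    return (H @ stego) % 2

cover = np.array([1, 0, 1, 1, 0, 0, 1])       # 7 cover LSBs
message = np.array([1, 1, 0])                 # 3 message bits
stego = embed(cover, message)
print(extract(stego), int(np.sum(cover != stego)))
```

Embedding efficiency here is 3 bits per at-most-one change; minimizing *weighted* embedding impact replaces "at most one flip" with "cheapest set of flips," which is where the syndrome-coding machinery of the paper comes in.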

Fridrich, Jessica; Filler, Tomas

2007-02-01

72

[A method for the implementation and promotion of access to comprehensive and complementary primary healthcare practices].  

PubMed

The rendering of integrated and complementary practices in the Brazilian Unified Health System is fostered to increase the comprehensiveness of care and access to it, though it is a challenge to incorporate them into the services. Our objective is to provide a simple method of implementation of such practices in Primary Healthcare, derived from analysis of experiences in municipalities, using partial results of a master's thesis that employed research-action methodology. The method involves four stages: 1 - definition of a nucleus responsible for implementation and consolidation; 2 - situational analysis, with identification of the existing competent professionals; 3 - regulation, organization of access and legitimation; and 4 - implementation cycle: local plans, mentoring and ongoing education in health. The phases are described, justified and briefly discussed. The method encourages the development of rational and sustainable actions, sponsors participatory management, the creation of comprehensiveness and the broadening of care provided in Primary Healthcare by offering progressive and sustainable comprehensive and complementary practices. PMID:23175308

Santos, Melissa Costa; Tesser, Charles Dalcanale

2012-11-01

73

The Practical Modelling of Discontinuous Rock Masses with Finite Element Analysis  

Microsoft Academic Search

The mechanical response of rock masses to loading and excavation, especially in low stress environments, is significantly affected by discontinuities. This paper examines the practical modelling of rock mass problems with explicit representation of discontinuities using special joint elements in the Finite Element Method. The paper discusses why it is possible to use this approach for routine, practical engineering analysis

R. E. Hammah; T. Yacoub; B. Corkum; J. H. Curran

2008-01-01

74

ANALYSIS OF IMPLICIT LES METHODS  

Microsoft Academic Search

Implicit LES methods are numerical methods that capture the energy-containing and inertial ranges of turbulent flows, while relying on their own intrinsic dissipation to act as a subgrid model. We present a scheme-dependent Kolmogorov scaling analysis of the solutions produced by such methods. From this analysis we can define an effective Reynolds number for implicit LES simulations of inviscid

Andrew Aspden; Nikos Nikiforakis; Stuart Dalziel; John B. Bell

2008-01-01

75

Practical methods to improve the development of computational software  

SciTech Connect

The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)
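One concrete example of the kind of practice such papers typically recommend is pinning expected behavior with a small regression test. The helper function below is a hypothetical illustration, not drawn from the paper; the test pattern uses only Python's standard library.

```python
import unittest

def celsius_to_kelvin(t_c):
    """Convert a temperature from Celsius to Kelvin."""
    return t_c + 273.15

class TestConversions(unittest.TestCase):
    # Each test pins one known-good value, so a future edit that
    # breaks the conversion fails loudly instead of silently.
    def test_freezing_point(self):
        self.assertAlmostEqual(celsius_to_kelvin(0.0), 273.15)

    def test_absolute_zero(self):
        self.assertAlmostEqual(celsius_to_kelvin(-273.15), 0.0)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestConversions)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())
```

Even a handful of such tests, run before each commit, catches the class of coder-induced error the abstract describes.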

Osborne, A. G.; Harding, D. W.; Deinert, M. R. [Department of Mechanical Engineering, University of Texas, Austin (United States)]

2013-07-01

76

Biometric Data Safeguarding Technologies Analysis and Best Practices.  

National Technical Information Service (NTIS)

This document is the Study Report for PSTP 02-0351BIO, Biometric Data Safeguarding Technologies Analysis and Best Practices. One of the main goals of the Public Security Technical Program (PSTP) Biometrics Community of Practice is to evaluate, analyze, an...

R. Nanavati

2011-01-01

77

An Analysis of Optometric Practices in Rural Alabama.  

ERIC Educational Resources Information Center

Twenty-nine Alabama optometric practices were studied using an optometrist survey, one-week patient flow analysis, and audit of patient records. Results indicate some special facets of the rural practices that may require a different kind of educational preparation. (MSE)

Wild, Bradford W.; Maisiak, Richard

1981-01-01

78

Multivariate analysis of 2-DE protein patterns--practical approaches.  

PubMed

Practical approaches to the use of multivariate data analysis of 2-DE protein patterns are demonstrated by three independent strategies for the image analysis and the multivariate analysis on the same set of 2-DE data. Four wheat varieties were selected on the basis of their baking quality. Two of the varieties were of strong baking quality and hard wheat kernel and two were of weak baking quality and soft kernel. Gliadins at different stages of grain development were analyzed by the application of multivariate data analysis on images of 2-DEs. Patterns related to the wheat varieties, harvest times and quality were detected on images of 2-DE protein patterns for all the three strategies. The use of the multivariate methods was evaluated in the alignment and matching procedures of 2-DE gels. All the three strategies were able to discriminate the samples according to quality, harvest time and variety, although different subsets of protein spots were selected. The explorative approach of using multivariate data analysis and variable selection in the analyses of 2-DEs seems to be promising as a fast, reliable and convenient way of screening and transforming many gel images into spot quantities. PMID:17351893
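The multivariate step the abstract describes (screening spot quantities for variety/quality patterns) is commonly done with PCA. The sketch below runs a plain SVD-based PCA on a synthetic (samples x spots) matrix standing in for 2-DE spot volumes; the group means and spot count are illustration values, not data from the study.

```python
import numpy as np

# Synthetic spot-volume matrix: 3 "strong" and 3 "weak" quality samples,
# 4 protein spots each (made-up numbers for illustration).
rng = np.random.default_rng(0)
strong = rng.normal(5.0, 0.2, size=(3, 4))
weak = rng.normal(3.0, 0.2, size=(3, 4))
X = np.vstack([strong, weak])

Xc = X - X.mean(axis=0)                      # mean-center each spot
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S                               # PCA scores (samples x PCs)

# With well-separated groups, PC1 should split strong from weak samples.
print(scores[:, 0])
```

The loadings in `Vt` then indicate which spots drive the separation, which is the variable-selection idea the abstract evaluates across its three image-analysis strategies.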

Jacobsen, Susanne; Grove, Harald; Jensen, Kristina Nedenskov; Sørensen, Helle A; Jessen, Flemming; Hollung, Kristin; Uhlen, Anne Kjersti; Jørgensen, Bo M; Faergestad, Ellen Mosleth; Søndergaard, Ib

2007-04-01

79

Practical analysis of welding processes using finite element analysis.  

SciTech Connect

With advances in commercially available finite element software and computational capability, engineers can now model large-scale problems in mechanics, heat transfer, fluid flow, and electromagnetics as never before. With these enhancements in capability, it is increasingly tempting to include the fundamental process physics to help achieve greater accuracy (Refs. 1-7). While this goal is laudable, it adds complication and drives up cost and computational requirements. Practical analysis of welding relies on simplified user inputs to derive important relative trends in desired outputs such as residual stress or distortion due to changes in inputs like voltage, current, and travel speed. Welding is a complex three-dimensional phenomenon. The question becomes how much modeling detail is needed to accurately predict relative trends in distortion, residual stress, or weld cracking? In this work, a HAZ (Heat Affected Zone) weld-cracking problem was analyzed to rank two different welding cycles (weld speed varied) in terms of crack susceptibility. Figure 1 shows an aerospace casting GTA welded to a wrought skirt. The essentials of part geometry, welding process, and tooling were suitably captured to model the strain excursion in the HAZ over a crack-susceptible temperature range, and the weld cycles were suitably ranked. The main contribution of this work is the demonstration of a practical methodology by which engineering solutions to engineering problems may be obtained through weld modeling when time and resources are extremely limited. Typically, welding analysis suffers with the following unknowns: material properties over the entire temperature range, the heat-input source term, and environmental effects. Material properties of interest are conductivity, specific heat, latent heat, modulus, Poisson's ratio, yield strength, ultimate strength, and possible rate dependencies.
Boundary conditions are conduction into fixturing, radiation and convection to the environment, and any mechanical constraint. If conductivity, for example, is only known at a few temperatures, it can be linearly extrapolated from the highest known temperature to the solidus temperature. Over the solidus-to-liquidus temperature range the conductivity is linearly increased by a factor of three to account for the enhanced heat transfer due to convection in the weld pool. Above the liquidus it is kept constant. Figure 2 shows an example of this type of approximation. Other thermal and mechanical properties and boundary conditions can be similarly approximated, using known physical material characteristics when possible. Sensitivity analysis can show that many assumptions have a small effect on the final outcome of the analysis. In the example presented in this work, simplified analysis procedures were used to model this process to understand why one set of parameters is superior to the other. From Lin (Ref. 8), mechanical strain is expected to drive HAZ cracking. Figure 3 shows a plot of principal tensile mechanical strain versus temperature during the welding process. By looking at the magnitudes of the tensile mechanical strain in the material's Brittle Temperature Region (BTR), it can be seen that on a relative basis the faster travel speed process that causes cracking results in about three times the strain in the temperature range of the BTR. In this work, a series of simplifying assumptions were used in order to quickly and accurately model a real welding process to respond to an immediate manufacturing need. The analysis showed that the driver for HAZ cracking, the mechanical strain in the BTR, was significantly higher in the process that caused cracking versus the process that did not.
The main emphasis of the analysis was to determine whether there was a mechanical reason why the improved weld parameters would consistently produce an acceptable weld. The prediction of the mechanical strain magnitudes confirms the better process.
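The conductivity approximation described in the abstract (extrapolate measured data, ramp by a factor of about three across the mushy zone to mimic weld-pool convection, then hold constant) can be sketched as a piecewise function. All temperatures and conductivities below are illustration values, not properties of a specific alloy, and the exact ramp endpoints are my reading of the text.

```python
import numpy as np

T_data = np.array([300.0, 600.0, 900.0])   # K, temperatures with measured k
k_data = np.array([15.0, 18.0, 21.0])      # W/(m K), measured conductivity
T_sol, T_liq = 1600.0, 1700.0              # solidus / liquidus, K

def conductivity(T):
    # slope of the last measured segment, used for linear extrapolation
    slope = (k_data[-1] - k_data[-2]) / (T_data[-1] - T_data[-2])
    k_sol = k_data[-1] + slope * (T_sol - T_data[-1])   # value at solidus
    if T <= T_data[-1]:
        return np.interp(T, T_data, k_data)             # measured range
    if T <= T_sol:
        return k_data[-1] + slope * (T - T_data[-1])    # extrapolated
    if T <= T_liq:
        frac = (T - T_sol) / (T_liq - T_sol)
        return k_sol * (1.0 + 2.0 * frac)               # ramp x1 -> x3
    return 3.0 * k_sol                                  # constant above

print(conductivity(1600.0), conductivity(1700.0))
```

The factor-of-three enhancement is a crude stand-in for convective mixing in the molten pool, which is exactly the kind of sensitivity-tested simplification the abstract advocates.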

Cowles, J. H. (John H.); Dave, V. R. (Vivek R.); Hartman, D. A. (Daniel A.)

2001-01-01

80

Practical aspects of operating a neutron activation analysis laboratory.  

National Technical Information Service (NTIS)

This book is intended to advise in everyday practical problems related to operating a neutron activation analysis (NAA) laboratory. It gives answers to questions like ''what to use NAA for'', ''how to find relevant research problems'', ''how to find users...

1990-01-01

81

A Practical Introduction to Analysis and Synthesis  

ERIC Educational Resources Information Center

Discusses an introductory chemical engineering course in which mathematical models are used to analyze experimental data. Concepts illustrated include dimensional analysis, scaleup, heat transfer, and energy conservation. (MLH)

Williams, R. D.; Cosart, W. P.

1976-01-01

82

A practical gait analysis system using gyroscopes  

Microsoft Academic Search

This study investigated the possibility of using uni-axial gyroscopes to develop a simple portable gait analysis system. Gyroscopes were attached to the skin surface of the shank and thigh segments, and the angular velocity of each segment was recorded. Segment inclinations and knee angle were derived from segment angular velocities. The angular signals from a motion analysis
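The core processing step the abstract describes, integrating each segment's angular velocity to get inclination and differencing thigh and shank to get knee angle, can be sketched as below. Synthetic constant-rate signals stand in for real sensor data, and gyroscope drift correction is omitted.

```python
import numpy as np

fs = 100.0                                   # sample rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
omega_thigh = np.full_like(t, 0.5)           # rad/s, synthetic thigh gyro
omega_shank = np.full_like(t, 0.2)           # rad/s, synthetic shank gyro

def integrate(omega, dt):
    # cumulative trapezoidal integration, initial inclination 0
    return np.concatenate(
        [[0.0], np.cumsum((omega[1:] + omega[:-1]) / 2.0) * dt])

theta_thigh = integrate(omega_thigh, 1.0 / fs)
theta_shank = integrate(omega_shank, 1.0 / fs)
knee_angle = theta_thigh - theta_shank       # relative segment angle

print(knee_angle[-1])
```

In a real system, the integration drift accumulates, which is why gyroscope-based gait analysis typically resets the integral each gait cycle or fuses the gyro with another sensor.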

Kaiyu Tong; Malcolm H Granat

1999-01-01

83

Using discourse analysis to improve extension practice  

Microsoft Academic Search

This paper aims to create awareness of the potential of discourse analysis to be a valuable contribution to agricultural extension. By way of example, it also reports on the discourses about climate change identified as being present in the Tasmanian agricultural community. The paper outlines the theories of discourse analysis and presents the results from interviews with 63 farmers and

Aysha Fleming; Frank Vanclay

84

Practical remote control system using work point tracking method  

NASA Astrophysics Data System (ADS)

Recently, research using VR (Virtual Reality) technology has been ongoing in developing remote control systems used for working in dangerous or severe environments. However, some problems concerning recognition, operation and so on remain. In this study, the authors propose a practical remote control method which solves some of these problems. First, the system has more than two stereoscopic cameras which are used to see the robot's hand, objects and so on. The reason for this is that shadows behind the objects, or other objects outside a single field of vision, reduce the information available to the operator. These views are shown to the operator by switching them on one display. Secondly, operation using the view coordinate system (the view oriented operation method) is adopted to solve the problem of operation confusion. With this method, a device coordinate system is made to coincide with the view coordinate system, which helps the operator to recognize the operation direction according to the operator's own will. Thirdly, a work point tracking method is adopted so that the object is not lost from sight when switching views. In experiments, several kinds of tasks were performed to confirm the effectiveness of these methods. Using the above methods, these tasks could be performed faster than before, and moreover, the operator could operate the robot more dexterously. In conclusion, these methods can create a more efficient remote control system.
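The "view oriented operation" idea above amounts to rotating the operator's device-frame command into the world frame using the active camera's orientation, so that "push right" always moves the robot rightward on screen. A minimal planar (2-D) sketch, with an illustrative yaw angle:

```python
import numpy as np

def view_to_world(cmd_xy, camera_yaw):
    """Rotate a device-frame command into world coordinates."""
    c, s = np.cos(camera_yaw), np.sin(camera_yaw)
    R = np.array([[c, -s],
                  [s,  c]])                  # rotation by the camera yaw
    return R @ cmd_xy

cmd = np.array([1.0, 0.0])                   # "move right" on the display
print(view_to_world(cmd, 0.0))               # camera aligned with world
print(view_to_world(cmd, np.pi / 2))         # camera rotated 90 degrees
```

When the display switches to another camera, only `camera_yaw` changes; the operator's mental mapping between hand motion and on-screen motion stays fixed, which is the point of the method.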

Usui, Masakazu; Mitsui, Teruaki; Fujii, Katsutoshi; Ono, Naonori; Niwa, Yoshinori

1998-04-01

85

A practical method to evaluate radiofrequency exposure of mast workers.  

PubMed

Assessment of occupational exposure to radiofrequency (RF) fields in telecommunication transmitter masts is a challenging task. For conventional field strength measurements using manually operated instruments, it is difficult to document the locations of measurements while climbing up a mast. Logging RF dosemeters worn by the workers, on the other hand, do not give any information about the location of the exposure. In this study, a practical method was developed and applied to assess mast workers' exposure to RF fields and the corresponding location. This method uses a logging dosemeter for personal RF exposure evaluation and two logging barometers to determine the corresponding height of the worker's position on the mast. The procedure is not intended to be used for compliance assessments, but to indicate locations where stricter assessments are needed. The applicability of the method is demonstrated by making measurements in a TV and radio transmitting mast. PMID:19054796
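The height-from-pressure step implied by the two logging barometers can be sketched with the standard hypsometric equation, assuming one barometer is fixed at the mast base and one is worn by the climber. The pressures and temperature below are illustrative, not measurements from the study.

```python
import math

def height_above_ground(p_worn_hpa, p_ground_hpa, temp_c=15.0):
    """Height difference (m) between the two barometers,
    from the hypsometric equation for a thin air layer."""
    R = 287.05            # specific gas constant of dry air, J/(kg K)
    g = 9.80665           # standard gravity, m/s^2
    T = temp_c + 273.15   # mean air temperature of the layer, K
    return (R * T / g) * math.log(p_ground_hpa / p_worn_hpa)

# Near the ground, pressure drops by roughly 1 hPa per 8 m of height,
# so a ~10 hPa difference puts the worker on the order of 80 m up:
print(round(height_above_ground(1003.2, 1013.25), 1))
```

Time-aligning this height series with the logging dosemeter's RF record gives exposure versus position on the mast, which is the pairing the method relies on.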

Alanko, Tommi; Hietanen, Maila

2008-01-01

86

[Mother Kangaroo Method: an investigation about the domestic practice].  

PubMed

This is a descriptive study with a quantitative approach, aiming at acquiring knowledge regarding the domestic practice of the Mother Kangaroo Method. Data were collected from a survey of the parents of premature infants hospitalized in a University Hospital in São Luís, Maranhão State, from May to August, 2005. According to the findings, 100% of the families received training and guidance in the hospital, and in only 53.3% of the cases the mothers were guided. The benefits of the educational work developed by the team were confirmed by domestic practice, with 93.3% of the mothers performing the kangaroo position correctly, 86.7% of the babies being lightly dressed, 86.7% of the mothers breast feeding with correct technique, and 86.7% without any other items being used. 46.7% of the mothers stayed 5 to 8 hours/day with their babies in this position, and 66.7% identified household tasks as the principal obstacle to the practice. Regarding neonatal walk-in unit follow-up, 63.3% of the mothers identified the lack of financial resources to pay for transportation as the main difficulty. The data obtained show that support from the family network and the health team seems to be the best way to guarantee the extension of domestic care. PMID:20169256

de Araújo, Cristiane Luciana; Rios, Cláudia Teresa Frias; dos Santos, Marinese Hermínia; Gonçalves, Anna Paula Ferrario

2010-01-01

87

Method of Analysis at the U.S. Geological Survey California Water Science Center, Sacramento Laboratory: Determination of Haloacetic Acid Formation Potential, Method Validation, and Quality-Control Practices.  

National Technical Information Service (NTIS)

An analytical method for the determination of haloacetic acid formation potential of water samples has been developed by the U.S. Geological Survey California Water Science Center, Sacramento Laboratory. The haloacetic acid formation potential is measured...

B. C. Zazzi K. L. Crepeau M. S. Fram B. A. Bergamaschi

2005-01-01

88

Seismic Assessment of Structures by a Practice-Oriented Method  

NASA Astrophysics Data System (ADS)

A relatively simple seismic analysis technique based on the pushover analysis of a multi-degree-of-freedom model and the response spectrum analysis of an equivalent single-degree-of-freedom system, called the N2 method, has been developed at the University of Ljubljana and implemented in the European standard Eurocode 8. The method is formulated in the acceleration-displacement format, which enables the visual interpretation of the procedure and of the relations between the basic quantities controlling the seismic response. Its basic variant was restricted to planar structures. Recently the applicability of the method has been extended to plan-asymmetric buildings, which require a 3D structural model. In the paper, the N2 method is summarized and applied to two test examples.
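The MDOF-to-SDOF transformation at the heart of the N2 method scales the pushover curve by the modal participation factor. A numerical sketch under assumed values (the story masses, mode shape, and pushover point below are illustration numbers, not from the paper):

```python
import numpy as np

m = np.array([100.0, 100.0, 100.0])    # story masses, tonnes (assumed)
phi = np.array([0.33, 0.67, 1.0])      # mode shape, roof-normalized (assumed)

gamma = (m @ phi) / (m @ phi**2)       # modal participation factor
m_star = m @ phi                       # equivalent SDOF mass, tonnes

V = 450.0                              # base shear from pushover, kN
d_roof = 0.12                          # roof displacement, m

F_star = V / gamma                     # equivalent SDOF force
d_star = d_roof / gamma                # equivalent SDOF displacement
Sa = F_star / m_star                   # capacity ordinate, m/s^2
                                       # (kN/tonne = m/s^2)
print(round(gamma, 4), round(d_star, 4), round(Sa, 3))
```

Plotting Sa against d_star gives the capacity curve in the acceleration-displacement format, where it is overlaid on the inelastic demand spectrum to locate the performance point.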

Fajfar, P.

89

Gait analysis methods in rehabilitation  

Microsoft Academic Search

INTRODUCTION: Brand's four reasons for clinical tests and his analysis of the characteristics of valid biomechanical tests for use in orthopaedics are taken as a basis for determining what methodologies are required for gait analysis in a clinical rehabilitation context. MEASUREMENT METHODS IN CLINICAL GAIT ANALYSIS: The state of the art of optical systems capable of measuring the positions of

Richard Baker; Hugh Williamson; Gait CCRE

2006-01-01

90

Comparative Lifecycle Energy Analysis: Theory and Practice.  

ERIC Educational Resources Information Center

Explores the position that more energy is conserved through recycling secondary materials than is generated from municipal solid waste incineration. Discusses one component of a lifecycle analysis--a comparison of energy requirements for manufacturing competing products. Includes methodological issues, energy cost estimates, and difficulties…

Morris, Jeffrey; Canzoneri, Diana

1992-01-01

91

Traditional Methods for Mineral Analysis  

NASA Astrophysics Data System (ADS)

This chapter describes traditional methods for analysis of minerals involving titrimetric and colorimetric procedures, and the use of ion selective electrodes. Other traditional methods of mineral analysis include gravimetric titration (i.e., insoluble forms of minerals are precipitated, rinsed, dried, and weighed) and redox reactions (i.e., the mineral is part of an oxidation-reduction reaction, and the product is quantitated). However, these latter two methods will not be covered because they currently are used little in the food industry. The traditional methods that will be described have maintained widespread usage in the food industry despite the development of more modern instrumentation such as atomic absorption spectroscopy and inductively coupled plasma-atomic emission spectroscopy (Chap. 24). Traditional methods generally require chemicals and equipment that are routinely available in an analytical laboratory and are within the experience of most laboratory technicians. Additionally, traditional methods often form the basis for rapid analysis kits (e.g., Quantab® for salt determination) that are increasingly in demand. Procedures for analysis of minerals of major nutritional or food processing concern are used for illustrative purposes. For additional examples of traditional methods refer to references (1-6). Slight modifications of these traditional methods are often needed for specific foodstuffs to minimize interferences or to be in the range of analytical performance. For analytical requirements for specific foods see the Official Methods of Analysis of AOAC International (5) and related official methods (6).
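A typical titrimetric calculation of the kind this chapter covers is the Mohr determination of chloride, reported as NaCl. The arithmetic is a sketch: the volumes and normality below are illustration values, not an official AOAC procedure.

```python
def percent_nacl(v_agno3_ml, n_agno3, sample_g):
    """Percent NaCl in a sample from the AgNO3 titration volume.
    One mole of AgNO3 consumes one mole of Cl-."""
    mol_cl = v_agno3_ml / 1000.0 * n_agno3    # moles of titrant = moles Cl-
    return mol_cl * 58.44 / sample_g * 100.0  # 58.44 g/mol = MW of NaCl

# Example: a 5 g sample titrated with 8.55 mL of 0.1 N AgNO3
print(round(percent_nacl(8.55, 0.1, 5.0), 3))
```

The same mole-ratio pattern (titrant volume x normality x analyte molar mass / sample mass) underlies most of the titrimetric procedures in the chapter; only the stoichiometric factor changes.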

Ward, Robert E.; Carpenter, Charles E.

92

Practical Aspects of Krylov Subspace Iterative Methods in CFD  

NASA Technical Reports Server (NTRS)

Implementation issues associated with the application of Krylov subspace iterative methods, such as Newton-GMRES, are presented within the framework of practical computational fluid dynamic (CFD) applications. This paper categorizes, evaluates, and contrasts the major ingredients (function evaluations, matrix-vector products, and preconditioners) of Newton-GMRES Krylov subspace methods in terms of their effect on the local linear and global nonlinear convergence, memory requirements, and accuracy. The discussion focuses on Newton-GMRES in both a structured multi-zone incompressible Navier-Stokes solver and an unstructured mesh finite-volume Navier-Stokes solver. Approximate versus exact matrix-vector products, effective preconditioners, and other pertinent issues are addressed.
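The key implementation ingredient named above, the matrix-vector product, is often done matrix-free: a Jacobian-vector product is approximated by a finite difference of the residual, so the Jacobian is never formed. The sketch below applies this inside a Newton-GMRES loop on a tiny nonlinear system standing in for a flow residual; it is an illustration, not the solvers discussed in the paper.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def F(u):
    # toy nonlinear residual (exact root at u = [1, 2])
    return np.array([u[0]**2 + u[1] - 3.0,
                     u[0] + u[1]**2 - 5.0])

def newton_gmres(u, iters=10, eps=1e-7):
    for _ in range(iters):
        r = F(u)
        # matrix-free Jacobian-vector product: J v ~ (F(u+eps v)-F(u))/eps
        Jv = LinearOperator(
            (2, 2), matvec=lambda v: (F(u + eps * v) - F(u)) / eps)
        du, _ = gmres(Jv, -r, atol=1e-12)    # inexact inner linear solve
        u = u + du
    return u

u = newton_gmres(np.array([1.0, 1.0]))
print(u, np.linalg.norm(F(u)))
```

The approximate (finite-difference) product trades one extra residual evaluation per Krylov vector for zero Jacobian storage, which is the trade-off the paper evaluates against exact products and preconditioning.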

Pulliam, Thomas H.; Rogers, Stuart; Barth, Timothy

1996-01-01

93

An analysis of remanufacturing practices in Japan  

Microsoft Academic Search

Purpose This study presents case studies of selected remanufacturing operations in Japan. It investigates Japanese companies' motives and incentives for remanufacturing, clarifies the requirements and obstacles facing remanufacturers, itemizes what measures companies take to address them, and discusses the influence of Japanese laws related to remanufacturing. Methods This study involves case studies of four product areas: photocopiers, single-use cameras, auto parts, and

Mitsutaka Matsumoto; Yasushi Umeda

2011-01-01

94

Assessing methods for measurement of clinical outcomes and quality of care in primary care practices  

PubMed Central

Purpose To evaluate the appropriateness of potential data sources for the population of performance indicators for primary care (PC) practices. Methods This project was a cross sectional study of 7 multidisciplinary primary care teams in Ontario, Canada. Practices were recruited and 5-7 physicians per practice agreed to participate in the study. Patients of participating physicians (20-30) were recruited sequentially as they presented to attend a visit. Data collection included patient, provider and practice surveys, chart abstraction and linkage to administrative data sets. Matched pairs analysis was used to examine the differences in the observed results for each indicator obtained using multiple data sources. Results Seven teams, 41 physicians, 94 associated staff and 998 patients were recruited. The survey response rate was 81% for patients, 93% for physicians and 83% for associated staff. Chart audits were successfully completed on all but 1 patient and linkage to administrative data was successful for all subjects. There were significant differences noted between the data collection methods for many measures. No single method of data collection was best for all outcomes. For most measures of technical quality of care chart audit was the most accurate method of data collection. Patient surveys were more accurate for immunizations, chronic disease advice/information dispensed, some general health promotion items and possibly for medication use. Administrative data appears useful for indicators including chronic disease diagnosis and osteoporosis/ breast screening. Conclusions Multiple data collection methods are required for a comprehensive assessment of performance in primary care practices. The choice of which methods are best for any one particular study or quality improvement initiative requires careful consideration of the biases that each method might introduce into the results. 
In this study, both patients and providers were willing to participate in, and consent to, the collection and linkage of information from multiple sources that would be required for such assessments.
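The matched pairs analysis used above compares the same indicator measured for the same patients by two data sources. A minimal sketch with a paired t-test on synthetic data (the indicator values, sample size, and the direction of bias are made up for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Same 30 patients, same indicator, two sources: chart audit vs. survey.
chart = rng.normal(0.80, 0.05, size=30)
survey = chart + rng.normal(0.05, 0.03, size=30)   # survey reads higher

t_stat, p_value = stats.ttest_rel(survey, chart)   # paired t-test
mean_diff = np.mean(survey - chart)                # systematic bias

print(round(mean_diff, 3), p_value < 0.05)
```

A significant paired difference like this is exactly the kind of method-dependent bias the study found, and is why no single data source was best for all indicators.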

2012-01-01

95

SAR/QSAR methods in public health practice  

SciTech Connect

Methods of (Quantitative) Structure-Activity Relationship ((Q)SAR) modeling play an important and active role in ATSDR programs in support of the Agency mission to protect human populations from exposure to environmental contaminants. They are used for cross-chemical extrapolation to complement the traditional toxicological approach when chemical-specific information is unavailable. SAR and QSAR methods are used to investigate adverse health effects and exposure levels, bioavailability, and pharmacokinetic properties of hazardous chemical compounds. They are applied as a part of an integrated systematic approach in the development of Health Guidance Values (HGVs), such as ATSDR Minimal Risk Levels, which are used to protect populations exposed to toxic chemicals at hazardous waste sites. (Q)SAR analyses are incorporated into ATSDR documents (such as the toxicological profiles and chemical-specific health consultations) to support environmental health assessments, prioritization of environmental chemical hazards, and to improve study design, when filling the priority data needs (PDNs) as mandated by Congress, in instances when experimental information is insufficient. These cases are illustrated by several examples, which explain how ATSDR applies (Q)SAR methods in public health practice.

Demchuk, Eugene, E-mail: edemchuk@cdc.gov; Ruiz, Patricia; Chou, Selene; Fowler, Bruce A.

2011-07-15

96

Comparison of Hartmann analysis methods  

NASA Astrophysics Data System (ADS)

Analysis of Hartmann-Shack wavefront sensors for the eye is traditionally performed by locating and centroiding the sensor spots. These centroids provide the gradient, which is integrated to yield the ocular aberration. Fourier methods can replace the centroid stage, and Fourier integration can replace the direct integration. The two stages, demodulation and integration, can be combined to retrieve the wavefront directly, all in the Fourier domain. Here we applied this full Fourier analysis to circular apertures and real images. We compared it with previous methods of convolution, interpolation, and Fourier demodulation, as well as with a centroid method, which yields the Zernike coefficients of the wavefront. The best performance was achieved for ocular pupils with a small boundary slope or far from the boundary, with acceptable results for images missing part of the pupil. The other Fourier analysis methods had much higher tolerance to noncentrosymmetric apertures.
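The Fourier-integration stage described above can be sketched with a Frankot-Chellappa-style least-squares integrator: the measured x/y gradient maps are transformed, combined, divided by the frequency magnitudes, and inverse-transformed to give the wavefront. This is only an illustration under the assumption of a square, periodic sampling grid; the test surface and its gradients below are synthetic, not sensor data.

```python
import numpy as np

def fourier_integrate(gx, gy):
    """Least-squares wavefront reconstruction from x/y gradient maps, done
    entirely in the Fourier domain (Frankot-Chellappa style). Assumes a
    periodic square grid; the piston (mean) term is unobservable and set to 0."""
    n = gx.shape[0]
    u = np.fft.fftfreq(n).reshape(1, -1)   # frequencies along x (columns)
    v = np.fft.fftfreq(n).reshape(-1, 1)   # frequencies along y (rows)
    gx_hat = np.fft.fft2(gx)
    gy_hat = np.fft.fft2(gy)
    denom = (2j * np.pi) * (u ** 2 + v ** 2)
    denom[0, 0] = 1.0                      # avoid division by zero at DC
    w_hat = (u * gx_hat + v * gy_hat) / denom
    w_hat[0, 0] = 0.0                      # piston term set to zero
    return np.real(np.fft.ifft2(w_hat))

# Check on a smooth periodic surface with known analytic gradients.
n = 64
y, x = np.mgrid[0:n, 0:n]
w  = np.cos(2 * np.pi * x / n) + np.sin(2 * np.pi * y / n)
gx = -2 * np.pi / n * np.sin(2 * np.pi * x / n)
gy =  2 * np.pi / n * np.cos(2 * np.pi * y / n)
rec = fourier_integrate(gx, gy)
print(np.allclose(rec, w - w.mean(), atol=1e-8))
```

Handling a circular pupil inside the square grid (the case the abstract addresses) requires additional care at the aperture boundary, which is precisely where the methods compared in the paper differ.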

Canovas, Carmen; Ribak, Erez N.

2007-04-01

97

Practical evaluation of Mung bean seed pasteurization method in Japan.  

PubMed

The majority of the seed sprout-related outbreaks have been associated with Escherichia coli O157:H7 and Salmonella. Therefore, an effective method for inactivating these organisms on the seeds before sprouting is needed. The current pasteurization method for mung beans in Japan (hot water treatment at 85 degrees C for 10 s) was more effective for disinfecting inoculated E. coli O157:H7, Salmonella, and nonpathogenic E. coli on mung bean seeds than was the calcium hypochlorite treatment (20,000 ppm for 20 min) recommended by the U.S. Food and Drug Administration. Hot water treatment at 85 degrees C for 40 s followed by dipping in cold water for 30 s and soaking in chlorine water (2,000 ppm) for 2 h reduced the pathogens to undetectable levels, and no viable pathogens were found in a 25-g enrichment culture and during the sprouting process. Practical tests using a working pasteurization machine with nonpathogenic E. coli as a surrogate produced similar results. The harvest yield of the treated seed was within the acceptable range. These treatments could be a viable alternative to the presently recommended 20,000-ppm chlorine treatment for mung bean seeds. PMID:20377967

Bari, M L; Enomoto, K; Nei, D; Kawamoto, S

2010-04-01

98

Data analysis method for evaluating dialogic learning.  

PubMed

The purpose of this paper is to introduce a new method of analysing and evaluating dialogic learning. Dialogic learning offers possibilities that have not previously been found in nursing or nursing education, although some nursing researchers have lately become interested in dialogic nursing interaction between nurses and patients. The stages of analysis of dialogic learning have been illustrated by using an example. The data for this illustration were collected by video-taping a planning process where students for a Master's degree (qualifying them to be nursing instructors in Finland) plan, implement and evaluate a course for nursing students, on the care of terminally ill patients. However, it is possible to use this method of analysis for other dialogic learning situations both in nursing practice (for example, collaborative meetings between experts and patients) and in nursing education (for example, collaborative learning situations). The focus of this method of analysis concentrates on various situations where participants in interaction see the object of discussion from various points of view. This method of analysis helps the participants in the interaction to develop their interactional skills both through an awareness of their own views, and through understanding the other participants' various views in a particular nursing situation. PMID:11148833

Janhonen, S; Sarja, A

2000-02-01

99

Customer portfolio analysis practices in different exchange contexts  

Microsoft Academic Search

Customer relationship management is increasingly important in current marketing research and practice. The customer portfolio models represent one of the few concrete tools proposed for relationship management in business-to-business markets. Yet, knowledge of how companies use customer portfolio analysis (CPA) remains limited. Earlier research adopts a fairly narrow view of CPA and ignores the influence of internal and external company

Harri Terho; Aino Halinen

2007-01-01

100

Market Structure Analysis of Media Selection Practices by Travel Services  

Microsoft Academic Search

This paper examined media selection practices by tourism businesses competing in Alaska. Two media selection decisions, media use and media mix, were investigated. Media use decisions focused on the use/not use of television, national magazines, radio, newspapers, outdoor advertising, and a regional travel magazine. Media mix decisions explored which combinations of these six media the firms utilized. A market structure analysis

David Snepenger; Mary Snepenger

1994-01-01

101

Practical methods for meeting remediation goals at hazardous waste sites.  

PubMed

Risk-based cleanup goals or preliminary remediation goals (PRGs) are established at hazardous waste sites when contaminant concentrations in air, soil, surface water, or groundwater exceed specified acceptable risk levels. When derived in accordance with the Environmental Protection Agency's risk assessment guidance, the PRG is intended to represent the average contaminant concentration within an exposure unit area that is left on the site following remediation. The PRG, however, frequently has been used inconsistently at Superfund sites with a number of remediation decisions using the PRG as a not-to-exceed concentration (NTEC). Such misapplications could result in overly conservative and unnecessarily costly remedial actions. The PRG should be applied in remedial actions in the same manner in which it was generated. Statistical methods, such as Bower's Confidence Response Goal, and mathematical methods such as "iterative removal of hot spots," are available to assist in the development of NTECs that ensure the average postremediation contaminant concentration is at or below the PRG. These NTECs can provide the risk manager with a more practical cleanup goal. In addition, an acute PRG can be developed to ensure that contaminant concentrations left on-site following remediation are not so high as to pose an acute or short-term health risk if excessive exposure to small areas of the site should occur. A case study demonstrates cost savings of five to ten times associated with the more scientifically sound use of the PRG as a postremediation site average, and development of a separate NTEC and acute PRG based on the methods referenced in this article. PMID:11332551
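The "iterative removal of hot spots" idea named above can be sketched as a simple loop: simulate remediating the most contaminated sample locations one at a time until the exposure-unit average falls to the PRG, then read off the highest concentration left in place as a candidate not-to-exceed concentration (NTEC). This is a minimal illustration of the concept, not the cited statistical method; the sample values and backfill level are hypothetical.

```python
# Minimal sketch of "iterative removal of hot spots": remediate the worst
# sample locations until the exposure-unit average meets the PRG; the highest
# concentration left on site is a candidate NTEC. Units and values are
# hypothetical.

def hot_spot_ntec(concs, prg, backfill=0.0):
    vals = sorted(concs, reverse=True)
    i = 0
    while sum(vals) / len(vals) > prg and i < len(vals):
        vals[i] = backfill          # "remediate" the current worst location
        i += 1
    if sum(vals) / len(vals) > prg:
        return None                 # PRG unreachable even after full removal
    return max(vals)                # highest concentration left on site

samples = [400, 120, 90, 60, 45, 30, 25, 20, 15, 10]   # mg/kg, hypothetical
print(hot_spot_ntec(samples, prg=50))
```

In this toy data set the 400 mg/kg hot spot alone drives the average above the PRG of 50; once it is removed, the mean is below the PRG, so concentrations up to the next-highest value can remain on site.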

Schulz, T W; Griffin, S

2001-02-01

102

Diagnostic Methods for Bile Acid Malabsorption in Clinical Practice  

PubMed Central

Altered bile acid (BA) concentrations in the colon may cause diarrhea or constipation. BA malabsorption (BAM) accounts for >25% of patients with irritable bowel syndrome (IBS) with diarrhea and chronic diarrhea in Western countries. As BAM is increasingly recognized, proper diagnostic methods are desired in clinical practice to help direct the most effective treatment course for the chronic bowel dysfunction. This review appraises the methodology, advantages and disadvantages of 4 tools that directly measure BAM: 14C-glycocholate breath and stool test, 75Selenium HomotauroCholic Acid Test (SeHCAT), 7α-hydroxy-4-cholesten-3-one (C4) and fecal BAs. 14C-glycocholate is a laborious test no longer widely utilized. 75SeHCAT is validated, but not available in the United States. Serum C4 is a simple, accurate method that is applicable to a majority of patients, but requires further clinical validation. Fecal measurements to quantify total and individual fecal BAs are technically cumbersome and not widely available. Regrettably, none of these tests are routinely available in the U.S., and a therapeutic trial with a BA binder is used as a surrogate for diagnosis of BAM. Recent data suggest there is an advantage to studying fecal excretion of the individual BAs and their role in BAM; this may constitute a significant advantage of the fecal BA method over the other tests. The fecal BA test could become a routine addition to fecal fat measurement in patients with unexplained diarrhea. In summary, availability determines the choice of test among C4, SeHCAT and fecal BA; more widespread availability of such tests would enhance clinical management of these patients.

Vijayvargiya, Priya; Camilleri, Michael; Shin, Andrea; Saenger, Amy

2013-01-01

103

A practical method for determining organ dose during CT examination.  

PubMed

A practical method, based on depth dose, for determining organ dose during computed tomography (CT) examination is presented. For 4-slice spiral CT scans, performed at radii of 0, 37.5, 75.0, 112.5, and 150.0 mm, measurement of depth dose has been made using thermoluminescent dosimeters (TLDs) inserted into a modified International Electrotechnical Commission (IEC) standard dosimetry phantom, with additional TLDs placed on the surface of the phantom. A regression equation linking dose with distance from the center of the phantom has been formulated, from which dose to a point of interest relative to the surface dose can be calculated. The approximation reflects the attenuation properties of X-rays in the phantom. Using the equation, an estimate of organ dose can be ascertained for CT examination, assuming water equivalence of human tissue and a known organ position and volume. Using the 4-slice spiral scanner, relative doses to a patient's lung have been calculated, the location and size of the lung in vivo being found from the CT scan image, and the lung being divided into 38 segments to calculate the relative dose. Results from our test case show the dose to the lung to have been 69 ± 13% of surface dose. PMID:16979343
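The regression step described above can be sketched as a log-linear fit of relative dose against depth below the surface, consistent with roughly exponential X-ray attenuation in a water-equivalent phantom. The measurement radii match those quoted in the abstract, but the dose values below are hypothetical placeholders, not the paper's TLD readings.

```python
import numpy as np

# Hypothetical TLD readings at the radii quoted in the abstract (0-150 mm from
# the phantom center). Attenuation suggests dose falls roughly exponentially
# with depth below the surface, so fit a log-linear model and use it to
# estimate relative dose at an arbitrary organ radius.

radii = np.array([0.0, 37.5, 75.0, 112.5, 150.0])   # mm from center
dose  = np.array([0.42, 0.51, 0.63, 0.80, 1.00])    # fraction of surface dose (hypothetical)

depth = radii.max() - radii                          # depth below the surface
slope, intercept = np.polyfit(depth, np.log(dose), 1)

def relative_dose(r_mm):
    """Estimated dose at radius r_mm as a fraction of surface dose."""
    return float(np.exp(intercept + slope * (radii.max() - r_mm)))

print(relative_dose(100.0))   # e.g., a point 50 mm below the surface
```

Summing such point estimates over the segments of a mapped organ (the paper uses 38 lung segments) would give the organ-averaged relative dose.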

Cheung, Tsang; Cheng, Qijun; Feng, Dinghua

2007-02-01

104

A Meta-Analysis of Published School Social Work Practice Studies: 1980-2007  

ERIC Educational Resources Information Center

Objective: This systematic review examined the effectiveness of school social work practices using meta-analytic techniques. Method: Hierarchical linear modeling software was used to calculate overall effect size estimates as well as test for between-study variability. Results: A total of 21 studies were included in the final analysis.…
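The pooling step behind such a meta-analysis can be sketched as inverse-variance weighting of per-study effect sizes, with Cochran's Q as a simple check on between-study variability (the review itself used hierarchical linear modeling, a more general approach). The effect sizes and variances below are illustrative, not values from the cited review.

```python
import math

# Minimal sketch of effect-size pooling: combine per-study effects by
# inverse-variance weighting and compute Cochran's Q as a test of
# between-study variability. Inputs are illustrative.

def pool_effects(effects, variances):
    w = [1.0 / v for v in variances]
    pooled = sum(wi * d for wi, d in zip(w, effects)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    q = sum(wi * (d - pooled) ** 2 for wi, d in zip(w, effects))
    return pooled, se, q

effects   = [0.35, 0.10, 0.52, 0.28]   # hypothetical standardized effects
variances = [0.02, 0.05, 0.04, 0.03]   # their sampling variances
pooled, se, q = pool_effects(effects, variances)
print(round(pooled, 3), round(se, 3), round(q, 3))
```

A Q statistic large relative to a chi-square with (number of studies - 1) degrees of freedom would motivate the random-effects or hierarchical modeling the review applied.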

Franklin, Cynthia; Kim, Johnny S.; Tripodi, Stephen J.

2009-01-01

105

Nursing documentation and nursing practice: a discourse analysis.  

PubMed

Nursing documentation exists as a daily reality of nurses' work. It is interpreted by some as the evidence of nursing actions and dismissed by others as a misrepresentation of nursing care. This paper reports on a study of nursing documentation as nursing practice. The work of Foucault and discourse analysis provide a research design for examination of how written descriptions of patient events taken from patient case notes result from hegemonic influences that construct a knowledge and therefore a practice of nursing. Discourses as ways of understanding knowledge as language, social practices and power relations are used to identify how nursing documentation functions as a manifestation and ritual of power relations. A focus on body work and fragmented bodies provided details of nursing's participation in the discursive construction of the object patient and invisible nurse. It is through resistances to documentation that alternative knowledge of nursing exists. PMID:8807383

Heartfield, M

1996-07-01

106

Method of photon spectral analysis  

DOEpatents

A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV) as well as high-energy γ rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.
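The linear-least-squares spectral fitting mentioned above can be sketched as solving for component amplitudes when a measured spectrum is modeled as a linear combination of known component response shapes. The shapes below (two Gaussian peaks plus a flat background) and all amplitudes are synthetic illustrations, not the patent's calibration data.

```python
import numpy as np

# Minimal sketch of linear least-squares spectral fitting: model a measured
# photon spectrum as a linear combination of known component shapes (here two
# Gaussian peaks plus a flat background) and solve for the amplitudes.

chans = np.arange(200)                      # channel numbers

def peak(center, width):
    return np.exp(-0.5 * ((chans - center) / width) ** 2)

# Design matrix: one column per fitted component.
A = np.column_stack([peak(60, 5), peak(130, 8),
                     np.ones_like(chans, dtype=float)])

true_amps = np.array([500.0, 220.0, 12.0])  # synthetic component amplitudes
measured = A @ true_amps                    # noiseless synthetic spectrum

amps, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(np.round(amps, 1))
```

In a real analyzer the columns would be measured or calculated detector responses for each x-ray line, and counting-statistics weights would be applied to the fit.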

Gehrke, Robert J. (Idaho Falls, ID); Putnam, Marie H. (Idaho Falls, ID); Killian, E. Wayne (Idaho Falls, ID); Helmer, Richard G. (Idaho Falls, ID); Kynaston, Ronnie L. (Blackfoot, ID); Goodwin, Scott G. (Idaho Falls, ID); Johnson, Larry O. (Pocatello, ID)

1993-01-01

107

Method of photon spectral analysis  

DOEpatents

A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) as well as high-energy γ rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.

Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

1993-04-27

108

Content Analysis as a Best Practice in Technical Communication Research  

ERIC Educational Resources Information Center

Content analysis is a powerful empirical method for analyzing text, a method that technical communicators can use on the job and in their research. Content analysis can expose hidden connections among concepts, reveal relationships among ideas that initially seem unconnected, and inform the decision-making processes associated with many technical…

Thayer, Alexander; Evans, Mary; McBride, Alicia; Queen, Matt; Spyridakis, Jan

2007-01-01

109

Practical Aspects of the Equation-Error Method for Aircraft Parameter Estimation  

NASA Technical Reports Server (NTRS)

Various practical aspects of the equation-error approach to aircraft parameter estimation were examined. The analysis was based on simulated flight data from an F-16 nonlinear simulation, with realistic noise sequences added to the computed aircraft responses. This approach exposes issues related to the parameter estimation techniques and results, because the true parameter values are known for simulation data. The issues studied include differentiating noisy time series, maximum likelihood parameter estimation, biases in equation-error parameter estimates, accurate computation of estimated parameter error bounds, comparisons of equation-error parameter estimates with output-error parameter estimates, analyzing data from multiple maneuvers, data collinearity, and frequency-domain methods.

Morelli, Eugene A.

2006-01-01

110

Empowerment in social work practice with the psychiatrically disabled: Model and method  

Microsoft Academic Search

Empowerment of those clients disempowered by society is explored as both a philosophy to inform social work practice and as a continuing method of practice in itself. A three-part model of empowerment at individual, interpersonal, and group levels is presented and is then illustrated by application to a case study of psychiatric disability. The generalist model of practice is seen

Arnold Kruger

2000-01-01

111

Methods used by Dental Practice-Based Research Network (DPBRN) dentists to diagnose dental caries  

PubMed Central

Objectives To (1) identify the methods that dentists in The Dental Practice-Based Research Network (DPBRN) use to diagnose dental caries; (2) quantify their frequency of use; and (3) test the hypothesis that certain dentist and dental practice characteristics are significantly associated with their use. Methods A questionnaire about methods used for caries diagnosis was sent to DPBRN dentists who reported doing at least some restorative dentistry; 522 dentists participated. Questions included use of dental radiographs, dental explorer, laser fluorescence, air-drying, fiber optic devices, and magnification, as used when diagnosing primary, secondary/recurrent, or non-specific caries lesions. Variations on the frequency of their use were tested using multivariate analysis and Bonferroni tests. Results Overall, the dental explorer was the instrument most commonly used to detect primary occlusal caries as well as to detect caries at the margins of existing restorations. In contrast, laser fluorescence was rarely used to help diagnose occlusal primary caries. For proximal caries, radiographs were used to help diagnose 75-100% of lesions by 96% of the DPBRN dentists. Dentists who use radiographs most often to assess proximal surfaces of posterior teeth, were significantly more likely to also report providing a higher percentage of patients with individualized caries prevention (p = .040) and seeing a higher percentage of pediatric patients (p = .001). Conclusion Use of specific diagnostic methods varied substantially. The dental explorer and radiographs are still the most commonly used diagnostic methods.
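The Bonferroni correction used above when testing many practice-characteristic associations at once can be sketched in a few lines: each raw p-value is compared against alpha divided by the number of tests. The p-values below are illustrative, not results from the DPBRN survey.

```python
# Minimal sketch of the Bonferroni adjustment for multiple comparisons:
# each raw p-value must beat alpha / m, where m is the number of tests.
# The p-values are illustrative.

def bonferroni(pvals, alpha=0.05):
    m = len(pvals)
    return [p < alpha / m for p in pvals]

raw_p = [0.001, 0.040, 0.012, 0.300]
print(bonferroni(raw_p))          # threshold is 0.05 / 4 = 0.0125
```

Note how 0.040, nominally significant on its own, fails the adjusted threshold; this is the conservatism that protects the family-wise error rate across the many dentist and practice characteristics tested.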

Gordan, Valeria V.; Riley, Joseph L; Carvalho, Ricardo M.; Snyder, John; Sanderson, James L; Anderson, Mary; Gilbert, Gregg H.

2010-01-01

112

A Mixed-Method Approach to Investigating the Adoption of Evidence-Based Pain Practices in Nursing Homes  

PubMed Central

This mixed methods study examined perceived facilitators and obstacles to adopting evidence-based pain management protocols vis-a-vis documented practice changes that were measured using a chart audit tool. This analysis used data from a subgroup of four nursing homes that participated in a clinical trial. Focus group interviews with staff yielded qualitative data about perceived factors that affected their willingness and ability to use the protocols. Chart audits determined whether pain assessment and management practices changed over time in light of these reported facilitators and barriers. Reported facilitators included administrative support, staff consistency, and policy and procedure changes. Barriers were staff attitudes, regulatory issues, and provider mistrust of nurses’ judgment. Overall, staff reported improvements in pain practices. These reports were corroborated by modest but significant increases in adherence to recommended practices. Change in clinical practice is complex and requires attention to both structural and process aspects of care.

Ersek, Mary; Jablonski, Anita

2014-01-01

113

Flow methods in chiral analysis.  

PubMed

The methods used for the separation and analytical determination of individual isomers are based on interactions with substances exhibiting optical activity. The currently used methods for the analysis of optically active compounds are primarily high-performance separation methods, such as gas and liquid chromatography using chiral stationary phases or chiral selectors in the mobile phase, and highly efficient electromigration techniques, such as capillary electrophoresis using chiral selectors. Chemical sensors and biosensors may also be designed for the analysis of optically active compounds. As enantiomers of the same compound are characterised by almost identical physico-chemical properties, their differentiation/separation in one-step unit operation in steady-state or dynamic flow systems requires the use of highly effective chiral selectors. Examples of such determinations are reviewed in this paper, based on 105 references. The greatest successes for isomer determination involve immunochemical interactions, enantioselectivity of the enzymatic biocatalytic processes, and interactions with ion-channel receptors or molecularly imprinted polymers. Conducting such processes under dynamic flow conditions may significantly enhance the differences in the kinetics of such processes, leading to greater differences in the signals recorded for enantiomers. Such determinations in flow conditions are effectively performed using surface-plasmon resonance and piezoelectric detections, as well as using common spectroscopic and electrochemical detections. PMID:24139575

Trojanowicz, Marek; Kaniewska, Marzena

2013-11-01

114

Coal Field Fire Fighting - Practiced methods, strategies and tactics  

NASA Astrophysics Data System (ADS)

Subsurface coal fires destroy millions of tons of coal each year, have an immense impact on the ecological surroundings and threaten further coal reservoirs. Because of the enormous dimensions a coal seam fire can develop, high operational expenses are incurred. As part of the Sino-German coal fire research initiative "Innovative technologies for exploration, extinction and monitoring of coal fires in Northern China", the research team of the University of Wuppertal (BUW) focuses on fire extinction strategies and tactics as well as aspects of environmental and health safety. Besides the choice and the correct application of different extinction techniques, further factors are essential for successful extinction. Appropriate tactics, well-trained and protected personnel and the choice of the best-fitting extinguishing agents are necessary for the successful extinction of a coal seam fire. The strategy chosen for an extinction campaign is generally determined by urgency and importance. It may depend on national objectives and concepts of coal conservation; on environmental protection (e.g. commitments to greenhouse gas (GHG) reductions); on national funding and resources for fire fighting (e.g. personnel, infrastructure, vehicles, water pipelines); and on computer-aided models and simulations of coal fire development from self-ignition to extinction. In order to devise an optimal fire fighting strategy, "aims of protection" have to be defined in a first step. These may be: - directly affected coal seams; - neighboring seams and coalfields; - GHG emissions into the atmosphere; - returns on investments (costs of fire fighting compared to the value of saved coal). In a further step, it is imperative to decide whether the budget shall define the results, or the results define the budget; i.e., whether there are fixed objectives for the mission that will dictate the overall budget, or whether the limited resources available shall set the scope within which the best possible results are to be achieved. Effective and efficient fire fighting requires optimal tactics, which can be divided into four fundamental approaches to controlling fire hazards: - defense (digging away the coal so that it cannot begin to burn, or forming a barrier so that the fire cannot reach the unburnt coal); - rescue of the coal (mining of a seam that is not burning); - attack (active and direct cooling of a burning seam); - retreat (monitoring only, until self-extinction of a burning seam). The last is used when a fire exceeds the organizational and/or technical scope of a mission. In other words, "to control a coal fire" does not automatically and in all situations mean "to extinguish a coal fire". Best-practice tactics, or a combination of them, can be selected to control a particular coal fire. For the extinguishing works, different extinguishing agents are available. They can be applied with different application techniques and varying operating expenses. One application method may be the drilling of boreholes from the surface, or covering the surface with low-permeability soils. The extinguishing agents mainly used for coal field fires are as follows: water (with or without additives), slurry, foaming mud/slurry, inert gases, dry chemicals and materials, and cryogenic agents. Because of its tremendous dimensions and complexity, the worldwide challenge of coal fires is absolutely unique: it can only be met with functional application methods, best-fitting strategies and tactics, organisation and research, as well as the dedication of the involved fire fighters, who work under extreme individual risk on the burning coal fields.

Wündrich, T.; Korten, A. A.; Barth, U. H.

2009-04-01

115

Parallel Processable Cryptographic Methods with Unbounded Practical Security.  

ERIC Educational Resources Information Center

Addressing the problem of protecting confidential information and data stored in computer databases from access by unauthorized parties, this paper details coding schemes which present such astronomical work factors to potential code breakers that security breaches are hopeless in any practical sense. Two procedures which can be used to encode for…

Rothstein, Jerome

116

Teaching the Best Practice Way: Methods That Matter, K-12  

ERIC Educational Resources Information Center

Everyone talks about "best practice" teaching--what does it actually look like in the classroom? How do working teachers translate complex curriculum standards into simple, workable classroom structures that embody exemplary instruction--and still let kids find joy in learning? In this book, the authors present seven basic teaching structures that…

Daniels, Harvey; Bizar, Marilyn

2004-01-01

117

Learning by the Case Method: Practical Approaches for Community Leaders.  

ERIC Educational Resources Information Center

This supplement to Volunteer Training and Development: A Manual for Community Groups, provides practical guidance in the selection, writing, and adaptation of effective case materials for specific educational objectives, and develops suitable cases for use by analyzing concrete situations and by offering illustrations of various types. An…

Stenzel, Anne K.; Feeney, Helen M.

118

The practical implementation of integrated safety management for nuclear safety analysis and fire hazards analysis documentation  

SciTech Connect

In 1995 Mr. Joseph DiNunno of the Defense Nuclear Facilities Safety Board issued an approach to describe the concept of an integrated safety management program which incorporates hazard and safety analysis to address a multitude of hazards affecting the public, worker, property, and the environment. Since then the U.S. Department of Energy (DOE) has adopted a policy to systematically integrate safety into management and work practices at all levels so that missions can be completed while protecting the public, worker, and the environment. While the DOE and its contractors possessed a variety of processes for analyzing fire hazards at a facility, activity, and job, the outcomes and assumptions of these processes have not always been consistent for similar types of hazards within the safety analysis and the fire hazard analysis. Although the safety analysis and the fire hazard analysis are driven by different DOE Orders and requirements, these analyses should not be entirely independent, and their preparation should be integrated to ensure consistency of assumptions, consequences, design considerations, and other controls. Under the DOE policy to implement an integrated safety management system, identified hazards must be evaluated and agreed upon to ensure that the public, the workers, and the environment are protected from adverse consequences. The DOE program and contractor management need a uniform, up-to-date reference with which to plan, budget, and manage nuclear programs. It is crucial that DOE understand the hazards and risks necessary to authorize the work to be performed. If integrated safety management is not incorporated into the preparation of the safety analysis and the fire hazard analysis, inconsistencies between assumptions, consequences, design considerations, and controls may occur that affect safety. Furthermore, the confusion created by such inconsistencies may carry over into the DOE process for granting authorization of the work.
In accordance with the integrated safety management system approach, and to provide a uniform and consistent process, a method has been suggested by the U.S. Department of Energy at Richland and in the Project Hanford Procedures for cases when fire hazard analyses and safety analyses are required. This process provides a common-basis approach to the development of the fire hazard analysis and the safety analysis, permitting the preparers of both documents to participate jointly in the hazard analysis process. This paper presents this method of implementing the integrated safety management approach in the development of the fire hazard analysis and the safety analysis, providing the consistency of assumptions, consequences, design considerations, and other controls necessary to protect workers, the public, and the environment.

COLLOPY, M.T.

1999-05-04

119

Voltammetric analysis apparatus and method  

DOEpatents

An apparatus and method are described for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container holding a sample solution. The working electrodes are spaced evenly apart from each other and from the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements of the current flow through each of the working electrodes at each potential in a potential range are used to identify the chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of the chemical elements present in the sample solution.
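The multi-electrode idea can be sketched as follows: replicate current-vs-potential traces from several working electrodes are averaged to suppress electrode-to-electrode noise, and the resulting peak potential is matched against characteristic potentials. Everything below is synthetic; the traces, noise model, and the small library of peak potentials are illustrative assumptions, not the patent's data.

```python
import numpy as np

# Minimal sketch: average voltammograms recorded simultaneously at five
# working electrodes, locate the current peak, and match its potential
# against a hypothetical library of characteristic peak potentials.

potentials = np.linspace(-1.0, 1.0, 401)          # V vs. reference electrode
LIBRARY = {"Pb": -0.43, "Cd": -0.60, "Cu": 0.02}  # illustrative peak potentials

rng = np.random.default_rng(0)

def electrode_trace(peak_v):
    """One noisy voltammogram: a Gaussian current peak plus electrode noise."""
    clean = np.exp(-((potentials - peak_v) / 0.05) ** 2)
    return clean + rng.normal(0.0, 0.05, potentials.size)

traces = np.array([electrode_trace(-0.43) for _ in range(5)])  # five electrodes
mean_trace = traces.mean(axis=0)                  # averaging suppresses noise
peak_v = potentials[np.argmax(mean_trace)]

element = min(LIBRARY, key=lambda el: abs(LIBRARY[el] - peak_v))
print(element)
```

Averaging five replicate electrodes cuts the noise standard deviation by roughly the square root of five, which is one way multiple working electrodes make the identification "more positive" than a single trace.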

Almon, A.C.

1993-06-08

120

A comparative analysis of the use of work-life balance practices in Europe : Do practices enhance females’ career advancement?  

Microsoft Academic Search

Purpose – The objectives of this study are: to identify and compare companies' involvement with work-life balance practices and policies in 14 European countries, and to test whether these practices actually enhance the career advancement of women to senior management positions. Design/methodology/approach – A comparative descriptive analysis shows differences in work-life balance practices and policies and women's participation in the

Caroline Straub

2007-01-01

121

Practical design methods for barrier pillars. Information circular/1995  

SciTech Connect

Effective barrier pillar design is essential for safe and productive underground coal mining. This U.S. Bureau of Mines report presents an overview of available barrier pillar design methodologies that incorporate sound engineering principles while remaining practical for everyday usage. Nomographs and examples are presented to assist in the determination of proper barrier pillar sizing. Additionally, performance evaluation techniques and criteria are included to assist in determining the effectiveness of selected barrier pillar configurations.

Koehler, J.R.; Tadolini, S.C.

1995-11-01

122

Canonical Correlation Analysis: An Explanation with Comments on Correct Practice.  

ERIC Educational Resources Information Center

This paper briefly explains the logic underlying the basic calculations employed in canonical correlation analysis. A small hypothetical data set is employed to illustrate that canonical correlation analysis subsumes both univariate and multivariate parametric methods. Several real data sets are employed to illustrate other themes. Three common…
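The basic calculation the paper explains can be shown in a few lines. This is the standard QR-plus-SVD computation of canonical correlations on an invented two-set data matrix (a shared latent signal plus noise), not Thompson's data or examples.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
latent = rng.normal(size=n)                  # signal shared by both variable sets
X = np.column_stack([latent + 0.5 * rng.normal(size=n), rng.normal(size=n)])
Y = np.column_stack([latent + 0.5 * rng.normal(size=n), rng.normal(size=n)])

def canonical_correlations(X, Y):
    """Canonical correlations: singular values of Qx'Qy after centering."""
    qx, _ = np.linalg.qr(X - X.mean(axis=0))
    qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    return np.linalg.svd(qx.T @ qy, compute_uv=False)

rho = canonical_correlations(X, Y)
print("canonical correlations:", np.round(rho, 3))
```

With two variables per set, the first canonical correlation recovers the shared latent signal (about 0.8 here) and the second reflects only noise.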

Thompson, Bruce

123

Cluster analysis in community research: Epistemology and practice  

Microsoft Academic Search

Cluster analysis refers to a family of methods for identifying cases with distinctive characteristics in heterogeneous samples and combining them into homogeneous groups. This approach provides a great deal of information about the types of cases and the distributions of variables in a sample. This paper considers cluster analysis as a quantitative complement to the traditional linear statistics that often
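The family of methods the abstract describes can be represented by its simplest member. The toy k-means pass below groups invented 2-D "cases" into two homogeneous clusters; it stands in for the broader family (hierarchical, model-based, etc.) the paper surveys.

```python
import numpy as np

rng = np.random.default_rng(7)
cases = np.vstack([rng.normal([0, 0], 0.5, size=(20, 2)),   # one "type" of case
                   rng.normal([4, 4], 0.5, size=(20, 2))])  # a second type

centroids = cases[[0, -1]].astype(float)   # crude initialisation
for _ in range(10):
    # Assign each case to its nearest centroid, then recompute centroids.
    dists = np.linalg.norm(cases[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    centroids = np.array([cases[labels == k].mean(axis=0) for k in range(2)])

print("cluster sizes:", np.bincount(labels))
print("centroids:", np.round(centroids, 2))
```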

Bruce D. Rapkin; Douglas A. Luke

1993-01-01

124

Analysis of flight equipment purchasing practices of representative air carriers  

NASA Technical Reports Server (NTRS)

The process through which representative air carriers decide whether or not to purchase flight equipment was investigated, as well as their practices and policies for retiring surplus aircraft. An analysis of the flight equipment investment decision process in ten airlines shows that for the airline industry as a whole, the flight equipment investment decision is in a state of transition from a wholly informal process in its earliest years to a much more organized and structured process in the future. Individual air carriers are in different stages with respect to the formality and sophistication associated with the flight equipment investment decision.

1977-01-01

125

The development of a 3D risk analysis method  

Microsoft Academic Search

Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have occurred in the process industries. Owing to its calculation complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk

Yet-Pole I; Te-Lung Cheng

2008-01-01

126

A Practical Guide to Interpretation of Large Collections of Incident Narratives Using the QUORUM Method  

NASA Technical Reports Server (NTRS)

Analysis of incident reports plays an important role in aviation safety. Typically, a narrative description, written by a participant, is a central part of an incident report. Because there are so many reports, and the narratives contain so much detail, it can be difficult to efficiently and effectively recognize patterns among them. Recognizing and addressing recurring problems, however, is vital to continuing safety in commercial aviation operations. A practical way to interpret large collections of incident narratives is to apply the QUORUM method of text analysis, modeling, and relevance ranking. In this paper, QUORUM text analysis and modeling are surveyed, and QUORUM relevance ranking is described in detail with many examples. The examples are based on several large collections of reports from the Aviation Safety Reporting System (ASRS) database, and a collection of news stories describing the disaster of TWA Flight 800, the Boeing 747 which exploded in mid-air and crashed near Long Island, New York, on July 17, 1996. Reader familiarity with this disaster should make the relevance-ranking examples more understandable. The ASRS examples illustrate the practical application of QUORUM relevance ranking.
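The relevance-ranking idea can be sketched with a deliberately simpler stand-in. QUORUM's actual models are built from term co-occurrence; the toy below ranks a few invented "narratives" against a query by cosine similarity of word counts, which conveys the rank-by-relevance workflow without reproducing McGreevy's method.

```python
from collections import Counter
import math

def vectorize(text):
    """Bag-of-words vector for a narrative."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

narratives = [
    "aircraft deviated from assigned altitude during descent",
    "fuel quantity indication failed during preflight inspection",
    "altitude deviation occurred after autopilot disconnect",
]
query = "altitude deviation"

# Rank narratives by relevance to the query, most relevant first.
ranked = sorted(narratives,
                key=lambda n: cosine(vectorize(query), vectorize(n)),
                reverse=True)
for n in ranked:
    print(round(cosine(vectorize(query), vectorize(n)), 3), n)
```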

McGreevy, Michael W.

1997-01-01

127

Methods for Cancer Epigenome Analysis  

PubMed Central

Accurate detection of epimutations in tumor cells is crucial for understanding the molecular pathogenesis of cancer. Alterations in DNA methylation in cancer are functionally important and clinically relevant, but even this well-studied area is continually re-evaluated in light of unanticipated results, including a strong connection between aberrant DNA methylation in adult tumors and polycomb group profiles in embryonic stem cells, cancer-associated genetic mutations in epigenetic regulators such as DNMT3A and TET family genes, and the discovery of abundant 5-hydroxymethylcytosine, a product of TET proteins acting on 5-methylcytosine, in human tissues. The abundance and distribution of covalent histone modifications in primary cancer tissues relative to normal cells is a largely uncharted area, although there is good evidence for a mechanistic role of cancer-specific alterations in epigenetic marks in tumor etiology, drug response and tumor progression. Meanwhile, the discovery of new epigenetic marks continues, and there are many useful methods for epigenome analysis applicable to primary tumor samples, in addition to cancer cell lines. For DNA methylation and hydroxymethylation, next-generation sequencing allows increasingly inexpensive and quantitative whole-genome profiling. Similarly, the refinement and maturation of chromatin immunoprecipitation with next-generation sequencing (ChIP-seq) has made possible genome-wide mapping of histone modifications, open chromatin and transcription factor binding sites. Computational tools have been developed apace with these epigenome methods to better enable the accuracy and interpretation of the data from the profiling methods.

Nagarajan, Raman P.; Fouse, Shaun D.; Bell, Robert J.A.; Costello, Joseph F.

2014-01-01

128

Assessing Student Perception of Practice Evaluation Knowledge in Introductory Research Methods  

ERIC Educational Resources Information Center

The authors explored the use of the Practice Evaluation Knowledge Scale (PEKS) to assess student perception of acquisition and retention of practice evaluation knowledge from an undergraduate research methods class. The authors sampled 2 semesters of undergraduate social work students enrolled in an introductory research methods course.…

Baker, Lisa R.; Pollio, David E.; Hudson, Ashley

2011-01-01

129

Estimating free-living human energy expenditure: Practical aspects of the doubly labeled water method and its applications.  

PubMed

The accuracy and noninvasive nature of the doubly labeled water (DLW) method makes it ideal for the study of human energy metabolism in free-living conditions. However, the DLW method is not always practical in many developing and Asian countries because of the high costs of isotopes and equipment for isotope analysis as well as the expertise required for analysis. This review provides information about the theoretical background and practical aspects of the DLW method, including optimal dose, basic protocols of two- and multiple-point approaches, experimental procedures, and isotopic analysis. We also introduce applications of DLW data, such as determining the equations of estimated energy requirement and validation studies of energy intake. PMID:24944767
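The "two-point" approach the review mentions reduces to a small calculation: isotope elimination rates from enrichments at the start and end of the measurement period, whose difference reflects CO2 production. All numbers below are invented, and the rCO2 expression is the simplest uncorrected Lifson-style form; published studies use corrected equations (e.g. Schoeller's) with isotope dilution-space adjustments.

```python
import math

def elimination_rate(e_start, e_end, days):
    """Two-point rate constant (per day) from start/end isotope enrichments."""
    return (math.log(e_start) - math.log(e_end)) / days

# Invented enrichment data (permil excess) over a 14-day measurement.
kO = elimination_rate(300.0, 60.0, 14)    # oxygen-18 leaves as water and CO2
kD = elimination_rate(250.0, 80.0, 14)    # deuterium leaves as water only

N = 2200.0                                 # total body water, mol (assumed)
rCO2 = (N / 2) * (kO - kD)                 # mol CO2/day, uncorrected Lifson form
print(f"kO = {kO:.4f}/day, kD = {kD:.4f}/day, rCO2 = {rCO2:.1f} mol/day")
```

CO2 production is then converted to energy expenditure with an assumed food quotient or respiratory quotient, which is where the review's "equations of estimated energy requirement" enter.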

Park, Jonghoon; Kazuko, Ishikawa-Takata; Kim, Eunkyung; Kim, Jeonghyun; Yoon, Jinsook

2014-06-01

130

Estimating free-living human energy expenditure: Practical aspects of the doubly labeled water method and its applications  

PubMed Central

The accuracy and noninvasive nature of the doubly labeled water (DLW) method makes it ideal for the study of human energy metabolism in free-living conditions. However, the DLW method is not always practical in many developing and Asian countries because of the high costs of isotopes and equipment for isotope analysis as well as the expertise required for analysis. This review provides information about the theoretical background and practical aspects of the DLW method, including optimal dose, basic protocols of two- and multiple-point approaches, experimental procedures, and isotopic analysis. We also introduce applications of DLW data, such as determining the equations of estimated energy requirement and validation studies of energy intake.

Kazuko, Ishikawa-Takata; Kim, Eunkyung; Kim, Jeonghyun; Yoon, Jinsook

2014-01-01

131

Computational methods for global/local analysis  

NASA Technical Reports Server (NTRS)

Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

1992-01-01

132

Articulating current service development practices: a qualitative analysis of eleven mental health projects  

PubMed Central

Background The utilisation of good design practices in the development of complex health services is essential to improving quality. Healthcare organisations, however, are often seriously out of step with modern design thinking and practice. As a starting point to encourage the uptake of good design practices, it is important to understand the context of their intended use. This study aims to do that by articulating current health service development practices. Methods Eleven service development projects carried out in a large mental health service were investigated through in-depth interviews with six operation managers. The critical decision method in conjunction with diagrammatic elicitation was used to capture descriptions of these projects. Stage-gate design models were then formed to visually articulate, classify and characterise different service development practices. Results Projects were grouped into three categories according to design process patterns: new service introduction and service integration; service improvement; service closure. Three common design stages (problem exploration, idea generation and solution evaluation) were then compared across the design process patterns. Consistent across projects were a top-down, policy-driven approach to exploration, underexploited idea generation and implementation-based evaluation. Conclusions This study provides insight into where and how good design practices can contribute to the improvement of current service development practices. Specifically, the following suggestions for future service development practices are made: genuine user needs analysis for exploration; divergent thinking and innovative culture for idea generation; and fail-safe evaluation prior to implementation. Better training for managers through partnership working with design experts and researchers could be beneficial.

2014-01-01

133

A practical method for solving large-scale TRS  

Microsoft Academic Search

We present a nearly-exact method for the large scale trust region subproblem (TRS) based on the properties of the minimal-memory BFGS method. Our study is concentrated in the case where the initial BFGS matrix can be any scaled identity matrix. The proposed method is a variant of the Moré–Sorensen method that exploits the eigenstructure of the approximate Hessian B, and

M. S. Apostolopoulou; D. G. Sotiropoulos; C. A. Botsaris; Panayiotis E. Pintelas

2011-01-01

134

Flow analysis system and method  

NASA Technical Reports Server (NTRS)

A non-invasive flow analysis system and method wherein a sensor, such as an acoustic sensor, is coupled to a conduit for transmitting a signal which varies depending on the characteristics of the flow in the conduit. The signal is amplified and there is a filter, responsive to the sensor signal, and tuned to pass a narrow band of frequencies proximate the resonant frequency of the sensor. A demodulator generates an amplitude envelope of the filtered signal and a number of flow indicator quantities are calculated based on variations in amplitude of the amplitude envelope. A neural network, or its equivalent, is then used to determine the flow rate of the flow in the conduit based on the flow indicator quantities.
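The signal chain in this abstract (band-pass filter near the sensor resonance, amplitude envelope, envelope-derived flow indicators) can be sketched numerically. Everything specific below is invented: the 2 kHz resonance, the amplitude-vs-flow model, and the final step, which here recovers a sine amplitude analytically where the patent uses a trained neural network.

```python
import numpy as np

fs = 20_000                                  # sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)
true_flow_amplitude = 1.6                    # sensor amplitude at this flow (invented)

rng = np.random.default_rng(1)
carrier = np.sin(2 * np.pi * 2000 * t)       # sensor resonance at 2 kHz (assumed)
raw = true_flow_amplitude * carrier + 0.1 * rng.normal(size=t.size)

# Narrow-band filter around the resonant frequency via FFT masking.
spec = np.fft.rfft(raw)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
spec[(freqs < 1900) | (freqs > 2100)] = 0
filtered = np.fft.irfft(spec, n=t.size)

# Amplitude envelope: rectify and smooth over ~20 carrier cycles.
kernel = np.ones(200) / 200
envelope = np.convolve(np.abs(filtered), kernel, mode="same")

# A "flow indicator quantity": the mean envelope. For a rectified sine of
# amplitude A the mean is 2A/pi, so the amplitude can be recovered; the
# patent instead maps such indicators to flow rate with a neural network.
estimated_amplitude = envelope.mean() * np.pi / 2
print(f"estimated sensor amplitude: {estimated_amplitude:.2f}")
```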

Hill, Wayne S. (Inventor); Barck, Bruce N. (Inventor)

1998-01-01

135

Propel: Tools and Methods for Practical Source Code Model Checking  

NASA Technical Reports Server (NTRS)

The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA s mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset is called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side-effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.

Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; O'Malley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem

2003-01-01

136

A deliberate practice approach to teaching phylogenetic analysis.  

PubMed

One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

Hobbs, F Collin; Johnson, Daniel J; Kearns, Katherine D

2013-01-01

137

A Deliberate Practice Approach to Teaching Phylogenetic Analysis  

PubMed Central

One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or “one-shot,” in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts.

Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

2013-01-01

138

A Reliable Indexing Method for a Practical QA System  

Microsoft Academic Search

We propose a fast and reliable question-answering (QA) system in Korean, which uses a predictive answer indexer based on a 2-pass scoring method. The indexing process is as follows. The predictive answer indexer first extracts all answer candidates in a document. Then, using the 2-pass scoring method, it gives scores to the adjacent content words that are closely

Harksoo Kim; Jungyun Seo

139

A practical method for the numerical evaluation of Sommerfeld integrals  

Microsoft Academic Search

Sommerfeld integrals, which appear frequently in dipole radiation problems, often must be evaluated numerically. Gauss-Laguerre quadrature is an effective integration method, provided the horizontal distance ρ from the source to the receiver is less than their vertical separation ζ. A complementary method is to use Romberg adaptive quadrature to integrate over the positive and negative half-cycles of the integrand, then from a sequence
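Gauss-Laguerre quadrature, the method the abstract recommends for exponentially decaying integrands, is easy to demonstrate. As a tame stand-in for a full Sommerfeld integral, the snippet evaluates the integral of exp(-x)*cos(x) over [0, inf), whose exact value is 1/2.

```python
import numpy as np

# Gauss-Laguerre: integrate exp(-x)*g(x) over [0, inf) as sum w_i * g(x_i).
nodes, weights = np.polynomial.laguerre.laggauss(40)

# Stand-in integrand g(x) = cos(x); exact value of the weighted integral is 1/2.
approx = float(np.sum(weights * np.cos(nodes)))
print(f"Gauss-Laguerre (40 nodes): {approx:.8f}  (exact: 0.5)")
```

Because the exp(-x) weight is built into the rule, only the remaining factor of the integrand is sampled at the nodes, which is what makes the method attractive when the decay dominates the oscillation.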

D. M. Bubenik

1977-01-01

140

A practical guide for analysis of nanoindentation data.  

PubMed

Mechanical properties of biological materials are increasingly explored via nanoindentation testing. This paper reviews the modes of deformation found during indentation: elastic, plastic, viscous and fracture. A scheme is provided for ascertaining which deformation modes are active during a particular indentation test based on the load-displacement trace. Two behavior maps for indentation are presented, one in the viscous-elastic-plastic space, concerning homogeneous deformation, and one in the plastic versus brittle space, concerning the transition to fracture behavior when the threshold for cracking is exceeded. Best-practice methods for characterizing materials are presented based on which deformation modes are active; the discussion includes both nanoindentation experimental test options and appropriate methods for analyzing the resulting data. PMID:19627846
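The elastic analysis step that the review covers can be sketched with the standard Oliver-Pharr relations: the initial unloading slope gives the contact stiffness S, which with the projected contact area A yields hardness and reduced modulus. The power-law constants, depths, and contact area below are all invented for illustration.

```python
import numpy as np

# Synthetic unloading curve P = a*(h - hf)^m (invented constants); h in nm, P in mN.
a, hf, m = 0.0008, 120.0, 1.5
h_max = 500.0
P_max = a * (h_max - hf) ** m            # load at maximum depth, mN

# Contact stiffness S = dP/dh at h_max, from the power-law derivative.
S = a * m * (h_max - hf) ** (m - 1)      # mN/nm

A = 2.0e6                                 # projected contact area, nm^2 (assumed)
H = P_max / A                             # hardness, mN/nm^2 (multiply by 1e6 for GPa)
Er = (np.sqrt(np.pi) / 2) * S / np.sqrt(A)  # reduced modulus, same units

print(f"Pmax = {P_max:.2f} mN, S = {S:.4f} mN/nm")
print(f"H = {H * 1e6:.2f} GPa, Er = {Er * 1e6:.1f} GPa")
```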

Oyen, Michelle L; Cook, Robert F

2009-08-01

141

Methods of Multivariate Commonality Analysis.  

ERIC Educational Resources Information Center

Advantages of the use of multivariate commonality analysis are discussed and a small data set is used to illustrate the analysis and as a model to enable readers to conduct such an analysis. A noteworthy advantage of commonality analysis is that commonality honors the relationships among variables by determining the degree to which predictors in a…
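The partitioning at the heart of commonality analysis can be illustrated numerically on an invented data set: the explained variance of a criterion is split into each predictor's unique contribution plus their common (shared) part, using the standard rules unique(X1) = R2(X1,X2) - R2(X2), unique(X2) = R2(X1,X2) - R2(X1), common = R2(X1) + R2(X2) - R2(X1,X2).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
shared = rng.normal(size=n)               # source of predictor overlap
x1 = shared + rng.normal(size=n)
x2 = shared + rng.normal(size=n)
y = x1 + x2 + rng.normal(size=n)

def r_squared(y, predictors):
    """R^2 of y regressed on the given predictors (with intercept)."""
    X = np.column_stack([np.ones(y.size), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_1 = r_squared(y, [x1])
r2_2 = r_squared(y, [x2])
r2_12 = r_squared(y, [x1, x2])

unique1 = r2_12 - r2_2                    # variance only x1 explains
unique2 = r2_12 - r2_1                    # variance only x2 explains
common = r2_1 + r2_2 - r2_12              # variance the predictors share
print(f"unique(x1)={unique1:.3f} unique(x2)={unique2:.3f} common={common:.3f}")
```

The three components sum exactly to the full-model R2, which is the "honoring the relationships among variables" the abstract refers to.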

Campbell, Kathleen T.

142

Clarifications of the BCU method for transient stability analysis  

Microsoft Academic Search

Energy function methods have been studied for many years, and have been applied to practical power system stability analysis problems of multi-machine power systems. Developments in real-time power system monitoring suggest that dynamic events can be monitored at the power system control centers, and naturally the energy function methods were tried as real-time stability prediction tools. However, a number of

A. Llamas; J. De La Ree Lopez; L. Mili; A. G. Phadke; J. S. Thorp

1995-01-01

143

General practice-based clinical trials in Germany - a problem analysis  

PubMed Central

Background In Germany, clinical trials and comparative effectiveness studies in primary care are still very rare, while their usefulness has been recognised in many other countries. A network of researchers from German academic general practice has explored the reasons for this discrepancy. Methods Based on a comprehensive literature review and expert group discussions, problem analyses as well as structural and procedural prerequisites for a better implementation of clinical trials in German primary care are presented. Results In Germany, basic biomedical science and technology is more reputed than clinical or health services research. Clinical trials are funded by industry or a single national programme, which is highly competitive, specialist-dominated, exclusive of pilot studies, and usually favours innovation rather than comparative effectiveness studies. Academic general practice is still not fully implemented, and existing departments are small. Most general practitioners (GPs) work in a market-based, competitive setting of small private practices, with a high case load. They have no protected time or funding for research, and mostly no research training or experience. Good Clinical Practice (GCP) training is compulsory for participation in clinical trials. The group defined three work packages to be addressed regarding clinical trials in German general practice: (1) problem analysis, and definition of (2) structural prerequisites and (3) procedural prerequisites. Structural prerequisites comprise specific support facilities for general practice-based research networks that could provide practices with a point of contact. Procedural prerequisites consist, for example, of a summary of specific relevant key measures, for example on a web platform. The platform should contain standard operating procedures (SOPs), templates, checklists and other supporting materials for researchers. 
Conclusion All in all, our problem analyses revealed that a substantial number of barriers contribute to the low implementation of clinical research in German general practice. Some issues are deeply rooted in Germany’s market-based healthcare and academic systems and traditions. However, new developments may facilitate change: recent developments in the German research landscape are encouraging.

2012-01-01

144

General practice fundholding: observations on prescribing patterns and costs using the defined daily dose method  

Microsoft Academic Search

OBJECTIVE--To compare prescribing patterns between a group of fundholding practices and a group of non-fundholding practices in north east Scotland using a method which provides more accurate statements about volumes prescribed than standard NHS statistics. DESIGN--The pharmacy practice division of the National Health Service in Scotland provided data for selected British National Formulary sections over two years. Each prescription issued

M Maxwell; D Heaney; J G Howie; S Noble

1993-01-01

145

Grounded action research: a method for understanding IT in practice  

Microsoft Academic Search

This paper shows how the theory development portion of action research can be made more rigorous. The process of theory formulation is an essential part of action research, yet this process is not well understood. A case study demonstrates how units of analysis and techniques from grounded theory can be integrated into the action research cycle in order to add

Richard Baskerville; Jan Pries-Heje

1999-01-01

146

A nationwide analysis of successful litigation claims in neurological practice  

PubMed Central

Objectives Neurological practice has previously been highlighted as a high-risk speciality with regard to malpractice claims. We set out to study the nature of these claims in order to inform physicians about hazardous areas within their speciality and potentially alter clinical practice. Design Nationwide retrospective analysis of successful neurology and neurosurgery claims over a 17-year period. Setting We studied all successful claims occurring between 1995 and 2012 using the NHS Litigation Authority database, which collects data on claims made against clinicians practising in England and Wales. Participants Four hundred and twenty-three successful claims were identified during the study period. Main outcome measures The errors involved, the patient groups affected, the resulting mortality and the litigation payments. Results 63.1% of claims were due to negligence in neurosurgical care, whilst 36.9% were due to negligence in neurological care. Litigation payments were significantly higher in neurosurgery compared to neurology cases. Diagnostic error was the most common cause of litigation. The disease categories with the highest numbers of successful litigation claims were spinal pathology, cerebrovascular disease including subarachnoid haemorrhage, intracranial tumours, hydrocephalus and neuropathy/neuromuscular disease. Conclusions This is the first study of successful litigation claims against the NHS for negligent neurological or neurosurgical care and provides data to help reduce risk and improve patient safety.

Coysh, Thomas; Breen, David P

2014-01-01

147

Adapting community based participatory research (CBPR) methods to the implementation of an asthma shared decision making intervention in ambulatory practices  

PubMed Central

Objective Translating research findings into clinical practice is a major challenge to improve the quality of healthcare delivery. Shared decision making (SDM) has been shown to be effective and has not yet been widely adopted by health providers. This paper describes the participatory approach used to adapt and implement an evidence-based asthma SDM intervention into primary care practices. Methods A participatory research approach was initiated through partnership development between practice staff and researchers. The collaborative team worked together to adapt and implement an SDM toolkit. Using the RE-AIM framework and qualitative analysis, we evaluated both the implementation of the intervention into clinical practice, and the level of partnership that was established. Analysis included the number of adopting clinics and providers, the patients’ perception of the SDM approach, and the number of clinics willing to sustain the intervention delivery after 1 year. Results All six clinics and physician champions implemented the intervention using half-day dedicated asthma clinics while 16% of all providers within the practices have participated in the intervention. Themes from the focus groups included the importance of being part of the development process, belief that the intervention would benefit patients, and concerns around sustainability and productivity. One year after initiation, 100% of clinics have sustained the intervention, and 90% of participating patients reported a shared decision experience. Conclusions Use of a participatory research process was central to the successful implementation of an SDM intervention in multiple practices with diverse patient populations.

Kuhn, Lindsay; Alkhazraji, Thamara; Steuerwald, Mark; Ludden, Tom; Wilson, Sandra; Mowrer, Lauren; Mohanan, Sveta; Dulin, Michael F.

2014-01-01

148

A mixed methods exploration of the team and organizational factors that may predict new graduate nurse engagement in collaborative practice.  

PubMed

Although engagement in collaborative practice is reported to support the role transition and retention of new graduate (NG) nurses, it is not known how to promote collaborative practice among these nurses. This mixed methods study explored the team and organizational factors that may predict NG nurse engagement in collaborative practice. A total of 514 NG nurses from Ontario, Canada completed the Collaborative Practice Assessment Tool. Sixteen NG nurses participated in follow-up interviews. The team and organizational predictors of NG engagement in collaborative practice were as follows: satisfaction with the team (β = 0.278; p = 0.000), number of team strategies (β = 0.338; p = 0.000), participation in a mentorship or preceptorship experience (β = 0.137; p = 0.000), accessibility of manager (β = 0.123; p = 0.001), and accessibility and proximity of educator or professional practice leader (β = 0.126; p = 0.001 and β = 0.121; p = 0.002, respectively). Qualitative analysis revealed the team facilitators to be respect, team support and face-to-face interprofessional interactions. Organizational facilitators included supportive leadership, participation in a preceptorship or mentorship experience and time. Interventions designed to facilitate NG engagement in collaborative practice should consider these factors. PMID:24195680

Pfaff, Kathryn A; Baxter, Pamela E; Ploeg, Jenny; Jack, Susan M

2014-03-01

149

Developing a practical toxicogenomics data analysis system utilizing open-source software.  

PubMed

Comprehensive gene expression analysis has been applied to investigate the molecular mechanism of toxicity, which is generally known as toxicogenomics (TGx). When analyzing large-scale gene expression data obtained by microarray analysis, typical multivariate data analysis methods performed with commercial software such as hierarchical clustering or principal component analysis usually do not provide conclusive outputs by themselves. To best utilize the TGx data for toxicity evaluation in the drug development process, fit-for-purpose customization of the analytical algorithm with a user-friendly interface and intuitive outputs is required to practically address the toxicologists' demands. However, commercial software is usually not very flexible in the customization of its functions or outputs. Owing to the recent advancement and accumulation of open-source software contributed by bioinformaticians all over the world, it becomes easier for us to develop practical and fit-for-purpose analytical software by ourselves with fairly low cost and effort. The aim of this article is to present an example of developing an automated TGx data processing system (ATP system), which implements gene set-level toxicogenomic profiling by the D-score method and generates straightforward output that makes it easy to interpret the biological and toxicological significance of the TGx data. Our example will provide basic clues for readers to develop and customize their own TGx data analysis system which complements the function of existing commercial software. PMID:23086850

Hirai, Takehiro; Kiyosawa, Naoki

2013-01-01

150

Professional Suitability for Social Work Practice: A Factor Analysis  

ERIC Educational Resources Information Center

Objective: The purpose of this study was to identify the underlying dimensions of professional suitability. Method: Data were collected from a province-wide mail-out questionnaire surveying 341 participants from a random sample of registered social workers. Results: The use of an exploratory factor analysis identified a 5-factor solution on…

Tam, Dora M. Y.; Coleman, Heather; Boey, Kam-Wing

2012-01-01

151

How Test Organizations Adopt New Testing Practices and Methods?  

Microsoft Academic Search

Software testing process is an activity, in which the software is verified to comply with the requirements and validated to operate as intended. As software development adopts new development methods, this means also that the test processes need to be changed. In this qualitative study, we observe ten software organizations to understand how organizations develop their test processes and how

Jussi Kasurinen; Ossi Taipale; Kari Smolander

2011-01-01

152

Practical method of diffusion-welding steel plate in air  

NASA Technical Reports Server (NTRS)

Method is ideal for critical service requirements where parent metal properties are equaled in notch toughness, stress rupture and other characteristics. Welding technique variations may be used on a variety of materials, such as carbon steels, alloy steels, stainless steels, ceramics, and reactive and refractory materials.

Holko, K. H.; Moore, T. J.

1971-01-01

153

A practical equivalence method of large scale wind farm  

Microsoft Academic Search

Large scale wind power integration is the main way of wind power development in China. In simulation research, a large scale wind farm is necessary equivalent to be one aggregated model. The existing equivalence methods only consider the external characteristics of wind farm, and ignored the coupling relationship between units and power system. Through a massive simulation at different wind

Xin Tuo; Shen Hong; Bao Hai; Huang Bin; Chen Dezhi; Ding Jian

2010-01-01

154

Methods in Educational Research: From Theory to Practice  

ERIC Educational Resources Information Center

Written for students, educators, and researchers, "Methods in Educational Research" offers a refreshing introduction to the principles of educational research. Designed for the real world of educational research, the book's approach focuses on the types of problems likely to be encountered in professional experiences. Reflecting the importance of…

Lodico, Marguerite G.; Spaulding Dean T.; Voegtle, Katherine H.

2006-01-01

155

Preparing Special Education Teacher Candidates: Extending Case Method to Practice  

ERIC Educational Resources Information Center

Case methodology is receiving more recognition in the field of education as a viable pedagogy for use in the preparation of future educators. In this article, the coauthors explore two examples of case method instruction that extend beyond university classrooms to field sites: case report and case study. Both examples were used in special…

Lengyel, Linda; Vernon-Dotson, Lisa

2010-01-01

156

Theoretical and Practical Aspects of the Vibroseis (Trade Name) Method.  

National Technical Information Service (NTIS)

In the exploration of oil and gas using the seismic method, seismic vibrators are frequently used to generate a source signal. In the thesis, the wavefield emitted by one or more seismic vibrators acting at the surface of the earth is investigated. In the...

G. J. M. Baeten

1989-01-01

157

AUTOETHNOGRAPHY AS A METHOD FOR REFLEXIVE RESEARCH AND PRACTICE IN VOCATIONAL PSYCHOLOGY  

Microsoft Academic Search

This paper overviews the qualitative research method autoethnography and its relevance to research in vocational psychology and practice in career development. Autoethnography is a reflexive means by which the researcher-practitioner consciously embeds himself or herself amidst theory and practice, and by way of intimate autobiographic account, explicates a phenomenon under investigation or intervention. Autoethnography is presented as a vehicle to

PETER MCILVEEN

158

Perceived Barriers and Facilitators to School Social Work Practice: A Mixed-Methods Study  

ERIC Educational Resources Information Center

Understanding barriers to practice is a growing area within school social work research. Using a convenience sample of 284 school social workers, this study replicates the efforts of a mixed-method investigation designed to identify barriers and facilitators to school social work practice within different geographic locations. Time constraints and…

Teasley, Martell; Canifield, James P.; Archuleta, Adrian J.; Crutchfield, Jandel; Chavis, Annie McCullough

2012-01-01

159

Polydispersity analysis of taylor dispersion data: the cumulant method.  

PubMed

Taylor dispersion analysis is an increasingly popular characterization method that measures the diffusion coefficient, and hence the hydrodynamic radius, of (bio)polymers, nanoparticles, or even small molecules. In this work, we describe an extension to current data analysis schemes that allows size polydispersity to be quantified for an arbitrary sample, thereby significantly enhancing the potential of Taylor dispersion analysis. The method is based on a cumulant development similar to that used for the analysis of dynamic light scattering data. Specific challenges posed by the cumulant analysis of Taylor dispersion data are discussed, and practical ways to address them are proposed. We successfully test this new method by analyzing both simulated and experimental data for solutions of moderately polydisperse polymers and polymer mixtures. PMID:24937011
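The abstract says the approach mirrors the cumulant analysis of dynamic light scattering (DLS) data; a minimal sketch of that DLS-style cumulant fit (not the authors' exact Taylor-dispersion scheme) fits a quadratic to the log of the correlation function and reads off the mean decay rate and polydispersity index:

```python
import numpy as np

def cumulant_fit(tau, g1):
    """Fit ln g1(tau) = -Gamma*tau + (mu2/2)*tau**2 and return the mean
    decay rate Gamma and the polydispersity index PDI = mu2 / Gamma**2."""
    mu2_half, neg_gamma, _ = np.polyfit(tau, np.log(g1), 2)
    gamma = -neg_gamma
    return gamma, 2.0 * mu2_half / gamma**2

# Synthetic correlation data with Gamma = 100 s^-1 and mu2 = 500 s^-2
tau = np.linspace(1e-4, 5e-3, 50)
g1 = np.exp(-100.0 * tau + 0.5 * 500.0 * tau**2)
gamma, pdi = cumulant_fit(tau, g1)   # expect Gamma ~ 100, PDI ~ 0.05
```

In the Taylor-dispersion variant, the analogous expansion is applied to the taylorgram rather than to a light scattering correlation function.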

Cipelletti, Luca; Biron, Jean-Philippe; Martin, Michel; Cottet, Hervé

2014-07-01

160

An Effective and Practical Method for Solving Hydro-Thermal Unit Commitment Problems Based on Lagrangian Relaxation Method  

NASA Astrophysics Data System (ADS)

This paper presents an effective and practical method based on the Lagrangian relaxation method (LRM) for solving the hydro-thermal unit commitment problem, in which operational constraints involve spinning reserve requirements for thermal units and prohibition of simultaneous unit start-up/shut-down at the same plant. The method is applied within each iteration step of the LRM, enabling a direct solution. To improve convergence, the method applies an augmented Lagrangian relaxation method. Its effectiveness is demonstrated for a real power system.
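As an illustration of the subgradient multiplier update at the core of Lagrangian relaxation, the sketch below solves a toy single-period commitment problem (not the paper's hydro-thermal formulation, and without its reserve or start-up constraints): the demand-balance constraint is relaxed with a multiplier, each unit then solves its own trivial subproblem, and the multiplier is adjusted by diminishing subgradient steps.

```python
def subgradient_uc(costs, capacities, demand, iters=200, step0=10.0):
    """Toy one-period Lagrangian relaxation: relax the demand-balance
    constraint with multiplier lam and update lam by subgradient steps."""
    lam = 0.0
    commit = [False] * len(costs)
    for k in range(1, iters + 1):
        # Decomposed subproblem: each unit commits if the price signal
        # lam covers its commitment cost at full output
        commit = [lam * cap > c for c, cap in zip(costs, capacities)]
        supplied = sum(cap for on, cap in zip(commit, capacities) if on)
        g = demand - supplied                    # subgradient of the dual
        lam = max(0.0, lam + (step0 / k) * g)    # diminishing step size
    return lam, commit

# Three 50-MW units with commitment costs 100, 200, 900; demand 100 MW
lam, commit = subgradient_uc([100, 200, 900], [50, 50, 50], demand=100)
```

The two cheap units end up committed and the expensive one stays off, with the multiplier settling at a price that exactly balances demand.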

Sakurai, Takayoshi; Kusano, Takashi; Saito, Yutaka; Hirato, Kota; Kato, Masakazu; Murai, Masahiko; Nagata, Junichi

161

[Disposal of drugs: an analysis of the practices in the family health program].  

PubMed

The scope of this article is to discuss the perception of health workers in relation to the disposal of drugs and analyze how this practice occurs in family health units in a city in the state of Bahia. It involved a qualitative and exploratory study together with nurses, nursing assistants, community health workers and pharmacists of Pharmaceutical Care and Health Surveillance. Semi-structured interviews were conducted with systematic observation and use of previously-drafted scripts and the content analysis method was used for data analysis. The results showed poor understanding regarding proper disposal among the workers, dissonant practices in the implementation of the regulations and a lack of communication between health surveillance and other health services. The creation of effective strategies must involve the whole process from management to the prescription and use of drugs and requires further political, economic and social participation. PMID:25014295

Alencar, Tatiane de Oliveira Silva; Machado, Carla Silva Rocha; Costa, Sônia Carine Cova; Alencar, Bruno Rodrigues

2014-07-01

162

[Methods for detection of biofilm formation in routine microbiological practice].  

PubMed

The increasing use of catheters, artificial implants and antimicrobials as well as high numbers of immunocompromised patients are major causes for concern over biofilm infections. These infections are characterized particularly by high resistance to antimicrobials and formation of persistent foci that may complicate therapy. Therefore, detection of biofilm formation is of high relevance to the clinician and his/her approach to the treatment. Reliable and sensitive methods for detection of this pathogenicity factor in clinically important organisms, suitable for use in routine microbiological laboratories, are needed for this purpose. Currently, a wide array of techniques are available for detection of this virulence factor, such as biofilm visualization by microscopy, culture detection, detection of particular components, detection of physical and chemical differences between biofilm-positive organisms and their planktonic forms and detection of genes responsible for biofilm formation. Since each of these methods has limitations, the best results can be achieved by combining different approaches. PMID:16528896

Růžička, F; Holá, V; Votava, M

2006-02-01

163

[Use of the method of immunothermistography in obstetrical practice].  

PubMed

This paper presents results of testing pregnant women for staphylococcal antigen sensitization using immunothermistography (ITG). The ITG is based on registration of environmental heat conduction change with a microthermistor resistor during the antigen-antibody reaction. The study group comprised 75 pregnant women immunized by staphylococcal anaphylotoxin and nonimmunized pregnant and nonpregnant women. Blood alpha-antistaphylolysin levels showed a close direct correlation with ITG findings. Combined use of these methods identified a population of pregnant women requiring immunization. PMID:2204287

Pali?, G K; Berezovskaia, S B

1990-05-01

164

Learning by Doing: Practical Courses in Lightweight Formal Methods using VDM++.  

National Technical Information Service (NTIS)

We describe the design and delivery of two courses that aim to develop skills of use to students in their subsequent professional practice, whether or not they are directly applying formal methods. Both curricula take a 'lightweight' approach, emphasising...

J. S. Fitzgerald; P. G. Larsen; S. Riddle

2006-01-01

165

Assessment of Certain European Dredging Practices and Dredged Material Containment and Reclamation Methods.  

National Technical Information Service (NTIS)

A study was made of dredging practices, reclamation methods, and environmental effects of dredging in western Europe by visiting more than twenty ports in six countries and discussing pertinent matters with knowledgeable authorities at each port. A remark...

K. d'Angremond; J. Brakel; A. J. Hoekstra; W. C. H. Kleinbloesem; L. Nederlof

1978-01-01

166

Improving educational environment in medical colleges through transactional analysis practice of teachers  

PubMed Central

Context: A FAIMER (Foundation for Advancement in International Medical Education and Research) fellow organized a comprehensive faculty development program to improve faculty awareness, resulting in changed teaching practices and better teacher-student relationships, using Transactional Analysis (TA). Practicing TA tools helps develop awareness of intrapersonal and interpersonal processes. Objectives: To improve self-awareness among medical educators. To bring about self-directed change in practices among medical educators. To assess the usefulness of TA tools for the same. Methods: An experienced trainer conducted a basic course (12 hours) in TA for faculty members. The PAC model of personality structure, the functional fluency model of personal functioning, stroke theory on motivation, and the passivity and script theories of adult functional styles were taught experientially, with examples from the medical education scenario. Self-reported improvement in awareness and changes in practices were assessed immediately after training, at three months, and at one year. Findings: The mean improvement in self-awareness was 13.3% (95% CI 9.3-17.2) among nineteen participants, and it persisted one year after training. Changes in practices within a year included collecting feedback, new teaching styles, and better relationships with students. Discussion and Conclusions: These findings demonstrate sustainable and measurable improvement in self-awareness through practice of TA tools. Improvement in the self-awareness of faculty resulted in self-directed changes in teaching practices. Medical faculty judged the TA tools effective for improving self-awareness leading to self-directed change.

Rajan, Marina

2012-01-01

167

Practical Methods for Locating Abandoned Wells in Populated Areas  

SciTech Connect

An estimated 12 million wells have been drilled during the 150 years of oil and gas production in the United States. Many old oil and gas fields are now populated areas where the presence of improperly plugged wells may constitute a hazard to residents. Natural gas emissions from wells have forced people from their houses and businesses and have caused explosions that injured or killed people and destroyed property. To mitigate this hazard, wells must be located and properly plugged, a task made more difficult by the presence of houses, businesses, and associated utilities. This paper describes well finding methods conducted by the National Energy Technology Laboratory (NETL) that were effective at two small towns in Wyoming and in a suburb of Pittsburgh, Pennsylvania.

Veloski, G.A.; Hammack, R.W.; Lynn, R.J.

2007-09-01

168

A practical implicit finite-difference method: examples from seismic modelling  

NASA Astrophysics Data System (ADS)

We derive explicit and new implicit finite-difference formulae for derivatives of arbitrary order with any order of accuracy by the plane wave theory where the finite-difference coefficients are obtained from the Taylor series expansion. The implicit finite-difference formulae are derived from fractional expansion of derivatives which form tridiagonal matrix equations. Our results demonstrate that the accuracy of a (2N + 2)th-order implicit formula is nearly equivalent to that of a (6N + 2)th-order explicit formula for the first-order derivative, and (2N + 2)th-order implicit formula is nearly equivalent to (4N + 2)th-order explicit formula for the second-order derivative. In general, an implicit method is computationally more expensive than an explicit method, due to the requirement of solving large matrix equations. However, the new implicit method only involves solving tridiagonal matrix equations, which is fairly inexpensive. Furthermore, taking advantage of the fact that many repeated calculations of derivatives are performed by the same difference formula, several parts can be precomputed resulting in a fast algorithm. We further demonstrate that a (2N + 2)th-order implicit formulation requires nearly the same memory and computation as a (2N + 4)th-order explicit formulation but attains the accuracy achieved by a (6N + 2)th-order explicit formulation for the first-order derivative and that of a (4N + 2)th-order explicit method for the second-order derivative when additional cost of visiting arrays is not considered. This means that a high-order explicit method may be replaced by an implicit method of the same order resulting in a much improved performance. Our analysis of efficiency and numerical modelling results for acoustic and elastic wave propagation validates the effectiveness and practicality of the implicit finite-difference method.
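A minimal sketch of the tridiagonal mechanics the abstract describes, assuming the classic 4th-order compact (Padé) scheme for the first derivative and a simple second-order one-sided closure at the two boundary points (choices made here for brevity, not necessarily the paper's formulae):

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system; a: sub-, b: main-, c: super-diagonal."""
    n = len(d)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def compact_derivative(f, h):
    """4th-order compact scheme:
    (1/4) f'_{i-1} + f'_i + (1/4) f'_{i+1} = 3 (f_{i+1} - f_{i-1}) / (4h)."""
    n = len(f)
    a = np.full(n, 0.25); b = np.ones(n); c = np.full(n, 0.25)
    d = np.zeros(n)
    d[1:-1] = 3.0 * (f[2:] - f[:-2]) / (4.0 * h)
    # Decoupled 2nd-order one-sided closures at the two boundary rows
    a[0] = c[0] = a[-1] = c[-1] = 0.0
    d[0] = (-3 * f[0] + 4 * f[1] - f[2]) / (2 * h)
    d[-1] = (3 * f[-1] - 4 * f[-2] + f[-3]) / (2 * h)
    return thomas(a, b, c, d)

x = np.linspace(0, 2 * np.pi, 201)
df = compact_derivative(np.sin(x), x[1] - x[0])   # should approximate cos(x)
```

The tridiagonal solve costs O(n), which is why, as the abstract argues, the implicit scheme's extra work over an explicit stencil is modest.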

Liu, Yang; Sen, Mrinal K.

2009-09-01

169

Measuring solar reflectance - Part II: Review of practical methods  

SciTech Connect

A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R_g,0 can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R_g,0 to within 0.01 for surface slopes up to 5:12 [23°], and to within 0.02 for surface slopes up to 12:12 [45°]. An alternative pyranometer method minimizes shadow errors and can be used to measure R_g,0 of a surface as small as 1 m in diameter. The accuracy with which it can measure R_g,0 is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R*_g,0, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R*_g,0 matches R_g,0 to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R*_g,0 by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R*_g,0 to within about 0.01.
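The R*_g,0 quantity described is an irradiance-weighted spectral average of reflectance. The sketch below uses a synthetic placeholder spectrum on a uniform wavelength grid; a real calculation would use tabulated AM1GH solar spectral irradiance data.

```python
import numpy as np

def solar_weighted_reflectance(reflectance, irradiance):
    """Average spectral reflectance weighted by spectral irradiance
    (uniform wavelength grid, so the grid spacing cancels)."""
    return float(np.sum(reflectance * irradiance) / np.sum(irradiance))

wl = np.linspace(300, 2500, 500)               # wavelength grid, nm
E = np.exp(-((wl - 800.0) / 600.0) ** 2)       # placeholder spectrum shape
R = np.where(wl < 700, 0.1, 0.9)               # low visible, high NIR reflectance
rstar = solar_weighted_reflectance(R, E)
```

A surface like this one, dark in the visible but reflective in the near infrared, illustrates why the weighting spectrum matters: the single number reported depends strongly on how much of the irradiance falls in each band.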

Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul [Heat Island Group, Environmental Energy Technologies Division, Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States)

2010-09-15

170

Designing for scientific data analysis: From practice to prototype  

SciTech Connect

Designers charged with creating tools for processes foreign to their own experience need a reliable source of application knowledge. This dissertation presents an empirical study of the scientific data analysis process in order to inform the design of tools for this important aspect of scientific computing. Interaction analysis and contextual inquiry methods were adapted to observe scientists analyzing their own data and to characterize the scientific data analysis process. The characterization exposed elements of the process outside the conventional scientific visualization model that defines data analysis in terms of image generation. Scientists queried for quantitative information, made a variety of comparisons, applied mathematics, managed data, and kept records. Many such elements were only indirectly supported by computer. A detailed description of the scientific data analysis process was developed to provide a broad-based foundation of understanding which is rooted in empirical fact, reasonably comprehensive, and applicable to a range of scientific environments. The characterization of scientific data analysis led to design recommendations for improving the support of this process. The application of the results was demonstrated with the design, development, and study of a prototype tool for an inadequately supported scientific data analysis element. Data culling is the identification and extraction of areas of interest in large scientific data sets. Modern workstation-based analysis tools require manageable subsets of data, but data culling is not well supported. A prototype tool was designed and developed to explore a quantitative rather than image-based approach to identifying such subsets. Physicist end-users participated throughout the design, development, and evaluation process. The results of evaluations in the field established conditions under which a number-based approach to data selection effectively supplements an image-based approach.

Springmeyer, R.R.

1992-09-01

171

Error analysis in some recent versions of the Fry Method  

NASA Astrophysics Data System (ADS)

The Fry method is a graphical technique that directly displays the strain ellipse as a central vacancy in a point distribution, the Fry plot. For accurate strain estimation from the Fry plot, the central vacancy must appear as a sharply focused perfect ellipse. The diffuse nature of the central vacancy, common in practice, introduces subjectivity into direct strain estimation from the Fry plot. Several alternative methods, based on point density contrast, image analysis, Delaunay triangulation, or point distribution analysis, exist for objective strain estimation from Fry plots. The relative merits and limitations of these methods are, however, not yet well explored and understood. This study compares the accuracy and efficacy of six methods proposed for objective determination of strain from Fry plots. Our approach consists of: (i) graphical simulation of variously sorted object sets, (ii) distortion of different object sets by known strain in pure shear, simple shear and simultaneous pure-and-simple shear deformations and, (iii) error analysis and comparison of the six methods. Results from more than 1000 tests reveal that the Delaunay triangulation method, the point density contrast methods and the image analysis method are relatively more accurate and versatile. The amount and nature of distortion, or the degree of sorting, have little effect on the accuracy of results in these methods. The point distribution analysis methods are successful provided the pre-deformed objects were well sorted and defined by specific types of point distribution. Both the Delaunay triangulation method and the image analysis method are more time efficient than the point distribution analysis methods. The time efficiency of the density contrast methods lies between these two extremes.
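Constructing the underlying Fry plot itself is simple: translate the point set so that each point in turn sits at the origin and overlay all the translated copies; the central vacancy that emerges mirrors the strain ellipse. A sketch:

```python
import numpy as np

def fry_plot(points):
    """points: (n, 2) array of object centres; returns the (n*(n-1), 2)
    array of all pairwise difference vectors (the Fry plot points)."""
    diffs = points[:, None, :] - points[None, :, :]   # all pairwise translations
    mask = ~np.eye(len(points), dtype=bool)           # drop self-pairs at the origin
    return diffs[mask]

rng = np.random.default_rng(0)
pts = rng.uniform(0, 10, size=(200, 2))               # synthetic object centres
plot = fry_plot(pts)
```

The objective methods the study compares all start from this same point cloud and differ only in how they locate the edge of the central vacancy.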

Srivastava, D. C.; Kumar, R.

2013-12-01

172

Four hour ambulation after angioplasty is a safe practice method  

PubMed Central

BACKGROUND: During the last three decades there has been an increasing tendency towards angioplasty because of its benefits. However, the procedure has acute complications, such as bleeding and hematoma formation at the site where the arterial sheath is removed. Based on the researchers' clinical experience, patients are kept on bed rest for 8-12 hours after coronary angioplasty. Determining the optimal bed-rest time after angioplasty and sheath removal has been the focus of related research worldwide. Earlier ambulation after angioplasty brings greater patient comfort, shorter hospitalization, fewer side effects of prolonged bed rest, and lower costs. Given the trend toward shorter bed rest after angioplasty, the aim of this study was to assess the effect of ambulation time after angioplasty on complications after sheath removal in coronary angioplasty patients. METHODS: This was an experimental clinical study with two groups. The sample comprised 124 angioplasty patients (62 in each group) chosen randomly from the CCU of Shahid Chamran hospital of the Isfahan University of Medical Sciences in 2007. Data were gathered by observing and evaluating the patients, using a questionnaire and a checklist. After angioplasty, patients in the intervention group were ambulated at 4 hours and patients in the control group at 8 hours. After ambulation, patients were examined for bleeding and hematoma formation at the arterial sheath removal site. Data were analyzed using descriptive and inferential statistics via SPSS software. RESULTS: Results showed no significant difference between the two groups after ambulation (p > 0.05) regarding the relative frequency of bleeding (p = 0.50), hematoma formation (p = 0.34) and average hematoma diameter (p = 0.39).
CONCLUSIONS: The results showed that reducing bed rest to 4 hours after removal of a size 7 arterial sheath does not increase bleeding or hematoma formation at the removal site. Angioplasty patients who are not in critical clinical condition and whose vital signs are stable can therefore be ambulated 4 hours after sheath removal.

Moeini, Mahin; Moradpour, Fatemeh; Babaei, Sima; Rafieian, Mohsen; Khosravi, Alireza

2010-01-01

173

Methods in quantitative image analysis.  

PubMed

The main steps of image analysis are image capturing, image storage (compression), correcting imaging defects (e.g. non-uniform illumination, electronic noise, glare effect), image enhancement, segmentation of objects in the image and image measurements. Digitisation is made by a camera. The most modern types include a frame-grabber, converting the analog signal into digital (numerical) information. The numerical information consists of the grey values describing the brightness of every point within the image, named a pixel. The information is stored in bits. Eight bits are summarised in one byte. Therefore, grey values can take one of 256 (2^8) levels, between 0 and 255. The human eye seems to be quite content with a display of 6-bit images (corresponding to 64 different grey values). In a digitised image, the pixel grey values can vary within regions that are uniform in the original scene: the image is noisy. The noise is mainly manifested in the background of the image. For an optimal discrimination between different objects or features in an image, uniformity of illumination in the whole image is required. These defects can be minimised by shading correction [subtraction of a background (white) image from the original image, pixel per pixel, or division of the original image by the background image]. The brightness of an image represented by its grey values can be analysed for every single pixel or for a group of pixels. The most frequently used pixel-based image descriptors are optical density, integrated optical density, the histogram of the grey values, mean grey value and entropy. The distribution of the grey values existing within an image is one of the most important characteristics of the image. However, the histogram gives no information about the texture of the image. The simplest way to improve the contrast of an image is to expand the brightness scale by spreading the histogram out to the full available range.
Rules for transforming the grey value histogram of an existing image (input image) into a new grey value histogram (output image) are most quickly handled by a look-up table (LUT). The histogram of an image can be influenced by gain, offset and gamma of the camera. Gain defines the voltage range, offset defines the reference voltage and gamma the slope of the regression line between the light intensity and the voltage of the camera. A very important descriptor of neighbourhood relations in an image is the co-occurrence matrix. The distance between the pixels (original pixel and its neighbouring pixel) can influence the various parameters calculated from the co-occurrence matrix. The main goals of image enhancement are elimination of surface roughness in an image (smoothing), correction of defects (e.g. noise), extraction of edges, identification of points, strengthening texture elements and improving contrast. In enhancement, two types of operations can be distinguished: pixel-based (point operations) and neighbourhood-based (matrix operations). The most important pixel-based operations are linear stretching of grey values, application of pre-stored LUTs and histogram equalisation. The neighbourhood-based operations work with so-called filters. These are organising elements with an original or initial point in their centre. Filters can be used to accentuate or to suppress specific structures within the image. Filters can work either in the spatial or in the frequency domain. The method used for analysing alterations of grey value intensities in the frequency domain is the Hartley transform. Filter operations in the spatial domain can be based on averaging or ranking the grey values occurring in the organising element. The most important filters, which are usually applied, are the Gaussian filter and the Laplace filter (both averaging filters), and the median filter, the top hat filter and the range operator (all ranking filters). 
Segmentation of objects is traditionally based on threshold grey values. PMID:8781988
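The linear grey-value stretching and LUT mechanism described above can be sketched as follows; this is a simplified 8-bit implementation in which the occupied grey-level range is spread to the full 0-255 scale via a pre-computed look-up table:

```python
import numpy as np

def stretch_contrast(img):
    """Linear contrast stretch of an 8-bit image applied through a LUT."""
    lo, hi = int(img.min()), int(img.max())
    if hi == lo:
        return np.zeros_like(img)   # flat image: nothing to stretch
    # Build a look-up table mapping each input grey value to its stretched value
    levels = np.arange(256, dtype=np.float64)
    lut = np.clip(np.round((levels - lo) * 255.0 / (hi - lo)), 0, 255).astype(np.uint8)
    return lut[img]                 # apply the LUT to every pixel at once

img = np.array([[50, 60], [70, 100]], dtype=np.uint8)
stretched = stretch_contrast(img)   # 50 -> 0, 100 -> 255
```

Routing the point operation through a LUT, rather than recomputing it per pixel, is exactly the efficiency argument the text makes for look-up tables.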

Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M

1996-05-01

174

Mass spectrometric analysis of neutral sphingolipids: methods, applications, and limitations.  

PubMed

Sphingolipids represent an important class among lipids, especially when considering their vital roles in lipid metabolism. Thus, a variety of methods have been created to accomplish their analysis and the term "sphingolipidomics" has recently been coined to underline the motivation to enable a comprehensive analysis of all sphingolipid species including the acidic and the neutral ones. In this review, we summarize selected mainly biomedical based mass spectrometric approaches for the analysis of neutral sphingolipids regarding their advantages, applications and limitations. To underline some practical aspects of method development, we focus on a new method recently developed in our laboratory, which enables separation, detection, and mass spectrometric profiling of ceramide, hexosylceramide, lactosylceramide, globotriaosylceramide, globotetraosylceramide, sphingomyelin species, and cholesterol in one run. This method can be applied to investigate impairments of neutral sphingolipid metabolism in a variety of disorders such as sphingolipidoses and be employed to screen for sphingolipid profile changes as induced by knockout experiments or related studies. PMID:21740983

Farwanah, Hany; Kolter, Thomas; Sandhoff, Konrad

2011-11-01

175

Method of analysis and quality-assurance practices by the U.S. Geological Survey Organic Geochemistry Research Group; determination of geosmin and methylisoborneol in water using solid-phase microextraction and gas chromatography/mass spectrometry  

USGS Publications Warehouse

A method for the determination of two common odor-causing compounds in water, geosmin and 2-methylisoborneol, was modified and verified by the U.S. Geological Survey's Organic Geochemistry Research Group in Lawrence, Kansas. The optimized method involves the extraction of odor-causing compounds from filtered water samples using a divinylbenzene-carboxen-polydimethylsiloxane cross-link coated solid-phase microextraction (SPME) fiber. Detection of the compounds is accomplished using capillary-column gas chromatography/mass spectrometry (GC/MS). Precision and accuracy were demonstrated using reagent-water, surface-water, and ground-water samples. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 35 nanograms per liter ranged from 60 to 123 percent for geosmin and from 90 to 96 percent for 2-methylisoborneol. Method detection limits were 1.9 nanograms per liter for geosmin and 2.0 nanograms per liter for 2-methylisoborneol in 45-milliliter samples. Typically, concentrations of 30 and 10 nanograms per liter of geosmin and 2-methylisoborneol, respectively, can be detected by the general public. The calibration range for the method is equivalent to concentrations from 5 to 100 nanograms per liter without dilution. The method is valuable for acquiring information about the production and fate of these odor-causing compounds in water.

Zimmerman, L. R.; Ziegler, A. C.; Thurman, E. M.

2002-01-01

176

Organizational climate and hospital nurses' caring practices: a mixed-methods study.  

PubMed

Organizational climate in healthcare settings influences patient outcomes, but its effect on nursing care delivery remains poorly understood. In this mixed-methods study, nurse surveys (N = 292) were combined with a qualitative case study of 15 direct-care registered nurses (RNs), nursing personnel, and managers. Organizational climate explained 11% of the variation in RNs' reported frequency of caring practices. Qualitative data suggested that caring practices were affected by the interplay of organizational climate dimensions with patient and nurse characteristics. Workload intensity and role ambiguity led RNs to leave many caring practices to practical nurses and assistive personnel. Systemic interventions are needed to improve organizational climate and to support RNs' involvement in a full range of caring practices. PMID:24729389

Roch, Geneviève; Dubois, Carl-Ardy; Clarke, Sean P

2014-06-01

177

Activation analysis methods and applications  

Microsoft Academic Search

Rib bones of Brazilian people were analyzed by neutron activation analysis to evaluate element composition. Freeze-dried cortical and trabecular tissues, separately, and calcinated total rib tissues were analyzed. The concentrations of the Ba, Br, Ca, Cl, Fe, K, Mg, Mn, Na, P, Rb, Sr, and Zn elements were determined. Comparisons between the results obtained in cortical and trabecular bones indicated

M. K. Takata; M. Saiki; N. M. Sumita; P. H. N. Saldiva; C. A. Pasqualucci

2005-01-01

178

Analysis of Tidal Records. Absolute Filtering Method.  

National Technical Information Service (NTIS)

A new method for the analysis of tidal waves was developed. Owing to modern technology, continuous and automatic tidal records are available at all times. The filtering method proposed permits eliminating tidal components one by one, in an ...

J. Mateo

1974-01-01

179

A practical method of estimating standard error of age in the fission track dating method  

USGS Publications Warehouse

A first-order approximation formula for the propagation of error in the fission track age equation is given by P_A = C[P_s^2 + P_i^2 + P_φ^2 - 2rP_sP_i]^(1/2), where P_A, P_s, P_i, and P_φ are the percentage errors of age, of spontaneous track density, of induced track density, and of neutron dose, respectively, and C is a constant. The correlation, r, between spontaneous and induced track densities is a crucial element in the error analysis, acting generally to improve the standard error of age. In addition, the correlation parameter r is instrumental in specifying the level of neutron dose, a controlled variable, which will minimize the standard error of age. The results from the approximation equation agree closely with the results from an independent statistical model for the propagation of errors in the fission-track dating method. © 1979.
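The propagation formula quoted in the abstract can be evaluated numerically. The sketch below is illustrative only: the function name, the sample percentage errors, and the value of the constant C (set to 1 here) are assumptions, not values from the paper.

```python
import math

def age_percent_error(p_s, p_i, p_phi, r, c=1.0):
    """First-order propagation of percentage errors in the fission-track
    age equation, as given in the abstract:
        P_A = C * [P_s^2 + P_i^2 + P_phi^2 - 2*r*P_s*P_i]^(1/2)
    p_s, p_i, p_phi: percentage errors of spontaneous track density,
    induced track density, and neutron dose; r: correlation between
    spontaneous and induced densities; c: proportionality constant."""
    return c * math.sqrt(p_s**2 + p_i**2 + p_phi**2 - 2.0 * r * p_s * p_i)

# Positive correlation between spontaneous and induced track densities
# reduces the combined error, as the abstract notes:
uncorrelated = age_percent_error(5.0, 5.0, 2.0, r=0.0)
correlated = age_percent_error(5.0, 5.0, 2.0, r=0.5)
```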

Johnson, N. M.; McGee, V. E.; Naeser, C. W.

1979-01-01

180

Current status of methods for shielding analysis  

SciTech Connect

Current methods used in shielding analysis and recent improvements in those methods are discussed. The status of methods development is discussed based on needs cited at the 1977 International Conference on Reactor Shielding. Additional areas where methods development is needed are discussed.

Engle, W.W.

1980-01-01

181

Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture  

DOEpatents

Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

West, Phillip B. (Idaho Falls, ID); Novascone, Stephen R. (Idaho Falls, ID); Wright, Jerry P. (Idaho Falls, ID)

2011-09-27

182

Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture  

DOEpatents

Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

West, Phillip B. (Idaho Falls, ID); Novascone, Stephen R. (Idaho Falls, ID); Wright, Jerry P. (Idaho Falls, ID)

2012-05-29

183

Methods in quantitative image analysis  

Microsoft Academic Search

The main steps of image analysis are image capturing, image storage (compression), correcting imaging defects (e.g. non-uniform illumination, electronic noise, glare effect), image enhancement, segmentation of objects in the image and image measurements. Digitisation is performed by a camera. The most modern types include a frame-grabber, converting the analog signal into digital (numerical) information. The numerical information consists of the

Martin Oberholzer; Marc Östreicher; Heinz Christen; Marcel Brühlmann

1996-01-01

184

Qualitative Analysis of Common Definitions for Core Advanced Pharmacy Practice Experiences  

PubMed Central

Objective. To determine how colleges and schools of pharmacy interpreted the Accreditation Council for Pharmacy Education’s (ACPE’s) Standards 2007 definitions for core advanced pharmacy practice experiences (APPEs), and how they differentiated community and institutional practice activities for introductory pharmacy practice experiences (IPPEs) and APPEs. Methods. A cross-sectional, qualitative, thematic analysis was done of survey data obtained from experiential education directors in US colleges and schools of pharmacy. Open-ended responses to invited descriptions of the 4 core APPEs were analyzed using grounded theory to determine common themes. Type of college or school of pharmacy (private vs public) and size of program were compared. Results. Seventy-one schools (72%) with active APPE programs at the time of the survey responded. Lack of strong frequent themes describing specific activities for the acute care/general medicine core APPE indicated that most respondents agreed on the setting (hospital or inpatient) but the student experience remained highly variable. Themes were relatively consistent between public and private institutions, but there were differences across programs of varying size. Conclusion. Inconsistencies existed in how colleges and schools of pharmacy defined the core APPEs as required by ACPE. More specific descriptions of core APPEs would help to standardize the core practice experiences across institutions and provide an opportunity for quality benchmarking.

Danielson, Jennifer; Weber, Stanley S.

2014-01-01

185

Qualitative analysis of common definitions for core advanced pharmacy practice experiences.  

PubMed

Objective. To determine how colleges and schools of pharmacy interpreted the Accreditation Council for Pharmacy Education's (ACPE's) Standards 2007 definitions for core advanced pharmacy practice experiences (APPEs), and how they differentiated community and institutional practice activities for introductory pharmacy practice experiences (IPPEs) and APPEs. Methods. A cross-sectional, qualitative, thematic analysis was done of survey data obtained from experiential education directors in US colleges and schools of pharmacy. Open-ended responses to invited descriptions of the 4 core APPEs were analyzed using grounded theory to determine common themes. Type of college or school of pharmacy (private vs public) and size of program were compared. Results. Seventy-one schools (72%) with active APPE programs at the time of the survey responded. Lack of strong frequent themes describing specific activities for the acute care/general medicine core APPE indicated that most respondents agreed on the setting (hospital or inpatient) but the student experience remained highly variable. Themes were relatively consistent between public and private institutions, but there were differences across programs of varying size. Conclusion. Inconsistencies existed in how colleges and schools of pharmacy defined the core APPEs as required by ACPE. More specific descriptions of core APPEs would help to standardize the core practice experiences across institutions and provide an opportunity for quality benchmarking. PMID:24954931

O'Sullivan, Teresa A; Danielson, Jennifer; Weber, Stanley S

2014-06-17

186

Multicultural Issues in School Psychology Practice: A Critical Analysis  

ERIC Educational Resources Information Center

Once thought of largely as a sideline issue, multiculturalism is fast becoming a major topic on the central stage of psychology and practice. That cultural factors permeate the whole of psychological foundations and influence the manner in which the very scope of practice is shaped is undeniable. The rapidly changing face of the U.S. population…

Ortiz, Samuel O.

2006-01-01

187

Researching "Practiced Language Policies": Insights from Conversation Analysis  

ERIC Educational Resources Information Center

In language policy research, "policy" has traditionally been conceptualised as a notion separate from that of "practice". In fact, language practices were usually analysed with a view to evaluating whether a policy is being implemented or resisted. Recently, however, Spolsky in ("Language policy". Cambridge University Press, Cambridge, 2004;…

Bonacina-Pugh, Florence

2012-01-01

188

TECHNICAL DESIGN NOTE: Practical application of an analytical method for calculating a coverage interval  

NASA Astrophysics Data System (ADS)

This article presents a practical application of an analytical method for the calculation of the measurement uncertainty. The proposed method enables the determination of uncertainty in accordance with the new probabilistic definition of the coverage interval for a measurand. The proposed method ensures that the expanded uncertainty is calculated with the recommended number of significant digits at the recommended coverage probability. The method was used for the uncertainty evaluation of measurement of small outer diameters with a laser scanning instrument.
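The probabilistically defined coverage interval mentioned in this abstract is commonly taken (as in GUM Supplement 1) to be the shortest interval containing a given fraction of the measurand's distribution. The sketch below illustrates that definition on a Monte Carlo sample; it is a generic illustration under that assumption, not the author's specific analytical procedure, and the Gaussian sample parameters are invented.

```python
import random

def shortest_coverage_interval(samples, p=0.95):
    """Shortest interval containing a fraction p of the sampled
    measurand values (one common probabilistic definition of a
    coverage interval)."""
    xs = sorted(samples)
    n = len(xs)
    k = int(round(p * n))  # number of points the interval must cover
    # Slide a window of k consecutive order statistics and keep the narrowest.
    best = min(range(n - k + 1), key=lambda i: xs[i + k - 1] - xs[i])
    return xs[best], xs[best + k - 1]

random.seed(0)
sample = [random.gauss(10.0, 0.2) for _ in range(10000)]
lo, hi = shortest_coverage_interval(sample, 0.95)
```

For a Gaussian measurand this recovers approximately the familiar mean ± 1.96 standard deviations.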

Fotowicz, Paweł

2010-08-01

189

Best practices: applying management analysis of excellence to immunization.  

PubMed

The authors applied business management tools to analyze and promote excellence and to evaluate differences between average and above-average immunization performers in private practices. They conducted a pilot study of 10 private practices in Pennsylvania, using tools common in management to assess the practices' organizational climate and managerial style. Authoritative and coaching styles of physician leaders were common to both groups. Managerial styles emphasizing higher levels of clarity and responsibility were evident in the large practices, and rewards and flexibility styles were higher in the small above-average practices. The findings of this pilot study match results seen in high performers in other industries. The study concludes that the authoritative style appears to have the most impact on performance, with interesting implications for training and behavior change to improve immunization rates, alongside traditional medical interventions. PMID:15921143

Wishner, Amy; Aronson, Jerold; Kohrt, Alan; Norton, Gary

2005-01-01

190

Scholarship and practice: the contribution of ethnographic research methods to bridging the gap  

Microsoft Academic Search

Information systems research methods need to contribute to the scholarly requirements of the field of knowledge but also need to develop the potential to contribute to the practical requirements of practitioners' knowledge. This leads to possible conflicts in choosing research methods. Argues that the changing world of the IS practitioner is reflected in the changing world of the IS researcher

Lynda J. Harvey; Michael D. Myers

1995-01-01

191

Best Practices in Teaching Statistics and Research Methods in the Behavioral Sciences [with CD-ROM  

ERIC Educational Resources Information Center

This book provides a showcase for "best practices" in teaching statistics and research methods in two- and four-year colleges and universities. A helpful resource for teaching introductory, intermediate, and advanced statistics and/or methods, the book features coverage of: (1) ways to integrate these courses; (2) how to promote ethical conduct;…

Dunn, Dana S., Ed.; Smith, Randolph A., Ed.; Beins, Barney, Ed.

2007-01-01

192

The Blending of Andragogical and Pedagogical Teaching Methods in Advanced Social Work Practice Courses  

Microsoft Academic Search

Many social work educators have endorsed an andragogical approach to instruction as a means to reach the diverse student population of today's classroom, without recognizing the larger debate and concerns voiced by adult education detractors. Andragogical methods provide practical experience-related learning opportunities where self-directed learning is emphasized. Although these methods are effective and have improved social work instruction over the

Betty J. Kramer; Rachel Wrenn

1994-01-01

193

Community Analysis-Based Methods  

Microsoft Academic Search

Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities are distinctive, suggesting that their community profiles can be

Yiping Cao; Cindy H. Wu; Gary L. Andersen; Patricia A. Holden

194

Analysis method of parabolic reflector antenna  

Microsoft Academic Search

This paper describes the analysis method of parabolic reflector antenna. The analysis parameters of the antenna system are optimal antenna diameter, offset height, focal point length, feed horn type and horn size, F/D and the coordinate of offset horns. The paper deals with the method to determine design core parameters of optimal antenna diameter, feed horn type and horn size,

Jeom-Hun Lee; Seong-Pal Lee

2005-01-01

195

A Multivariate Method of Commonality Analysis.  

ERIC Educational Resources Information Center

Methods of regression commonality analysis are generalized for use in canonical correlation analysis. An actual data set (involving educators' attitudes toward death and age, locus of control, religion, and occupational role in working with terminally ill children) is employed to illustrate the extension. The method can be applied with respect to…

Thompson, Bruce; Miller, James H.

196

Short time-series microarray analysis: Methods and challenges  

PubMed Central

The detection and analysis of steady-state gene expression has become routine. Time-series microarrays are of growing interest to systems biologists for deciphering the dynamic nature and complex regulation of biosystems. Most temporal microarray data only contain a limited number of time points, giving rise to short-time-series data, which imposes challenges for traditional methods of extracting meaningful information. To obtain useful information from the wealth of short-time series data requires addressing the problems that arise due to limited sampling. Current efforts have shown promise in improving the analysis of short time-series microarray data, although challenges remain. This commentary addresses recent advances in methods for short-time series analysis including simplification-based approaches and the integration of multi-source information. Nevertheless, further studies and development of computational methods are needed to provide practical solutions to fully exploit the potential of this data.

Wang, Xuewei; Wu, Ming; Li, Zheng; Chan, Christina

2008-01-01

197

Counselling in an inner city general practice: analysis of its use and uptake.  

PubMed Central

BACKGROUND. In recognition of the emotional problems which frequently underlie somatic complaints, practices increasingly offer counselling as part of their services to patients. In an inner city practice, a combination of short term counselling, volunteer befriending, community outreach and social work services is offered as a means of responding to the full range of patients' counselling needs. AIM. This study set out to establish the use and uptake of these services. METHOD. A retrospective analysis of patients referred for counselling over one year was carried out. RESULTS. The analysis identified a broad range of emotional problems among referred patients as well as problems of a practical nature. A quarter of the patients referred failed to keep their initial appointments or to complete their contracts. One fifth of the patients were referred on for longer term counselling and/or psychotherapy. Subsequent feedback revealed that preparation of a patient before referral was an important factor affecting uptake of counselling. CONCLUSION. Early assessment of the use and uptake of such services is essential if they are to be integrated successfully and a counsellor's individual skills employed effectively.

Webber, V; Davies, P; Pietroni, P

1994-01-01

198

Statistical and methodological issues in the analysis of complex sample survey data: Practical guidance for trauma researchers  

Microsoft Academic Search

Standard methods for the analysis of survey data assume that the data arise from a simple random sample of the target population. In practice, analysts of survey data sets collected from nationally representative probability samples often pay little attention to important properties of the survey data. Standard statistical software procedures do not allow analysts to take these properties of survey

Brady T. West

2008-01-01

199

A signature analysis method for IC failure analysis  

SciTech Connect

A new method of signature analysis is presented and explained. This method of signature analysis can be based on either experiential knowledge of failure analysis, observed data, or a combination of both. The method can also be used on low numbers of failures or even single failures. It uses the Dempster-Shafer theory to calculate failure mechanism confidence. The model is developed in the paper and an example is given for its use. 9 refs., 5 figs., 9 tabs.
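The abstract states that the method uses Dempster-Shafer theory to calculate failure-mechanism confidence. A minimal sketch of Dempster's rule of combination, the core of that theory, is shown below; the failure-mechanism labels ("ESD", "EOS", "latent") and the mass values are hypothetical examples, not taken from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic belief assignments,
    each a dict mapping frozenset-of-hypotheses -> mass."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass assigned to incompatible hypotheses
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    # Renormalize by the non-conflicting mass.
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two observations, each lending partial support to failure mechanisms:
O = frozenset({"ESD", "EOS", "latent"})        # frame of discernment
m1 = {frozenset({"ESD"}): 0.6, O: 0.4}         # e.g. one visual signature
m2 = {frozenset({"ESD", "EOS"}): 0.7, O: 0.3}  # e.g. one electrical signature
m = dempster_combine(m1, m2)
```

Combining evidence this way concentrates belief on the mechanisms that both signatures support, which is how such a scheme can yield a confidence even from a single failure.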

Henderson, C.L.; Soden, J.M.

1996-10-01

200

The Practice Boundaries of Advanced Practice Nurses: An Economic and Legal Analysis  

Microsoft Academic Search

The purpose of this study is to examine the effects of State regulation that determines the extent of professional independence of advanced practice nurses (APNs). We find that in States where APNs have acquired a substantial amount of professional independence, the earnings of APNs are substantially lower, and those of physicians’ assistants (PAs) are substantially higher, than in other States.

Michael J. Dueker; Ada K. Jacox; David E. Kalist; Stephen J. Spurr

2005-01-01

201

Internet Practices of Certified Rehabilitation Counselors and Analysis of Guidelines for Ethical Internet Practices  

ERIC Educational Resources Information Center

The Internet has become an integral part of the practice of rehabilitation counseling. To identify potential ethical issues regarding the use of the Internet by counselors, two studies were conducted. In Study 1, we surveyed a national sample of rehabilitation counselors regarding their use of technology in their work and home settings. Results…

Lehmann, Ilana S.; Crimando, William

2011-01-01

202

Probabilistic structural analysis by extremum methods  

NASA Technical Reports Server (NTRS)

The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.

Nafday, Avinash M.

1990-01-01

203

A mixed-methods approach to investigating the adoption of evidence-based pain practices in nursing homes.  

PubMed

This mixed methods study examined perceived facilitators and obstacles to adopting evidence-based pain management protocols vis-a-vis documented practice changes that were measured using a chart audit tool. This analysis used data from a subgroup of four nursing homes that participated in a clinical trial. Focus group interviews with staff yielded qualitative data about perceived factors that affected their willingness and ability to use the protocols. Chart audits determined whether pain assessment and management practices changed over time in light of these reported facilitators and barriers. Reported facilitators included administrative support, staff consistency, and policy and procedure changes. Barriers were staff attitudes, regulatory issues, and provider mistrust of nurses' judgment. Overall, staff reported improvements in pain practices. These reports were corroborated by modest but significant increases in adherence to recommended practices. Change in clinical practice is complex and requires attention to both structural and process aspects of care. [Journal of Gerontological Nursing, 40(7), 52-60.]. PMID:24640959

Ersek, Mary; Jablonski, Anita

2014-07-01

204

A Practical Method for Determining the Corten-Dolan Exponent and Its Application to Fatigue Life Prediction  

NASA Astrophysics Data System (ADS)

Based on the derivation and calculation of the Corten-Dolan exponent d, a practical method of determining its value is proposed. This exponent depends not only upon the material, but also upon the load spectrum; its value is therefore obtained from a function that decreases with increasing stress amplitude. The exponent was investigated through analysis of fatigue damage evolution to determine its parameters. The proposed method has been validated against experimental data from the literature. Use of the modified Corten-Dolan model significantly improves its life prediction capability compared to the conventional model, in which the exponent is assumed to be constant.
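For context, the standard Corten-Dolan cumulative-damage form that this exponent enters can be sketched as below. This is the conventional constant-d model, not the paper's modified version (which would supply d from a decreasing function of stress amplitude); the spectrum and material values are invented for illustration.

```python
def corten_dolan_life(n1, sigma1, spectrum, d):
    """Corten-Dolan prediction of total fatigue life under a block spectrum:
        N = N1 / sum_i(alpha_i * (sigma_i / sigma1) ** d)
    n1: fatigue life at the highest stress level sigma1;
    spectrum: list of (alpha_i, sigma_i), with alpha_i the fraction of
    cycles applied at stress sigma_i; d: Corten-Dolan exponent."""
    return n1 / sum(a * (s / sigma1) ** d for a, s in spectrum)

# Illustrative two-level spectrum: 20% of cycles at the peak stress,
# 80% at a lower stress, with a fixed exponent d = 5.
life = corten_dolan_life(n1=1e5, sigma1=400.0,
                         spectrum=[(0.2, 400.0), (0.8, 300.0)], d=5.0)
```

A spectrum milder than constant peak-stress loading yields a life longer than n1, as expected.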

Zhu, Shun-Peng; Huang, Hong-Zhong; Liu, Yu; He, Li-Ping; Liao, Qiang

2012-06-01

205

Common Goals for the Science and Practice of Behavior Analysis: A Response to Critchfield  

ERIC Educational Resources Information Center

In his scholarly and thoughtful article, "Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis," Critchfield (2011) discussed the science-practice frictions to be expected in any professional organization that attempts to combine these interests. He suggested that the Association for Behavior Analysis

Schneider, Susan M.

2012-01-01

206

Accuracy analysis of Stewart platform based on interval analysis method  

NASA Astrophysics Data System (ADS)

A Stewart platform is introduced in the 500 m aperture spherical radio telescope (FAST) as an accuracy-adjustable mechanism for feed receivers. Accuracy analysis is the basis of accuracy design, but a rapid and effective accuracy analysis method for parallel manipulators is still needed. In order to enhance solution efficiency, an interval analysis method (IA method) is introduced to solve the terminal error bound of the Stewart platform, with a detailed solution path. Taking a terminal pose of the Stewart platform in FAST as an example, the terminal error is solved by the Monte Carlo method (MC method) in 4 980 s, by the stochastic mathematical method (SM method) in 0.078 s, and by the IA method in 2.203 s. Compared with the MC method, the terminal error from the SM method leads to a 20% underestimate, while the IA method envelops the real error bound of the Stewart platform. This indicates that the IA method outperforms the other two methods by providing quick calculations and enveloping the real error bound. According to the given structural error of the dimension parameters of the Stewart platform, the IA method gives a maximum position error of 19.91 mm and a maximum orientation error of 0.534°, which suggests that the IA method can be used for accuracy design of the Stewart platform in FAST. The IA method presented is a rapid and effective accuracy analysis method for the Stewart platform.
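The enveloping property the abstract attributes to interval analysis can be illustrated with a few lines of interval arithmetic. This toy class and the toleranced parameters are generic illustrations, not the paper's Stewart-platform kinematic model.

```python
class Interval:
    """Minimal interval arithmetic: a guaranteed enclosure of a quantity."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product bound must consider all endpoint combinations.
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

# Evaluating a function of toleranced parameters in interval arithmetic
# yields bounds guaranteed to contain every possible outcome, whereas a
# finite Monte Carlo sample can only underestimate the true range.
a = Interval(0.99, 1.01)   # a parameter with +/-1% tolerance
b = Interval(1.99, 2.01)
y = a * b + a              # enclosure of a*b + a over all tolerances
```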

Yao, Rui; Zhu, Wenbai; Huang, Peng

2013-01-01

207

Degradation of learned skills: Effectiveness of practice methods on visual approach and landing skill retention  

NASA Technical Reports Server (NTRS)

Flight control and procedural task skill degradation, and the effectiveness of retraining methods were evaluated for a simulated space vehicle approach and landing under instrument and visual flight conditions. Fifteen experienced pilots were trained and then tested after 4 months either without the benefits of practice or with static rehearsal, dynamic rehearsal or with dynamic warmup practice. Performance on both the flight control and procedure tasks degraded significantly after 4 months. The rehearsal methods effectively countered procedure task skill degradation, while dynamic rehearsal or a combination of static rehearsal and dynamic warmup practice was required for the flight control tasks. The quality of the retraining methods appeared to be primarily dependent on the efficiency of visual cue reinforcement.

Sitterley, T. E.; Zaitzeff, L. P.; Berge, W. A.

1972-01-01

208

78 FR 11611 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...  

Federal Register 2010, 2011, 2012, 2013

...Manufacturing Practice and Hazard Analysis and Risk-Based Preventive...Manufacturing Practice and Hazard Analysis and Risk-Based Preventive...Federal Food, Drug, and Cosmetic Act to establish and implement hazard analysis and risk-based...

2013-02-19

209

Methods for Analysis of Outdoor Performance Data (Presentation)  

SciTech Connect

The ability to accurately predict power delivery over the course of time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted to power and secondly how this relationship develops over time. The accurate knowledge of power decline over time, also known as degradation rates, is essential and important to all stakeholders--utility companies, integrators, investors, and scientists alike. Different methods to determine degradation rates and discrete versus continuous data are presented, and some general best practice methods are outlined. In addition, historical degradation rates and some preliminary analysis with respect to climate are given.

Jordan, D.

2011-02-01

210

Applications of Automation Methods for Nonlinear Fracture Test Analysis  

NASA Technical Reports Server (NTRS)

Using automated and standardized computer tools to calculate the pertinent test result values has several advantages such as: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form, 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards, 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results, and 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F 2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

Allen, Phillip A.; Wells, Douglas N.

2013-01-01

211

On exploratory factor analysis: a review of recent evidence, an assessment of current practice, and recommendations for future use.  

PubMed

Exploratory factor analysis (hereafter, factor analysis) is a complex statistical method that is integral to many fields of research. Using factor analysis requires researchers to make several decisions, each of which affects the solutions generated. In this paper, we focus on five major decisions that are made in conducting factor analysis: (i) establishing how large the sample needs to be, (ii) choosing between factor analysis and principal components analysis, (iii) determining the number of factors to retain, (iv) selecting a method of data extraction, and (v) deciding upon the methods of factor rotation. The purpose of this paper is threefold: (i) to review the literature with respect to these five decisions, (ii) to assess current practices in nursing research, and (iii) to offer recommendations for future use. The literature reviews illustrate that factor analysis remains a dynamic field of study, with recent research having practical implications for those who use this statistical method. The assessment was conducted on 54 factor analysis (and principal components analysis) solutions presented in the results sections of 28 papers published in the 2012 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. The main findings from the assessment were that researchers commonly used (a) participants-to-items ratios for determining sample sizes (used for 43% of solutions), (b) principal components analysis (61%) rather than factor analysis (39%), (c) the eigenvalues greater than one rule and scree tests to decide upon the numbers of factors/components to retain (61% and 46%, respectively), (d) principal components analysis and unweighted least squares as methods of data extraction (61% and 19%, respectively), and (e) the Varimax method of rotation (44%). In general, well-established, but out-dated, heuristics and practices informed decision making with respect to the performance of factor analysis in nursing studies.
Based on the findings from factor analysis research, it seems likely that the use of such methods may have had a material, adverse effect on the solutions generated. We offer recommendations for future practice with respect to each of the five decisions discussed in this paper. PMID:24183474
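The eigenvalues-greater-than-one (Kaiser) rule that the review found still dominating practice can be sketched in a few lines. The function name, the use of numpy, and the synthetic two-factor data below are illustrative assumptions, not material from the paper.

```python
import numpy as np

def kaiser_rule(data):
    """Number of components retained by the eigenvalues-greater-than-one
    (Kaiser) rule, applied to the correlation matrix of the items --
    one of the out-dated heuristics the review flags."""
    corr = np.corrcoef(data, rowvar=False)      # items in columns
    eigenvalues = np.linalg.eigvalsh(corr)
    return int(np.sum(eigenvalues > 1.0))

# Synthetic data: 6 observed items driven by 2 latent factors plus noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))
loadings = rng.normal(size=(2, 6))
items = latent @ loadings + 0.5 * rng.normal(size=(200, 6))
k = kaiser_rule(items)
```

As the review notes, this rule is a heuristic: simulation work has long shown it can over- or under-extract relative to methods such as parallel analysis.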

Gaskin, Cadeyrn J; Happell, Brenda

2014-03-01

212

Education Policy as a Practice of Power: Theoretical Tools, Ethnographic Methods, Democratic Options  

ERIC Educational Resources Information Center

This article outlines some theoretical and methodological parameters of a critical practice approach to policy. The article discusses the origins of this approach, how it can be uniquely adapted to educational analysis, and why it matters--not only for scholarly interpretation but also for the democratization of policy processes as well. Key to…

Levinson, Bradley A. U.; Sutton, Margaret; Winstead, Teresa

2009-01-01

213

An evaluation of fracture analysis methods  

NASA Technical Reports Server (NTRS)

The results of an experimental and predictive round robin on the applications of fracture analysis methods are presented. The objective of the round robin was to verify whether fracture analysis methods currently in use can or cannot predict failure loads on complex structural components containing cracks. Fracture results from tests on a number of compact specimens were used to make the predictions. The accuracy of the prediction methods was evaluated in terms of the variation in the ratio of predicted to experimental failure loads, and the prediction methods are ranked in order of minimum standard error. A range of applicability of the different methods was also considered in assessing their usefulness. For 7075-T651 aluminum alloy, the best methods were: the effective K sub R curve; the critical crack-tip opening displacement (CTOD) criterion using a finite element analysis; and the K sub R curve with the Dugdale model. For the 2024-T351 aluminum alloy, the best methods included: the two-parameter fracture criterion (TPFC); the CTOD parameter using finite element analysis; the K-curve with the Dugdale model; the deformation plasticity failure assessment diagram (DPFAD); and the effective K sub R curve with a limit load condition. For 304 stainless steel, the best methods were: limit load analysis; the CTOD criterion using finite-element analysis; TPFC; and DPFAD. Some sample experimental results are given in an appendix.

Newman, J. C., Jr.

1985-01-01

214

Practical implementation of dynamic methods for measuring atomic force microscope cantilever spring constants  

NASA Astrophysics Data System (ADS)

Measurement of atomic force microscope cantilever spring constants (k) is essential for many of the applications of this versatile instrument. Numerous techniques to measure k have been proposed. Among these, we found the thermal noise and Sader methods to be commonly applicable and relatively user-friendly, providing an in situ, non-destructive, fast measurement of k for a cantilever independent of its material or coating. Such advantages recommend these methods for widespread use. An impediment thereto is the significant complication involved in the initial implementation of the methods. Some details of the implementation are discussed in publications, while others are left unsaid. Here we present a complete, cohesive, and practically oriented discussion of the implementation of both the thermal noise and Sader methods of measuring cantilever spring constants. We review the relevant theory and discuss practical experimental means for determining the required quantities. We then present results that compare measurements of k by these two methods over nearly two orders of magnitude, and we discuss the likely origins of both statistical and systematic errors for both methods. In conclusion, we find that the two methods agree to within an average of 4% over the wide range of cantilevers measured. Given that the methods derive from distinct physics we find the agreement a compelling argument in favour of the accuracy of both, suggesting them as practical standards for the field.
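The thermal noise method mentioned above rests on the equipartition theorem; a minimal sketch of that core relation is below. Real implementations (including the one discussed in the paper) add mode-shape and detector-sensitivity correction factors, which are omitted here:

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K

def thermal_noise_k(mean_square_deflection_m2, temperature_k=295.0):
    """Equipartition estimate of a cantilever spring constant:
    (1/2) k <z^2> = (1/2) k_B T  =>  k = k_B T / <z^2>.
    Mode-shape and optical-lever corrections (factors near unity)
    are deliberately omitted in this sketch."""
    return K_B * temperature_k / mean_square_deflection_m2

# A cantilever fluctuating with ~0.1 nm RMS deflection at room temperature:
k = thermal_noise_k((0.1e-9) ** 2)  # <z^2> = 1e-20 m^2
print(f"k ≈ {k:.3f} N/m")
```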

Cook, S. M.; Lang, K. M.; Chynoweth, K. M.; Wigton, M.; Simmonds, R. W.; Schäffer, T. E.

2006-05-01

215

A spatial analysis of the expanding roles of nurses in general practice  

PubMed Central

Background Changes to the workforce and organisation of general practice are occurring rapidly in response to the Australian health care reform agenda, and the changing nature of the medical profession. In particular, the last five years has seen the rapid introduction and expansion of a nursing workforce in Australian general practices. This potentially creates pressures on current infrastructure in general practice. Method This study used a mixed methods, ‘rapid appraisal’ approach involving observation, photographs, and interviews. Results Nurses utilise space differently to GPs, and this is part of the diversity they bring to the general practice environment. At the same time their roles are partly shaped by the ways space is constructed in general practices. Conclusion The fluidity of nursing roles in general practice suggests that nurses require a versatile space in which to maximize their role and contribution to the general practice team.

2012-01-01

216

Pharmacogenetics: practices and opportunities for study design and data analysis.  

PubMed

Pharmacogenetics (PGx) is increasingly used as a way to target treatment to patients who are most likely to benefit. To date, PGx has shown clinical significance across a few applications but widespread use has been limited by the need for further technical, methodological and practical advances and for educating clinical researchers on the value of PGx. Here, I describe the current scope of PGx research, including recent contributions to prospective study design. A case study is included to demonstrate the limitations of current practice and to describe some practical steps for improving the chances of identifying genetic effects. Additionally, I describe some opportunities for the integration and application of disparate data sources in exploratory PGx research. PMID:21875683

Flynn, Aiden A

2011-10-01

217

Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture  

SciTech Connect

Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
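The weighting-and-aggregation step described can be caricatured in a few lines. This is an illustrative aggregation rule only, not the patented algorithm:

```python
def hypothesis_score(evidence):
    """Combine weighted evidence into a crude support score in [-1, 1].
    `evidence` is a list of (weight, supports) pairs, where `supports`
    is True if the item supports the hypothesis and False if it refutes
    it. Positive scores indicate net support."""
    total = sum(w for w, _ in evidence)
    if total == 0:
        return 0.0
    signed = sum(w if s else -w for w, s in evidence)
    return signed / total

# Two supporting indicators and one weaker refuting indicator:
score = hypothesis_score([(0.9, True), (0.6, True), (0.3, False)])
print(f"support score: {score:+.2f}")
```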

Sanfilippo, Antonio P. (Richland, WA); Cowell, Andrew J. (Kennewick, WA); Gregory, Michelle L. (Richland, WA); Baddeley, Robert L. (Richland, WA); Paulson, Patrick R. (Pasco, WA); Tratz, Stephen C. (Richland, WA); Hohimer, Ryan E. (West Richland, WA)

2012-03-20

218

Which practices are high antibiotic prescribers? A cross-sectional analysis  

PubMed Central

Background Substantial variation in antibiotic prescribing rates between general practices persists, but remains unexplained at national level. Aim To establish the degree of variation in antibiotic prescribing between practices in England and identify the characteristics of practices that prescribe higher volumes of antibiotics. Design of study Cross-sectional study. Setting 8057 general practices in England. Method A dataset was constructed containing data on standardised antibiotic prescribing volumes, practice characteristics, patient morbidity, ethnicity, social deprivation, and Quality and Outcomes Framework achievement (2004–2005). Data were analysed using multiple regression modelling. Results There was a twofold difference in standardised antibiotic prescribing volumes between practices in the 10th and 90th centiles of the sample (0.48 versus 0.95 antibiotic prescriptions per antibiotic STAR-PU [Specific Therapeutic group Age-sex weightings-Related Prescribing Unit]). A regression model containing nine variables explained 17.2% of the variance in antibiotic prescribing. Practice location in the north of England was the strongest predictor of high antibiotic prescribing. Practices serving populations with greater morbidity and a higher proportion of white patients prescribed more antibiotics, as did practices with shorter appointments, non-training practices, and practices with higher proportions of GPs who were male, >45 years of age, and qualified outside the UK. Conclusion Practice and practice population characteristics explained about one-sixth of the variation in antibiotic prescribing nationally. Consultation-level and qualitative studies are needed to help further explain these findings and improve our understanding of this variation.
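The "explained 17.2% of the variance" figure above is the regression model's coefficient of determination (R²). A minimal sketch of that statistic, using hypothetical prescribing volumes rather than the study's data:

```python
def r_squared(observed, fitted):
    """Coefficient of determination: the share of variance in `observed`
    that the model predictions `fitted` explain (e.g. 0.172 = 17.2%)."""
    mean = sum(observed) / len(observed)
    ss_tot = sum((y - mean) ** 2 for y in observed)
    ss_res = sum((y - f) ** 2 for y, f in zip(observed, fitted))
    return 1.0 - ss_res / ss_tot

# Hypothetical standardised prescribing volumes vs. model predictions:
obs = [0.48, 0.60, 0.72, 0.95]
fit = [0.55, 0.62, 0.70, 0.88]
r2 = r_squared(obs, fit)
print(f"R^2 = {r2:.3f}")
```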

Wang, Kay Yee; Seed, Paul; Schofield, Peter; Ibrahim, Saima; Ashworth, Mark

2009-01-01

219

Practical method for evaluating the visibility of moire patterns for CRT design  

NASA Astrophysics Data System (ADS)

The high-resolution CRT displays used for computer monitors and high-performance TVs often produce a pattern of bright and dark stripes on the screen called a moire pattern. Eliminating the moire is an important consideration in CRT design. The objective of this study is to provide a practical method for estimating and evaluating a moire pattern in terms of its visibility to human vision. On the basis of a mathematical model of moire generation, precise values of the period and the intensity of a moire are calculated from actual data on the electron beam profile and the transmittance distribution of the shadow mask apertures. The visibility of the moire is evaluated by plotting the calculation results on the contrast-period plane, which is divided into visible and invisible moire-pattern regions based on the results of psychological tests. Not only fundamental design parameters, such as the shadow mask pitch and the scanning line pitch, but also details of the electron beam profile, such as distortion or asymmetry, can be examined. In addition to the analysis, an image simulation of the moire using image memory is also available.

Shiramatsu, Naoki; Tanigawa, Masashi; Iwata, Shuji

1995-04-01

220

Sensitive Fluorometric Method for Tissue Tocopherol Analysis.  

National Technical Information Service (NTIS)

A sensitive, highly reproducible method for tissue tocopherol analysis that combines saponification in the presence of large amounts of ascorbic acid to remove interfering substances, extraction of the nonsaponifiable lipids with hexane, and fluorometric ...

S. L. Taylor; M. P. Lamden; A. L. Tappel

1976-01-01

221

Method of analysis and quality-assurance practices for determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry at the U.S. Geological Survey California District Organic Chemistry Laboratory, 1996-99  

USGS Publications Warehouse

A method of analysis and quality-assurance practices were developed to study the fate and transport of pesticides in the San Francisco Bay-Estuary by the U.S. Geological Survey. Water samples were filtered to remove suspended-particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide and the pesticides were eluted with three cartridge volumes of hexane:diethyl ether (1:1) solution. The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for pesticides ranged from 0.002 to 0.025 microgram per liter for 1-liter samples. Recoveries ranged from 44 to 140 percent for 25 pesticides in samples of organic-free reagent water and Sacramento-San Joaquin Delta and Suisun Bay water fortified at 0.05 and 0.50 microgram per liter. The estimated holding time for pesticides after extraction on C-8 solid-phase extraction cartridges ranged from 10 to 257 days.
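The recovery figures quoted (44 to 140 percent) are spike recoveries from fortified samples; the arithmetic is simply measured over spiked concentration. A minimal illustration with a hypothetical measurement:

```python
def percent_recovery(measured_ug_per_l, spiked_ug_per_l):
    """Spike recovery for a fortified sample, the quality-assurance
    statistic behind the reported 44-140 percent range."""
    return 100.0 * measured_ug_per_l / spiked_ug_per_l

# A sample fortified at 0.50 ug/L from which 0.42 ug/L is recovered:
rec = percent_recovery(0.42, 0.50)
print(f"recovery: {rec:.0f}%")
```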

Crepeau, Kathryn L.; Baker, Lucian M.; Kuivila, Kathryn M.

2000-01-01

222

Mixed-methods research in pharmacy practice: basics and beyond (part 1).  

PubMed

This is the first of two papers which explore the use of mixed-methods research in pharmacy practice. In an era of evidence-based medicine and policy, high-quality research evidence is essential for the development of effective pharmacist-led services. Over the past decade, the use of mixed-methods research has become increasingly common in healthcare, although to date its use has been relatively limited in pharmacy practice research. In this article, the basic concepts of mixed-methods research including its definition, typologies and advantages in relation to pharmacy practice research are discussed. Mixed-methods research brings together qualitative and quantitative methodologies within a single study to answer or understand a research problem. There are a number of mixed-methods designs available, but the selection of an appropriate design must always be dictated by the research question. Importantly, mixed-methods research should not be seen as a 'tool' to collect qualitative and quantitative data; rather, there should be some degree of 'integration' between the two data sets. If conducted appropriately, mixed-methods research has the potential to generate quality research evidence by combining strengths and overcoming the respective limitations of qualitative and quantitative methodologies. PMID:23418918

Hadi, Muhammad Abdul; Alldred, David Phillip; Closs, S José; Briggs, Michelle

2013-10-01

223

Urinary tract infection in general practice: Direct antibiotic sensitivity testing as a potential diagnostic method  

Microsoft Academic Search

Direct Antibiotic Sensitivity Testing (DST) is a rapid means of diagnosing urinary tract infection (UTI) and obtaining antibiotic sensitivity patterns of the infecting organisms. In this study 227 urine samples from general practice were analysed using this technique and the results obtained were compared with those obtained using the standard laboratory method. DST was shown to be 94.6% sensitive, and

P. G. Scully; B. O’Shea; K. P. Flanagan; F. R. Falkiner

1990-01-01

224

Methods for estimating transient performance of practical current transformers for relaying  

Microsoft Academic Search

Since engineers learned, about 50 years ago, that a current transformer has real difficulty with an initially fully offset transient current, they have written many papers about the difficulty without finding a solution for most practical relay problems. This paper offers new data and methods which are intended to lead the relay engineer to solutions to many problems and to

E. E. Conner; E. C. Wentz; D. W. Allen

1975-01-01

225

Econometric Methods for Causal Evaluation of Education Policies and Practices: A Non-Technical Guide  

ERIC Educational Resources Information Center

Education policy-makers and practitioners want to know which policies and practices can best achieve their goals. But research that can inform evidence-based policy often requires complex methods to distinguish causation from accidental association. Avoiding econometric jargon and technical detail, this paper explains the main idea and intuition…

Schlotter, Martin; Schwerdt, Guido; Woessmann, Ludger

2011-01-01

226

National Survey of Psychologists' Test Feedback Training, Supervision, and Practice: A Mixed Methods Study  

Microsoft Academic Search

In this empirical, mixed methods study, we explored test feedback training, supervision, and practice among psychologists, focusing specifically on how feedback is provided to clients and whether feedback skills are taught in graduate programs. Based on a 48.5% return rate, this national survey of clinical, counseling, and school psychologists suggests psychologists provide test feedback to clients, but inconsistently. Most respondents,

Kyle T. Curry; William E. Hanson

2010-01-01

227

Passive Sampling Methods for Contaminated Sediments: Practical Guidance for Selection, Calibration, and Implementation  

EPA Science Inventory

This article provides practical guidance on the use of passive sampling methods (PSMs) that target the freely dissolved concentration (Cfree) for improved exposure assessment of hydrophobic organic chemicals in sediments. Primary considerations for selecting a PSM for a specific a...

228

Using the Patient as Teacher: A Training Method for Family Practice Residents in Behavioral Science  

Microsoft Academic Search

Since the inception of family medicine as a specialty in allopathy and osteopathy in 1969 and 1973, respectively, there has been a need to develop integrative approaches of teaching behavioral science concepts without violating the scope of practice limitations between the fields. We describe a collaborative training method by which we attempt to achieve this balance. Residents referring patients for

Janis L. Lewis; DeVon R. Stokes; Lawrence R. Fischetti; Aaron L. Rutledge

1988-01-01

229

Econometric Methods for Causal Evaluation of Education Policies and Practices: A Non-Technical Guide  

Microsoft Academic Search

Education policy-makers and practitioners want to know which policies and practices can best achieve their goals. But research that can inform evidence-based policy often requires complex methods to distinguish causation from accidental association. Avoiding econometric jargon and technical detail, this paper explains the main idea and intuition of leading empirical strategies devised to identify causal impacts and illustrates their use

Martin Schlotter; Guido Schwerdt; Ludger Woessmann

2011-01-01

230

Developing a clinical hypermedia corpus: experiences from the use of a practice-centered method.  

PubMed Central

This paper outlines a practice-centered method for creation of a hypermedia corpus. It also describes experiences with creating such a corpus of information to support interprofessional work at a Primary Healthcare Center. From these experiences, a number of basic issues regarding information systems development within medical informatics will be discussed.

Timpka, T.; Nyce, J. M.; Sjoberg, C.; Hedblom, P.; Lindblom, P.

1992-01-01

231

Practical use of three-dimensional inverse method for compressor blade design  

Microsoft Academic Search

The practical utility of a three-dimensional inverse viscous method is demonstrated by carrying out a design modification of a first-stage rotor in an industrial compressor. In this design modification study, the goal is to improve the efficiency of the original blade while retaining its overall aerodynamic, structural, and manufacturing characteristics. By employing a simple modification to the blade pressure loading

S. Damle; T. Dang; J. Stringham; E. Razinsky

1999-01-01

232

A qualitative analysis of case managers' use of harm reduction in practice.  

PubMed

The harm reduction approach has become a viable framework within the field of addictions, yet there is limited understanding about how this approach is implemented in practice. For people who are homeless and have co-occurring psychiatric and substance use disorders, the Housing First model has shown promising results in employing such an approach. This qualitative study utilizes ethnographic methods to explore case managers' use of harm reduction within Housing First with a specific focus on the consumer-provider relationship. Analysis of observational data and in-depth interviews with providers and consumers revealed how communication between the two regarding the consumer's substance use interacted with the consumer-provider relationship. From these findings emerged a heuristic model of harm reduction practice that highlighted the profound influence of relationship quality on the paths of communication regarding substance use. This study provides valuable insight into how harm reduction is implemented in clinical practice that ultimately has public health implications in terms of more effectively addressing high rates of addiction that contribute to homelessness and health disparities. PMID:22520277

Tiderington, Emmy; Stanhope, Victoria; Henwood, Benjamin F

2013-01-01

233

[Framework analysis method in qualitative research].  

PubMed

In recent years a number of qualitative research methods have gained popularity within the health care arena. Despite this popularity, different qualitative analysis methods pose many challenges to most researchers. The present paper responds to needs expressed in recent Chinese medicine research. It focuses mainly on the concepts, nature, and application of framework analysis, especially on how to use it, in order to assist newcomers to Chinese medicine research in engaging with the methodology. PMID:24941857

Liao, Xing; Liu, Jian-ping; Robison, Nicola; Xie, Ya-ming

2014-05-01

234

Airbreathing hypersonic vehicle design and analysis methods  

NASA Technical Reports Server (NTRS)

The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

1996-01-01

235

A Practical Test Method for Mode I Fracture Toughness of Adhesive Joints with Dissimilar Substrates  

SciTech Connect

A practical test method for determining the mode I fracture toughness of adhesive joints with dissimilar substrates will be discussed. The test method is based on the familiar Double Cantilever Beam (DCB) specimen geometry, but overcomes limitations in existing techniques that preclude their use when testing joints with dissimilar substrates. The test method is applicable to adhesive joints where the two bonded substrates have different flexural rigidities due to geometric and/or material considerations. Two specific features discussed are the use of backing beams to prevent substrate damage and a compliance matching scheme to achieve symmetric loading conditions. The procedure is demonstrated on a modified DCB specimen comprised of SRIM composite and thin-section, e-coat steel substrates bonded with an epoxy adhesive. Results indicate that the test method provides a practical means of characterizing the mode I fracture toughness of joints with dissimilar substrates.
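The abstract gives no formulas; for context, the familiar simple-beam-theory expression for the mode I energy release rate of a *symmetric* DCB specimen is sketched below. This is a textbook baseline only; the paper's backing-beam and compliance-matching scheme for dissimilar substrates is not reproduced here, and the numbers are illustrative:

```python
def mode_i_energy_release_rate(P, a, b, h, E):
    """Simple beam theory for a symmetric DCB specimen:
    G_I = 12 P^2 a^2 / (E b^2 h^3), with load P (N), crack length a (m),
    specimen width b (m), arm thickness h (m), and modulus E (Pa)."""
    return 12.0 * P ** 2 * a ** 2 / (E * b ** 2 * h ** 3)

# Illustrative values: 100 N load, 50 mm crack, 25 mm wide,
# 3 mm thick aluminum-like arms (E = 70 GPa).
G = mode_i_energy_release_rate(P=100.0, a=0.05, b=0.025, h=0.003, E=70e9)
print(f"G_I ≈ {G:.0f} J/m^2")
```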

Boeman, R.G.; Erdman, D.L.; Klett, L.B.; Lomax, R.D.

1999-09-27

236

An Analysis of Farm Injuries and Safety Practices in Mississippi  

Microsoft Academic Search

In Mississippi, agriculture is the most dangerous industry, employing over 30% of the state's workforce. Records from the Mississippi Cooperative Extension Service indicated that 18 tractor deaths occurred in 1997, a new all-time record. Also, there were two additional deaths involving other farm machinery. This study was designed to determine the magnitude of farm injuries, safety practices, and educational programs

Carey L. Ford; Terence L. Lynch

2000-01-01

237

An Analysis of Teacher Practices with Toddlers during Social Conflicts  

ERIC Educational Resources Information Center

Employing a quasi-experimental design, this pilot study on teacher practices with toddlers during social conflicts was conducted in the southeastern USA. Four child-care classrooms, teachers (n = 8) and children (n = 51) were assessed with the Classroom Assessment Scoring System -- Toddler [CLASS-Toddler; La Paro, K., Hamre, B. K., & Pianta,…

Gloeckler, Lissy R.; Cassell, Jennifer M.; Malkus, Amy J.

2014-01-01

238

Analysis of factors influencing project cost estimating practice  

Microsoft Academic Search

Although extensive research has been undertaken on factors influencing the decision to tender and mark-up and tender price determination for construction projects, very little of this research contains information appropriate to the factors involved in costing construction projects. The object of this study was to gain an understanding of the factors influencing contractors' cost estimating practice. This was achieved through

Akintola Akintoye

2000-01-01

239

Initial Public Offerings: An Analysis of Theory and Practice  

Microsoft Academic Search

We survey 336 chief financial officers (CFOs) to compare practice to theory in the areas of initial public offering (IPO) motivation, timing, underwriter selection, underpricing, signaling, and the decision to remain private. We find the primary motivation for going public is to facilitate acquisitions. CFOs base IPO timing on overall market conditions, are well informed regarding expected underpricing, and feel

JAMES C. BRAU; STANLEY E. FAWCETT

2006-01-01

240

Mentoring Beginning Teachers in Secondary Schools: An Analysis of Practice  

ERIC Educational Resources Information Center

The conditions that promote best practice in the mentoring of beginning teachers in secondary schools are explored in this paper in relation to the experiential model of learning put forward by Kolb [(1984). "Experiential learning: Experience as the source of learning and development." New York: Prentice-Hall]. The underpinning processes of this…

Harrison, Jennifer; Dymoke, Sue; Pell, Tony

2006-01-01

241

Practical Implementation of New Particle Tracking Method to the Real Field of Groundwater Flow and Transport.  

PubMed

In articles published in 2009 and 2010, Suk and Yeh reported the development of an accurate and efficient particle tracking algorithm for simulating a path line under complicated unsteady flow conditions, using a range of elements within finite elements in multidimensions. Here two examples, an aquifer storage and recovery (ASR) example and a landfill leachate migration example, are examined to enhance the practical implementation of the proposed particle tracking method, known as Suk's method, to a real field of groundwater flow and transport. Results obtained by Suk's method are compared with those obtained by Pollock's method. Suk's method produces superior tracking accuracy, which suggests that Suk's method can describe more accurately various advection-dominated transport problems in a real field than existing popular particle tracking methods, such as Pollock's method. To illustrate the wide and practical applicability of Suk's method to random-walk particle tracking (RWPT), the original RWPT has been modified to incorporate Suk's method. Performance of the modified RWPT using Suk's method is compared with the original RWPT scheme by examining the concentration distributions obtained by the modified RWPT and the original RWPT under complicated transient flow systems. PMID:22476629
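For context on the random-walk particle tracking (RWPT) scheme that Suk's method was grafted onto, the standard 1-D update combines deterministic advection with a Gaussian jump whose variance reproduces Fickian dispersion. The sketch below is generic RWPT with illustrative parameters, not the paper's examples:

```python
import random

def rwpt_track(n_particles, v, D, dt, n_steps, seed=1):
    """Track a cloud of particles from the origin with the standard
    RWPT update: x_new = x + v*dt + N(0, sqrt(2*D*dt)).
    The drift term is advection; the Gaussian increment reproduces
    Fickian dispersion with coefficient D."""
    rng = random.Random(seed)
    sigma = (2.0 * D * dt) ** 0.5
    xs = [0.0] * n_particles
    for _ in range(n_steps):
        xs = [x + v * dt + rng.gauss(0.0, sigma) for x in xs]
    return xs

# Illustrative run: 2000 particles, v = 1 m/s, D = 0.01 m^2/s.
cloud = rwpt_track(n_particles=2000, v=1.0, D=0.01, dt=0.1, n_steps=50)
centre = sum(cloud) / len(cloud)
print(f"plume centre ≈ {centre:.2f} m (pure advection predicts 5.00 m)")
```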

Suk, Heejun

2012-01-01

242

Laboratory theory and methods for sediment analysis  

USGS Publications Warehouse

The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help ensure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.

Guy, Harold P.

1969-01-01

243

Optimisation of Lime-Soda process parameters for reduction of hardness in aqua-hatchery practices using Taguchi methods.  

PubMed

This paper presents the optimisation of Lime-Soda process parameters for the reduction of hardness in aqua-hatchery practices in the context of M. rosenbergii. Fresh water used in the development of fisheries needs to be of suitable quality, and the lack of desirable quality in available fresh water is generally the confronting restraint. On the Indian subcontinent, groundwater is the only source of raw water; it has a varying degree of hardness and is thus unsuitable for fresh-water prawn hatchery practices (M. rosenbergii). In order to make use of hard water in the context of aqua-hatchery, the Lime-Soda process has been recommended. The efficacy of the various process parameters, such as lime, soda ash, and detention time, on the reduction of hardness needs to be examined. This paper proposes to determine the parameter settings for the CIFE well water, which has a high degree of hardness, using the Taguchi experimental design method. Taguchi orthogonal arrays, the signal-to-noise ratio, and analysis of variance (ANOVA) were applied to determine the chemical dosages and to analyse their effect on hardness reduction. Tests carried out with optimal levels of the Lime-Soda process parameters confirmed the efficacy of the Taguchi optimisation method. Emphasis has been placed on optimising the chemical doses required to reduce total hardness using the Taguchi method and ANOVA, to suit the available raw-water quality for aqua-hatchery practices, especially for the fresh-water prawn M. rosenbergii. PMID:24749379
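Because residual hardness is a "smaller is better" response, the Taguchi signal-to-noise ratio used in such studies takes the form S/N = -10 log10(mean of y²), and the factor level with the largest S/N is preferred. A minimal sketch with hypothetical hardness replicates (not the study's measurements):

```python
import math

def sn_smaller_is_better(values):
    """Taguchi signal-to-noise ratio for a 'smaller is better' response
    such as residual hardness (mg/L): S/N = -10 * log10(mean of y^2).
    Larger (less negative) S/N indicates a better factor level."""
    return -10.0 * math.log10(sum(y * y for y in values) / len(values))

# Hypothetical residual-hardness replicates at two lime dosage levels:
low_dose = [220.0, 240.0]
high_dose = [90.0, 110.0]
best = max([("low", sn_smaller_is_better(low_dose)),
            ("high", sn_smaller_is_better(high_dose))],
           key=lambda level: level[1])
print(f"preferred level: {best[0]} (S/N = {best[1]:.1f} dB)")
```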

Yavalkar, S P; Bhole, A G; Babu, P V Vijay; Prakash, Chandra

2012-04-01

244

Scientometric analysis of geostatistics using multivariate methods  

Microsoft Academic Search

Multivariate methods were successfully employed in a comprehensive scientometric analysis of geostatistics research, and the publications data for this research came from the Science Citation Index and spanned the period from 1967 to 2005. Hierarchical cluster analysis (CA) was used in publication patterns based on different types of variables. A backward discriminant analysis (DA) with appropriate statistical tests was then

Feng Zhou; Huai-cheng Guo; Yuh-shan Ho; Chao-zhong Wu

2007-01-01

245

Practical guidelines to select and scale earthquake records for nonlinear response history analysis of structures  

USGS Publications Warehouse

Earthquake engineering practice is increasingly using nonlinear response history analysis (RHA) to demonstrate performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. Presented herein is a modal-pushover-based scaling (MPS) method to scale ground motions for use in nonlinear RHA of buildings and bridges. In the MPS method, the ground motions are scaled to match (to a specified tolerance) a target value of the inelastic deformation of the first-'mode' inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-'mode' pushover analysis. Appropriate for first-'mode' dominated structures, this approach is extended for structures with significant contributions of higher modes by considering elastic deformation of the second-'mode' SDF system in selecting a subset of the scaled ground motions. Based on results presented for two bridges, covering single- and multi-span 'ordinary standard' bridge types, and six buildings, covering low-, mid-, and tall building types in California, the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
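As a simplified illustration of the scaling idea: for an *elastic* SDF system the response is linear in the record, so the scale factor reduces to target deformation over unscaled peak deformation. (The actual MPS method matches a target *inelastic* deformation, which requires iteration; the record below is synthetic.)

```python
import math

def peak_sdf_deformation(accel, dt, Tn, zeta=0.05):
    """Peak deformation of a linear SDF oscillator (unit mass) under a
    ground-acceleration record, via the explicit central-difference method."""
    wn = 2.0 * math.pi / Tn
    k, c = wn * wn, 2.0 * zeta * wn
    a1 = 1.0 / dt ** 2 + c / (2.0 * dt)
    u_prev = u = peak = 0.0
    for ag in accel:
        u_next = (-ag - (k - 2.0 / dt ** 2) * u
                  - (1.0 / dt ** 2 - c / (2.0 * dt)) * u_prev) / a1
        u_prev, u = u, u_next
        peak = max(peak, abs(u))
    return peak

def elastic_scale_factor(accel, dt, Tn, target_deformation):
    """Elastic response scales linearly with the record, so the scale
    factor is simply target / unscaled peak."""
    return target_deformation / peak_sdf_deformation(accel, dt, Tn)

# Synthetic 2 Hz acceleration record, scaled for a 0.05 m target at Tn = 1 s.
dt = 0.01
record = [math.sin(2.0 * math.pi * 2.0 * i * dt) for i in range(200)]
s = elastic_scale_factor(record, dt, Tn=1.0, target_deformation=0.05)
print(f"scale factor = {s:.2f}")
```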

Kalkan, Erol; Chopra, Anil K.

2010-01-01

246

A practical approach to fire hazard analysis for offshore structures.  

PubMed

Offshore quantitative risk assessments (QRA) have historically been complex and costly. For large offshore design projects, the level of detail required for a QRA is often not available until well into the detailed design phase of the project. In these cases, the QRA may be unable to provide timely hazard understanding. As a result, the risk reduction measures identified often come too late to allow for cost effective changes to be implemented. This forces project management to make a number of difficult or costly decisions. This paper demonstrates how a scenario-based approach to fire risk assessment can be effectively applied early in a project's development. The scenario or design basis fire approach calculates the consequence of a select number of credible fire scenarios, determines the potential impact on the platform process equipment, structural members, egress routes, and safety systems, and determines the effectiveness of potential options for mitigation. The early provision of hazard data allows the project team to select an optimum design that is safe and will meet corporate or regulatory risk criteria later in the project cycle. The focus of this paper is on the application of the scenario-based approach to gas jet fires. This paper draws on recent experience in the Gulf of Mexico (GOM) and other areas to outline an approach to fire hazard analysis and fire hazard management for deep-water structures. The methods presented will include discussions from the recent June 2002 International Workshop for Fire Loading and Response. PMID:14602403

Krueger, Joel; Smith, Duncan

2003-11-14

247

Analysis of the random decrement method  

NASA Technical Reports Server (NTRS)

Random decrement signatures have been introduced for use in the determination of the frequencies of oscillation, damping ratios and modes of vibration of structural systems from naturally induced vibration data. This paper gives an analysis of the random decrement method and the conditions which must be satisfied in order for the method to yield consistent estimates.
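The basic computation is a triggered average: segments of the measured response that begin wherever it up-crosses a chosen trigger level are averaged, and that average approximates the structure's free decay. A minimal sketch (level up-crossing trigger only, one of several triggering conditions used in practice):

```python
import numpy as np

def random_decrement(x, trigger, seg_len):
    """Random decrement signature: the average of all response segments
    that begin where x up-crosses the trigger level."""
    segs = [x[i:i + seg_len]
            for i in range(1, len(x) - seg_len)
            if x[i - 1] < trigger <= x[i]]
    if not segs:
        raise ValueError("no trigger-level crossings found")
    return np.mean(segs, axis=0)
```

Frequencies and damping ratios are then fit to the signature as if it were a measured free-decay (impulse-response) record; the consistency conditions discussed in the paper govern when this identification is valid.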

Huan, S.-L.; Mcinnis, B. C.; Denman, E. D.

1983-01-01

248

Causal Moderation Analysis Using Propensity Score Methods  

ERIC Educational Resources Information Center

This paper builds on previous studies applying propensity score methods to multiple treatment variables in order to examine causal moderator effects. The propensity score methods are demonstrated in a case study in which the moderators are categorical and continuous variables. Moderation analysis is an…

Dong, Nianbo

2012-01-01

249

Method and Apparatus for Implementing XANES Analysis.  

National Technical Information Service (NTIS)

Compact, low-power-consuming systems and methods for exposing samples to high-energy radiation, for example, for exposing samples to x-rays for implementing x-ray absorption near edge analysis (XANES). The systems and methods include a low-power-consuming...

W. Gibson; Z. Chen

2005-01-01

250

BUILDINGS WITH LOCAL ISOLATION SYSTEM: PERFORMANCE AND SIMPLIFIED METHOD OF DYNAMIC ANALYSIS  

Microsoft Academic Search

Mass isolation is a method of structural vibration control against environmental loads such as strong earthquakes. Buildings with local isolation systems are a practical method of mass isolation. These buildings can be non-proportionally damped systems. Available methods for dynamic analysis of these buildings are complex and time-consuming. In this paper, the concept and efficiency of these buildings at seismic response

H. Pourmohammad; M. Ghafory-Ashtiany; M. Ziyaeifar

251

Computer methods for investigating statistical regularities in problems of statistical data analysis and reliability  

Microsoft Academic Search

The practice of using statistical analysis methods in applications is full of various problems whose statements are not described within the framework of classical assumptions. A wide range of statistical methods is based on the assumption of measurement error normality. Under real conditions, normality and often some other assumptions are not satisfied. The use of classical methods of mathematical statistics

Boris Yu. Lemeshko; Stanislav B. Lemeshko; Ekaterina V. Chimitova; Sergey N. Postovalov

252

Practice characteristics and prior authorization costs: secondary analysis of data collected by SALT-Net in 9 central New York primary care practices  

PubMed Central

Background An increase in prior authorization (PA) requirements from health insurance companies is placing administrative and financial burdens on primary care offices across the United States. As time allocation for these cases continues to grow, physicians are concerned with additional workload and inefficiency in the workplace. The objective is to estimate the effects of practice characteristics on time spent per prior authorization request in primary care practices. Methods Secondary analysis was performed using data on nine primary care practices in Central New York. Practice characteristics and demographics were collected at the onset of the study. In addition, participants were instructed to complete an "event form" (EF) to document each prior authorization event during a 4–6 week period; prior authorizations included requests for medication as well as other health care services. Stepwise Ordinary Least Squares (OLS) Regression was used to model Time in Minutes of each event as an outcome of various factors. Results Prior authorization events (N = 435) took roughly 20 minutes to complete (beta = 20.017, p

2014-01-01

253

Empirical Analysis of Green Supply Chain Management Practices in Indian Automobile Industry  

NASA Astrophysics Data System (ADS)

Environmental sustainability and green environmental issues have an increasing popularity among researchers and supply chain practitioners. An attempt has been made to identify and empirically analyze green supply chain management (GSCM) practices in Indian automobile industry. Six main GSCM practices (having 37 sub practices) and four expected performance outcomes (having 16 performances) have been identified by implementing GSCM practices from literature review. Questionnaire based survey has been made to validate these practices and performance outcomes. 123 complete questionnaires were collected from Indian automobile organizations and used for empirical analysis of GSCM practices in Indian automobile industry. Descriptive statistics have been used to know current implementation status of GSCM practices in Indian automobile industry and multiple regression analysis has been carried out to know the impact on expected organizational performance outcomes by current GSCM practices adopted by Indian automobile industry. The results of study suggested that environmental, economic, social and operational performances improve with the implementation of GSCM practices. This paper may play an important role to understand various GSCM implementation issues and help practicing managers to improve their performances in the supply chain.

Luthra, S.; Garg, D.; Haleem, A.

2014-04-01

254

Hybrid least squares multivariate spectral analysis methods  

DOEpatents

A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
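The augmentation idea can be illustrated with a toy two-component spectral model. This sketch is hypothetical and much simplified — it appends an un-calibrated "spectral shape" (here a synthetic baseline drift) to a classical least squares basis and re-estimates, omitting the patent's inverse multivariate second stage:

```python
import numpy as np

chan = np.arange(50)
# Known pure-component spectra (two Gaussian bands) used in calibration:
K = np.stack([np.exp(-0.5 * ((chan - 15) / 4.0) ** 2),
              np.exp(-0.5 * ((chan - 32) / 5.0) ** 2)])
drift = np.linspace(0.0, 1.0, 50)   # shape absent from the calibration step

c_true = np.array([0.7, 1.3])
y = c_true @ K + 0.4 * drift        # measured spectrum with baseline drift

# Plain classical least squares: biased, because drift is not modeled.
c_cls = np.linalg.lstsq(K.T, y, rcond=None)[0]

# Hybrid step: append the drift shape to the basis and re-estimate.
c_hyb = np.linalg.lstsq(np.vstack([K, drift]).T, y, rcond=None)[0]
```

With the drift shape included, the original component amounts are recovered without bias; as the abstract notes, the added shape can equally represent temperature drift, inter-spectrometer shifts, or other variation sources.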

Haaland, David M. (Albuquerque, NM)

2002-01-01

255

Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues  

SciTech Connect

This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

Ronald Laurids Boring

2010-11-01

256

Pharmacoeconomic analysis of prescriptions in Italian pediatric general practice  

Microsoft Academic Search

Most drugs used by children are prescribed by general pediatric practitioners (GPPs) in ambulatory settings. Prescription profiles are affected by GPPs' attitudes while the cost is related to the reimbursement modality. This study evaluated the Italian National Health Service (INHS) and family expenditures associated with prescribing practices to children younger than 12 years. Forty-two GPPs from southern Italy participated in

R. Campi; L. Garattini; F. Tediosi; M. Bonati

2002-01-01

257

Escape analysis for Java™: Theory and practice  

Microsoft Academic Search

Escape analysis is a static analysis that determines whether the lifetime of data may exceed its static scope.This paper first presents the design and correctness proof of an escape analysis for Java. This analysis is interprocedural, context sensitive, and as flow-sensitive as the static single assignment form. So, assignments to object fields are analyzed in a flow-insensitive manner. Since Java

Bruno Blanchet

2003-01-01

258

Protein-protein interactions: methods for detection and analysis.  

PubMed Central

The function and activity of a protein are often modulated by other proteins with which it interacts. This review is intended as a practical guide to the analysis of such protein-protein interactions. We discuss biochemical methods such as protein affinity chromatography, affinity blotting, coimmunoprecipitation, and cross-linking; molecular biological methods such as protein probing, the two-hybrid system, and phage display; and genetic methods such as the isolation of extragenic suppressors, synthetic mutants, and unlinked noncomplementing mutants. We next describe how binding affinities can be evaluated by techniques including protein affinity chromatography, sedimentation, gel filtration, fluorescence methods, solid-phase sampling of equilibrium solutions, and surface plasmon resonance. Finally, three examples of well-characterized domains involved in multiple protein-protein interactions are examined. The emphasis of the discussion is on variations in the approaches, concerns in evaluating the results, and advantages and disadvantages of the techniques.

Phizicky, E M; Fields, S

1995-01-01

259

Problem-based learning and e-learning methods in clinical practice.  

PubMed

The purpose of this study is to describe the curriculum development and introduction of problem-based learning pedagogy (PBL) in the undergraduate nursing education at Savonia University of Applied Sciences in Iisalmi unit. The main points to be described are the integration of PBL and e-learning methods and nursing students' learning outcomes in clinical practice. PBL pedagogy develops information literacy skills, critical thinking and evidence-based nursing skills, communication, co-operation and team working skills, problem solving and self-assessment skills. Integration of PBL and e-learning methods in clinical practice has developed nursing competencies, reflection of learning and peer support of nursing students. The most important results are peer support, feedback and a teacher's encouragement when using e-learning methods. PMID:19592907

Jauhiainen, Annikki; Pulkkinen, Raija

2009-01-01

260

Searching Usenet for Virtual Communities of Practice: Using Mixed Methods to Identify the Constructs of Wenger's Theory  

ERIC Educational Resources Information Center

Introduction: This research set out to determine whether communities of practice can be entirely Internet-based by formally applying Wenger's theoretical framework to Internet collectives. Method: A model of a virtual community of practice was developed which included the constructs Wenger identified in co-located communities of practice: mutual…

Murillo, Enrique

2008-01-01

261

Comparison and Cost Analysis of Drinking Water Quality Monitoring Requirements versus Practice in Seven Developing Countries  

PubMed Central

Drinking water quality monitoring programs aim to support provision of safe drinking water by informing water quality management. Little evidence or guidance exists on best monitoring practices for low resource settings. Lack of financial, human, and technological resources reduces a country's ability to monitor water supply. Monitoring activities were characterized in Cambodia, Colombia, India (three states), Jordan, Peru, South Africa, and Uganda according to water sector responsibilities, monitoring approaches, and marginal cost. The seven study countries were selected to represent a range of low resource settings. The focus was on monitoring of microbiological parameters, such as E. coli, coliforms, and H2S-producing microorganisms. Data collection involved qualitative and quantitative methods. Across the seven study countries, few distinct approaches to monitoring were observed, and in all but one country all monitoring relied on fixed laboratories for sample analysis. Compliance with monitoring requirements was highest for operational monitoring of large water supplies in urban areas. Sample transport and labor for sample collection and analysis together constitute approximately 75% of marginal costs, which exclude capital costs. There is potential for substantive optimization of monitoring programs by considering field-based testing and by fundamentally reconsidering monitoring approaches for non-piped supplies. This is the first study to look quantitatively at water quality monitoring practices in multiple developing countries.

Crocker, Jonny; Bartram, Jamie

2014-01-01

262

Practical recommendations for statistical analysis and data presentation in Biochemia Medica journal  

PubMed Central

The aim of this article is to highlight practical recommendations based on our experience as reviewers and journal editors and refer to some of the most common mistakes in manuscripts submitted to Biochemia Medica. One of the most important parts of the article is the Abstract. Authors quite often forget that the Abstract is sometimes the first (and only) part of the article read by the readers. The Abstract must therefore be comprehensive and provide the key results of the work. Another problematic part of the article, also often neglected by authors, is the Statistical analysis subheading within Materials and methods, where authors must explain which statistical tests were used in their data analysis and the rationale for using those tests. They also need to make sure that all tests used are listed under the Statistical analysis section, and that all tests listed are indeed used in the study. When writing the Results section there are several key points to keep in mind, such as: are results presented with adequate precision and accuracy; is descriptive analysis appropriate; is a measure of confidence provided for all estimates; if necessary and applicable, are correct statistical tests used for analysis; is a P value provided for all tests, etc. It is especially important not to draw any conclusions about causal relationships unless the study is an experiment or clinical trial. We believe that the use of the proposed checklist might increase the quality of the submitted work and speed up the peer-review and publication process for published articles.

Simundic, Ana-Maria

2012-01-01

263

Intravaginal Practices, Bacterial Vaginosis, and HIV Infection in Women: Individual Participant Data Meta-analysis  

PubMed Central

Background Identifying modifiable factors that increase women's vulnerability to HIV is a critical step in developing effective female-initiated prevention interventions. The primary objective of this study was to pool individual participant data from prospective longitudinal studies to investigate the association between intravaginal practices and acquisition of HIV infection among women in sub-Saharan Africa. Secondary objectives were to investigate associations between intravaginal practices and disrupted vaginal flora; and between disrupted vaginal flora and HIV acquisition. Methods and Findings We conducted a meta-analysis of individual participant data from 13 prospective cohort studies involving 14,874 women, of whom 791 acquired HIV infection during 21,218 woman-years of follow-up. Data were pooled using random-effects meta-analysis. The level of between-study heterogeneity was low in all analyses (I² values 0.0%–16.1%). Intravaginal use of cloth or paper (pooled adjusted hazard ratio [aHR] 1.47, 95% confidence interval [CI] 1.18–1.83), insertion of products to dry or tighten the vagina (aHR 1.31, 95% CI 1.00–1.71), and intravaginal cleaning with soap (aHR 1.24, 95% CI 1.01–1.53) remained associated with HIV acquisition after controlling for age, marital status, and number of sex partners in the past 3 months. Intravaginal cleaning with soap was also associated with the development of intermediate vaginal flora and bacterial vaginosis in women with normal vaginal flora at baseline (pooled adjusted odds ratio [OR] 1.24, 95% CI 1.04–1.47). Use of cloth or paper was not associated with the development of disrupted vaginal flora. Intermediate vaginal flora and bacterial vaginosis were each associated with HIV acquisition in multivariable models when measured at baseline (aHR 1.54 and 1.69, p<0.001) or at the visit before the estimated date of HIV infection (aHR 1.41 and 1.53, p<0.001), respectively. 
Conclusions This study provides evidence to suggest that some intravaginal practices increase the risk of HIV acquisition but a direct causal pathway linking intravaginal cleaning with soap, disruption of vaginal flora, and HIV acquisition has not yet been demonstrated. More consistency in the definition and measurement of specific intravaginal practices is warranted so that the effects of specific intravaginal practices and products can be further elucidated. Please see later in the article for the Editors' Summary
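The random-effects pooling and I² heterogeneity statistic reported above are conventionally computed with the DerSimonian-Laird estimator; a generic sketch follows. The inputs are hypothetical log hazard ratios and standard errors for illustration, not data from this study:

```python
import math

def dl_pool(effects, ses):
    """DerSimonian-Laird random-effects pooling of log effect sizes.
    Returns (pooled log effect, its SE, tau^2, I^2 in percent)."""
    w = [1.0 / se ** 2 for se in ses]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    k = len(effects)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                      # between-study variance
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0  # I^2 (%)
    w_star = [1.0 / (se ** 2 + tau2) for se in ses]         # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se_pooled = 1.0 / math.sqrt(sum(w_star))
    return pooled, se_pooled, tau2, i2
```

For reporting, exponentiate: the pooled aHR is exp(pooled), with a 95% CI of exp(pooled ± 1.96 × SE).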

Low, Nicola; Chersich, Matthew F.; Schmidlin, Kurt; Egger, Matthias; Francis, Suzanna C.; H. H. M. van de Wijgert, Janneke; Hayes, Richard J.; Baeten, Jared M.; Brown, Joelle; Delany-Moretlwe, Sinead; Kaul, Rupert; McGrath, Nuala; Morrison, Charles; Myer, Landon; Temmerman, Marleen; van der Straten, Ariane; Watson-Jones, Deborah; Zwahlen, Marcel; Martin Hilber, Adriane

2011-01-01

264

Method and apparatus for ceramic analysis  

DOEpatents

The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method analyzes the density of a ceramic by exciting a component on the surface or subsurface of the ceramic through exposure to excitation energy. The method may further include the step of obtaining a measurement of an emitted energy from the component. The method may additionally include comparing the measurement of the emitted energy from the component with a predetermined reference measurement so as to obtain a density for said ceramic.

Jankowiak, Ryszard J. (Ames, IA); Schilling, Chris (Ames, IA); Small, Gerald J. (Ames, IA); Tomasik, Piotr (Cracow, PL)

2003-04-01

265

Drosophila hematopoiesis: Markers and methods for molecular genetic analysis.  

PubMed

Analyses of the Drosophila hematopoietic system are becoming more and more prevalent as developmental and functional parallels with vertebrate blood cells become more evident. Investigative work on the fly blood system has, out of necessity, led to the identification of new molecular markers for blood cell types and lineages and to the refinement of useful molecular genetic tools and analytical methods. This review briefly describes the Drosophila hematopoietic system at different developmental stages, summarizes the major useful cell markers and tools for each stage, and provides basic protocols for practical analysis of circulating blood cells and of the lymph gland, the larval hematopoietic organ. PMID:24613936

Evans, Cory J; Liu, Ting; Banerjee, Utpal

2014-06-15

266

Simplified Analysis Methods for Primary Load Designs at Elevated Temperatures  

SciTech Connect

The use of simplified (reference stress) analysis methods is discussed and illustrated for primary load high temperature design. Elastic methods are the basis of the ASME Section III, Subsection NH primary load design procedure. There are practical drawbacks with this approach, particularly for complex geometries and temperature gradients. The paper describes an approach which addresses these difficulties through the use of temperature-dependent elastic-perfectly plastic analysis. Correction factors are defined to address difficulties traditionally associated with discontinuity stresses, inelastic strain concentrations and multiaxiality. A procedure is identified to provide insight into how this approach could be implemented but clearly there is additional work to be done to define and clarify the procedural steps to bring it to the point where it could be adapted into code language.

Carter, Peter [Stress Engineering Services Inc.; Jetter, Robert I [Consultant; Sham, Sam [ORNL

2011-01-01

267

A practical method to determine the heating and cooling curves of x-ray tube assemblies  

SciTech Connect

A practical method to determine the heating and cooling curves of x-ray tube assemblies with rotating anode x-ray tubes is proposed. Available procedures to obtain these curves as described in the literature are performed during operation of the equipment, and the precision of the method depends on the knowledge of the total energy applied in the system. In the present work we describe procedures which use a calorimetric system and do not require the operation of the x-ray equipment. The method was applied successfully to an x-ray tube assembly that was under test in our laboratory.
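A common reduction of calorimetric cooling data is a Newtonian-cooling fit. The sketch below is a hypothetical single-exponential model (not the procedure prescribed in this paper or in the relevant standards), recovering the temperature excess and time constant by log-linear least squares:

```python
import math

def fit_cooling(times, temps, t_env):
    """Fit T(t) = t_env + dT * exp(-t / tau) by linear least squares
    on log(T - t_env); returns (dT, tau)."""
    ys = [math.log(T - t_env) for T in temps]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in times)
    sxy = sum((x - mx) * (y - my) for x, y in zip(times, ys))
    slope = sxy / sxx                 # slope of log(T - t_env) vs. t
    return math.exp(my - slope * mx), -1.0 / slope
```

The heating curve can be treated symmetrically by fitting the approach to the equilibrium temperature instead of the decay toward ambient.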

Bottaro, M.; Moralles, M.; Viana, V.; Donatiello, G. L.; Silva, E. P. [Instituto de Eletrotecnica e Energia da Universidade de Sao Paulo, Av. Prof. Luciano Gualberto, 1289, CEP 05508-010, Sao Paulo, SP (Brazil); Instituto de Pesquisas Energeticas e Nucleares, Av. Prof. Lineu Prestes, 2.242, CEP 05508-000 Sao Paulo, SP (Brazil); Instituto de Eletrotecnica e Energia da Universidade de Sao Paulo, Av. Prof. Luciano Gualberto, 1289, CEP 05508-010, Sao Paulo, SP (Brazil)

2007-10-15

268

Grid Analysis and Display System (GrADS): A practical tool for earth science visualization  

NASA Technical Reports Server (NTRS)

Viewgraphs on grid analysis and display system (GrADS): a practical tool for earth science visualization are presented. Topics covered include: GrADS design goals; data sets; and temperature profiles.

Kinter, James L., III; Doty, Brian E.

1991-01-01

269

An Improved g-Analysis Method for Process Security Analysis  

Microsoft Academic Search

Process security is a newly pronounced issue facing the chemical process industry in the post-11 September era. Traditional safety is no longer sufficient for a chemical plant; it must also be secure. However, systematic and effective quantitative methodologies for process security analysis are necessary. To address this issue, the g-analysis method was introduced very recently by Uygun et al.

K. Uygun; Y. Huang; H. H. Lou

2006-01-01

270

Relationship between the homotopy analysis method and harmonic balance method  

NASA Astrophysics Data System (ADS)

This paper presents a study of the relationship between the homotopy analysis method (HAM) and harmonic balance (HB) method. The HAM is employed to obtain periodic solutions of conservative oscillators and limit cycles of self-excited systems, respectively. Different from the usual procedures in the existing literature, the HAM is modified by retaining a given number of harmonics in higher-order approximations. It is proved that as long as the solution given by the modified HAM is convergent, it converges to one HB solution. The Duffing equation, the van der Pol equation and the flutter equation of a two-dimensional airfoil are taken as illustrations to validate the attained results.
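For the Duffing equation mentioned above, a single-harmonic balance (assuming the normalization u'' + u + εu³ = 0 and solution u ≈ A cos ωt) gives ω² ≈ 1 + (3/4)εA², and the claim that the modified HAM converges to an HB solution can be spot-checked numerically. The sketch below compares the one-term HB period against an RK4 reference integration:

```python
import math

def duffing_period(a0, eps, dt=1e-4):
    """Period of u'' + u + eps*u**3 = 0 from u(0)=a0, u'(0)=0 (RK4);
    detected as the first return of u' from positive to negative with u > 0."""
    def f(u, v):
        return v, -(u + eps * u ** 3)
    u, v, t = a0, 0.0, 0.0
    prev_v = 0.0
    while t < 100.0:
        k1u, k1v = f(u, v)
        k2u, k2v = f(u + 0.5 * dt * k1u, v + 0.5 * dt * k1v)
        k3u, k3v = f(u + 0.5 * dt * k2u, v + 0.5 * dt * k2v)
        k4u, k4v = f(u + dt * k3u, v + dt * k3v)
        u += dt * (k1u + 2 * k2u + 2 * k3u + k4u) / 6.0
        v += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
        t += dt
        if prev_v > 0.0 and v <= 0.0 and u > 0.0:
            return t
        prev_v = v
    raise RuntimeError("no full period found")

def hb_period(a0, eps):
    """One-term harmonic balance estimate: omega^2 = 1 + 0.75*eps*a0**2."""
    return 2.0 * math.pi / math.sqrt(1.0 + 0.75 * eps * a0 ** 2)
```

Retaining more harmonics in the balance (as the modified HAM effectively does) shrinks the small residual error of the one-term estimate.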

Chen, Y. M.; Liu, J. K.; Meng, G.

2010-08-01

271

Practice of Physical Activity among Future Doctors: A Cross Sectional Analysis  

PubMed Central

Background: Non-communicable diseases (NCD) will account for 73% of deaths and 60% of the global disease burden by 2020. Physical activity plays a major role in the prevention of these non-communicable diseases. The stress involved in meeting the responsibilities of becoming a physician may adversely affect the exercise habits of students. So, the current study aimed to study the practice of physical activity among undergraduate medical students. Methods: A cross sectional study was conducted among 240 undergraduate medical students. Quota sampling was used to identify 60 students from each of the four even semesters. A pre-tested, semi-structured questionnaire was used to collect the data. Statistical Package for Social Sciences (SPSS) version 16 was used for data entry and analysis and results are expressed as percentages and proportions. Results: In our study, 55% were 20 to 22 years old. Over half of the students were utilizing the sports facilities provided by the university on the campus. The majority of students, 165 (69%), had normal body mass index (BMI), 51 (21%) were overweight, while 7 (3%) were obese. Among the 62% of students who were currently exercising, physical activity was more common among boys than girls (62% vs. 38%). Lack of time 46 (60.5%), laziness (61.8%), and exhaustion from academic activities (42%) were identified as important hindering factors among medical students who did not exercise. Conclusion: A longitudinal study to follow up student behavior throughout their academic life is needed to identify the factors promoting the practice of physical activity among students.

Rao, Chythra R; Darshan, BB; Das, Nairita; Rajan, Vinaya; Bhogun, Meemansha; Gupta, Aditya

2012-01-01

272

Breastfeeding practices in a public health field practice area in Sri Lanka: a survival analysis  

Microsoft Academic Search

BACKGROUND: Exclusive breastfeeding up to the completion of the sixth month of age is the national infant feeding recommendation for Sri Lanka. The objective of the present study was to collect data on exclusive breastfeeding up to six months and to describe the association between exclusive breastfeeding and selected socio-demographic factors. METHODS: A clinic based cross-sectional study was conducted in

Suneth B Agampodi; Thilini C Agampodi; Udage Kankanamge D Piyaseeli

2007-01-01

273

78 FR 24691 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...  

Federal Register 2010, 2011, 2012, 2013

...Manufacturing Practice and Hazard Analysis and Risk-Based Preventive...Manufacturing Practice and Hazard Analysis and Risk-Based Preventive...sufficient time to develop a meaningful or thoughtful response to...Manufacturing Practice and Hazard Analysis and Risk-Based...

2013-04-26

274

Spectroscopic chemical analysis methods and apparatus  

NASA Technical Reports Server (NTRS)

Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

Hug, William F. (Inventor); Reid, Ray D. (Inventor)

2010-01-01

275

Spectroscopic chemical analysis methods and apparatus  

NASA Technical Reports Server (NTRS)

Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

Hug, William F. (Inventor); Reid, Ray D. (Inventor)

2009-01-01

276

Practice differences between male and female oral and maxillofacial surgeons: Survey results and analysis  

Microsoft Academic Search

Purpose: This study describes the personal and practice characteristics of oral and maxillofacial surgeons, with an emphasis on gender differences. Potential explanations for differences found are offered.Materials and Methods: A 39-item questionnaire was designed to address areas of suspected differences between male and female oral and maxillofacial surgeons. It included items regarding training, certification, practice type and location, time spent

Amy J Bogardus; Barbara R Neas; Steven M Sullivan

1999-01-01

277

The evolution of nursing in Australian general practice: a comparative analysis of workforce surveys ten years on  

PubMed Central

Background Nursing in Australian general practice has grown rapidly over the last decade in response to government initiatives to strengthen primary care. There are limited data about how this expansion has impacted on the nursing role, scope of practice and workforce characteristics. This study aimed to describe the current demographic and employment characteristics of Australian nurses working in general practice and explore trends in their role over time. Methods In the nascence of the expansion of the role of nurses in Australian general practice (2003–2004) a national survey was undertaken to describe nurse demographics, clinical roles and competencies. This survey was repeated in 2009–2010 and comparative analysis of the datasets undertaken to explore workforce changes over time. Results Two hundred eighty four nurses employed in general practice completed the first survey (2003/04) and 235 completed the second survey (2009/10). Significantly more participants in Study 2 were undertaking follow-up of pathology results, physical assessment and disease specific health education. There was also a statistically significant increase in the participants who felt that further education/training would augment their confidence in all clinical tasks. While barriers to working in general practice decreased between the two time points, more participants perceived lack of space, job descriptions, confidence to negotiate with general practitioners and personal desire to enhance their role as barriers. Access to education and training as a facilitator to nursing role expansion increased between the two studies. The level of optimism of participants for the future of the nurses' role in general practice was slightly decreased over time. Conclusions This study has identified that some of the structural barriers to nursing in Australian general practice have been addressed over time. However, it also identifies continuing barriers that impact practice nurse role development. 
Understanding and addressing these issues is vital to optimise the effectiveness of the primary care nursing workforce.

2014-01-01

278

Results from three years of the world's largest interlaboratory comparison for total mercury and methylmercury: Method performance and best practices  

NASA Astrophysics Data System (ADS)

Brooks Rand Instruments has conducted the world's largest interlaboratory comparison study for total mercury and methylmercury in natural waters annually for three years. Each year, roughly 50 laboratories registered to participate and the majority of participants submitted results. Each laboratory was assigned a performance score based on the distance between its results and the consensus mean, as well as the precision of its replicate analyses. Participants were also asked to provide detailed data on their analytical methodology and equipment. We used the methodology data and performance scores to assess the performance of the various methods reported and equipment used. Although the majority of methods in use show no systematic trend toward poor analytical performance, there are noteworthy exceptions. We present results from each of the three years of the interlaboratory comparison exercise, as well as aggregated method performance data. We compare the methods used in this study to methods from other published interlaboratory comparison studies and present a list of recommended best practices. Our goals in creating a list of best practices are to maximize participation, ensure inclusiveness, minimize non-response bias, guarantee high data quality, and promote transparency of analysis. We seek to create a standardized methodology for interlaboratory comparison exercises for total mercury and methylmercury analysis in water, which will lead to more directly comparable results between studies. We show that in most cases, the coefficient of variation between labs measuring replicates of the same sample is greater than 20% after the removal of outlying data points (e.g. Figure 1). It is difficult to make comparisons between studies and ecosystems with such a high variability between labs. We highlight the need for regular participation in interlaboratory comparison studies and continuous analytical method improvement in order to ensure accurate data.
Figure 1: Results from one sample analyzed in the 2013 Interlaboratory Comparison Study.
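The between-lab variability figure quoted in the abstract can be illustrated with a short sketch. The MAD-based outlier rule and the sample values below are assumptions for illustration, not the study's actual screening criterion or data:

```python
import statistics

def interlab_cv(results, z_cut=2.0):
    """CV (%) across labs after dropping robust-z outliers.
    The outlier rule (robust z via MAD) is an illustrative choice."""
    med = statistics.median(results)
    mad = statistics.median(abs(x - med) for x in results) or 1e-12
    kept = [x for x in results if abs(x - med) / (1.4826 * mad) <= z_cut]
    return 100.0 * statistics.stdev(kept) / statistics.fmean(kept)

# Ten hypothetical total-mercury results (ng/L) for one sample;
# one lab reports a clear outlier.
labs = [4.8, 5.1, 5.0, 4.6, 5.3, 4.9, 12.0, 5.2, 4.7, 5.0]
cv = interlab_cv(labs)
```

A CV above 20% even after such screening, as the study reports, indicates that the labs' agreement, not the outlier rule, is the limiting factor.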

Creswell, J. E.; Engel, V.; Carter, A.; Davies, C.

2013-12-01

279

Advanced pneumatic method for gradient flow analysis  

NASA Astrophysics Data System (ADS)

Pneumatic 5-hole probes are widely known reliable sensors for the analysis of three-dimensional flow fields. Since the accuracy of such measurements depends strongly on the volume of the probe and the gradients in the flow, a miniature spherical five-hole probe with an improved analysis method was developed. With the new method, the complete physically reasonable angle measurement range can now be used by introducing modified calibration functions. A dimensionless examination of the flow around spheres shows the independence of the calibration functions within a wide range of flow velocities. Misrepresentations in flows with high gradients caused by the volume of the probe are estimated by a geometry-based correction method. The quality of the method is analyzed by an extensive error calculation. Results of measurements in a three-dimensional model combustor are discussed.

Glahn, A.; Hallmann, M.; Jeckel, R.; Wittig, S.

1993-08-01

280

Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2  

NASA Technical Reports Server (NTRS)

The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.

Johnson, Kenneth L.; White, K. Preston, Jr.

2012-01-01

281

Sums and densities of fully coupled anharmonic vibrational states: a comparison of three practical methods.  

PubMed

Three practical methods for computing sums and densities of states of fully coupled anharmonic vibrations are compared. All three methods are based on the standard perturbation theory expansion for the vibrational energy. The accuracy of the perturbation theory expansion is tested by comparisons with computed eigenvalues and/or experimental vibrational constants taken from the literature for three- and four-atom molecules. For a number of examples, it is shown that the X(ij) terms in the perturbation theory expansion account for most of the anharmonicity, and the Y(ijk) terms also make a small contribution; contributions from the Z(ijkl) terms are insignificant. For molecules containing up to approximately 4 atoms, the sums and densities of states can be computed by using nested DO-loops, but this method becomes impractical for larger species. An efficient Monte Carlo method published previously is both accurate and practical for molecules containing 3-6 atoms but becomes too slow for larger species. The Wang-Landau algorithm is shown to be practical and reasonably accurate for molecules containing approximately 4 or more atoms, where the practical size limit (with a single computer processor) is currently on the order of perhaps 50 atoms. It is shown that the errors depend mostly on the average number of stochastic samples per energy bin. An automated version of the Wang-Landau algorithm is described. Also described are the effects of Fermi resonances and procedures for deperturbation of the anharmonicity coefficients. Computer codes based on all three algorithms are available from the authors and can also be downloaded freely from the Internet (http://aoss.engin.umich.edu/multiwell/). PMID:20170143
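The nested DO-loop counting that the abstract describes as practical for small molecules can be sketched directly from the second-order perturbation expansion E(n) = Σᵢ ωᵢ(nᵢ+½) + Σᵢ≤ⱼ Xᵢⱼ(nᵢ+½)(nⱼ+½). The frequencies and anharmonicity constants below are illustrative (loosely water-like), not values from the paper:

```python
# Direct count of anharmonic vibrational states for a hypothetical
# 3-mode molecule; energies in cm^-1, measured from the zero-point level.
w = [3650.0, 1600.0, 3750.0]                       # harmonic frequencies
X = {(0, 0): -40.0, (1, 1): -15.0, (2, 2): -45.0,  # anharmonic constants
     (0, 1): -20.0, (0, 2): -160.0, (1, 2): -20.0}

def energy(n):
    """Perturbation-expansion vibrational energy for quanta n."""
    e = sum(w[i] * (n[i] + 0.5) for i in range(3))
    for (i, j), x in X.items():
        e += x * (n[i] + 0.5) * (n[j] + 0.5)
    return e

def sum_of_states(emax):
    """Number of states with E - E_zpe <= emax, by nested-loop direct count."""
    zpe = energy((0, 0, 0))
    count = 0
    for n0 in range(int(emax / w[0]) + 2):
        for n1 in range(int(emax / w[1]) + 2):
            for n2 in range(int(emax / w[2]) + 2):
                if 0.0 <= energy((n0, n1, n2)) - zpe <= emax:
                    count += 1
    return count
```

The triple loop makes the scaling problem obvious: one loop per mode, so the cost grows exponentially with atom count, which is why the paper turns to Monte Carlo and Wang-Landau sampling for larger species.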

Nguyen, Thanh Lam; Barker, John R

2010-03-18

282

Probabilistic structural analysis methods development for SSME  

NASA Technical Reports Server (NTRS)

The development of probabilistic structural analysis methods is a major part of the SSME Structural Durability Program and consists of three program elements: composite load spectra, probabilistic finite element structural analysis, and probabilistic structural analysis applications. Recent progress includes: (1) the effects of the uncertainties of several factors on the HPFP blade temperature pressure and torque, (2) the evaluation of the cumulative distribution function of structural response variables based on assumed uncertainties on primitive structural variables, and (3) evaluation of the failure probability. Collectively, the results obtained demonstrate that the structural durability of critical SSME components can be probabilistically evaluated.

Chamis, C. C.; Hopkins, D. A.

1988-01-01

283

Design analysis, robust methods, and stress classification  

SciTech Connect

This special edition publication volume is comprised of papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25--29, 1993 in Denver, Colorado. The papers were prepared for presentations in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

Bees, W.J. (ed.)

1993-01-01

284

Markov Chain Monte Carlo Linkage Analysis Methods  

Microsoft Academic Search

As alluded to in the chapter “Linkage Analysis of Qualitative Traits”, neither the Elston–Stewart algorithm nor the Lander–Green approach is amenable to genetic data from large complex pedigrees and a large number of markers. In such cases, Monte Carlo estimation methods provide a viable alternative to the exact solutions. Two types of Monte Carlo methods have been developed for linkage

Robert P. Igo; Yuqun Luo; Shili Lin

285

Iterative methods for design sensitivity analysis  

NASA Technical Reports Server (NTRS)

A numerical method is presented for design sensitivity analysis, using an iterative-method reanalysis of the structure generated by a small perturbation in the design variable; a forward-difference scheme is then employed to obtain the approximate sensitivity. Algorithms are developed for displacement and stress sensitivity, as well as for eigenvalue and eigenvector sensitivity, and the iterative schemes are modified so that the coefficient matrices are constant and therefore decomposed only once.
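The scheme can be sketched on a toy two-degree-of-freedom system. The matrices and the design dependence below are invented for illustration; the point is that the original factorization (here simply a stored inverse) is reused to re-solve the perturbed system iteratively before forward differencing:

```python
import numpy as np

def solve_with_old_factor(apply_K_inv, K_pert, f, u0, iters=50):
    """Re-solve K_pert @ u = f by stationary iteration, reusing the
    factorization of the unperturbed stiffness (decomposed only once)."""
    u = u0.copy()
    for _ in range(iters):
        u = u + apply_K_inv(f - K_pert @ u)
    return u

def K_of(x):
    # Toy 2-DOF stiffness matrix depending on design variable x.
    return np.array([[2.0 + x, -1.0], [-1.0, 2.0]])

f = np.array([1.0, 0.0])
x, dx = 1.0, 1e-4

K_inv = np.linalg.inv(K_of(x))        # "factor" once (stand-in for LU)
u = K_inv @ f                         # baseline displacements

u_pert = solve_with_old_factor(lambda r: K_inv @ r, K_of(x + dx), f, u)
du_dx = (u_pert - u) / dx             # forward-difference sensitivity
```

Because the perturbation is small, the iteration matrix is close to zero and the reanalysis converges in a few sweeps, which is the saving the abstract refers to.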

Belegundu, A. D.; Yoon, B. G.

1989-01-01

286

An analysis of the light changes of eclipsing variables in the frequency-domain - Practical aspects  

NASA Astrophysics Data System (ADS)

Practical aspects of the analysis of the light changes of eclipsing binary systems in the frequency domain are reviewed, and the advantages of this process over the time-domain approach are pointed out. A direct solution of the problem for the case of total eclipses is given which, in the frequency domain, can be completely algebraized, requiring no tables of any special functions. A generalization of this process to any type of eclipses is given, and ways are shown to deduce the uncertainty of the elements of the eclipses from that of the moments A(2m) of the eclipses. Methods to extend these techniques to eclipsing systems whose light changes are not limited to the times of minima alone are given, and physical processes which are likely to produce the light-curve asymmetries noted in many close binary systems are considered.

Kopal, Zdenek

287

Aging, Practice, and Perceptual Tasks: A Diffusion Model Analysis  

PubMed Central

Practice effects were examined in a masked letter discrimination task and a masked brightness discrimination task for college-age and 60- to 75-year-old subjects. The diffusion model (Ratcliff, 1978) was fit to the response time and accuracy data and used to extract estimates of components of processing from the data. Relative to young subjects, the older subjects began the experiments with slower and less accurate performance; however, across sessions their accuracy improved because the quality of the information on which their decisions were based improved, and this, along with reduced decision criteria, led to shorter response times. For the brightness, but not the letter, discrimination task, the older subjects' performance matched that of the younger group by the end of 4 sessions, except that their nondecision components of processing were slightly slower. These analyses illustrate how a well-specified model can provide a unified view of multiple aspects of data that are often interpreted separately.
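The diffusion model referred to above can be illustrated by simulating its random-walk form: noisy evidence accumulates between two boundaries, and boundary crossings jointly produce accuracy and response-time predictions. The parameter values below are arbitrary but in the conventional range; this is a didactic sketch, not the fitting procedure used in the study:

```python
import random

def ddm_trial(v=0.2, a=0.1, z=0.05, s=0.1, dt=0.0005, t_nd=0.3):
    """One simulated diffusion-model trial: drift rate v, boundary
    separation a, starting point z, diffusion coefficient s, and
    nondecision time t_nd. Returns (response time in s, correct?)."""
    x, t = z, 0.0
    while 0.0 < x < a:
        x += v * dt + s * (dt ** 0.5) * random.gauss(0.0, 1.0)
        t += dt
    return t + t_nd, x >= a

random.seed(1)
trials = [ddm_trial() for _ in range(1000)]
accuracy = sum(c for _, c in trials) / len(trials)
mean_rt = sum(rt for rt, _ in trials) / len(trials)
```

In a fit like the paper's, the mapping runs the other way: observed accuracy and RT distributions constrain drift rate (information quality), boundary separation (decision criteria), and nondecision time, which is how the authors separate those components across age groups.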

Ratcliff, Roger; Thapar, Anjali; McKoon, Gail

2008-01-01

288

Accessing the ethics of complex health care practices: would a "domains of ethics analysis" approach help?  

PubMed

This paper explores how using a "domains of ethics analysis" approach might constructively contribute to an enhanced understanding (among those without specialized ethics training) of ethically-complex health care practices through the consideration of one such sample practice, i.e., deep and continuous palliative sedation (DCPS). For this purpose, I select four sample ethics domains (from a variety of possible relevant domains) for use in the consideration of this practice, i.e., autonomous choice, motives, actions and consequences. These particular domains were chosen because of their relevance to the analysis of DCPS and their relative ease of access to those without ethics training. The analysis demonstrates that such an approach could facilitate the emergence of accessible arguments and discussion points that could enhance the understanding and appreciation of this and other health care practices with strong ethics dimensions. PMID:20505981

Kirby, Jeffrey

2010-06-01

289

Segmentation and analysis of console operation using self-organizing map with cluster growing method  

Microsoft Academic Search

For manipulation of remote mobile robots, adequate scheduling of tasks and selection of operational commands are required. This paper presents an analysis procedure to make the task switching profile visible by utilizing the Self-Organizing Map (SOM) and a new cluster growing method. For practical verification, an experiment system with radio-controlled construction equipment was built, and the proposed analysis procedure was applied

Satoshi Suzuki; Fumio Harashima

2009-01-01

290

Practical use of three-dimensional inverse method for compressor blade design  

SciTech Connect

The practical utility of a three-dimensional inverse viscous method is demonstrated by carrying out a design modification of a first-stage rotor in an industrial compressor. In this design modification study, the goal is to improve the efficiency of the original blade while retaining its overall aerodynamic, structural, and manufacturing characteristics. By employing a simple modification to the blade pressure loading distribution (which is the prescribed flow quantity in this inverse method), the modified blade geometry is predicted to perform better than the original design over a wide range of operating points, including an improvement in choke margin.

Damle, S.; Dang, T. [Syracuse Univ., NY (United States). Dept. of Mechanical, Aerospace and Mfg. Engineering; Stringham, J.; Razinsky, E. [Solar Turbines, Inc., San Diego, CA (United States)

1999-04-01

291

A Mathematical Analysis of the PML Method  

Microsoft Academic Search

A detailed mathematical analysis of the Berenger PML method for the electromagnetic equations is carried out on the PDE level, as well as for the semidiscrete and fully discrete formulations. It is shown that the split set of equations is not strongly well-posed and that under certain conditions its solutions may be inappropriate.

Saul Abarbanel; David Gottlieb

1997-01-01

292

Expression Data Analysis Systems and Methods.  

National Technical Information Service (NTIS)

Systems and methods for performing rapid genomic DNA analysis of samples, such as control samples and experimental samples. In one aspect, the system makes use of genomic DNA input, rather than gene expression input such as mRNA and/or cDNA associated wit...

D. Roopenian D. J. Shaffer K. D. Mills S. Akilesh

2006-01-01

293

Methods for Chemical Analysis of Fresh Waters.  

ERIC Educational Resources Information Center

This manual, one of a series prepared for the guidance of research workers conducting studies as part of the International Biological Programme, contains recommended methods for the analysis of fresh water. The techniques are grouped in the following major sections: Sample Taking and Storage; Conductivity, pH, Oxidation-Reduction Potential,…

Golterman, H. L.

294

A Method for Automating Dialect Analysis.  

ERIC Educational Resources Information Center

This paper proposes a method of handling limited problems in dialect research. In approaching the problem, it was necessary to devise a system for coding phonetic transcription which would take into account the variance in the diacritics of different field workers so that none of the material would be lost while permitting computer analysis. The…

Uskup, Frances Land

295

Integrated method for chaotic time series analysis  

DOEpatents

Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process monitor nonlinear data are disclosed. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.
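The claimed steps (digitize a signal, compute a nonlinear measure, track it, compare states) can be illustrated with one simple nonlinear measure. The correlation sum, the embedding parameters, and the logistic-map test signal below are illustrative stand-ins, not the specific measures in the patent:

```python
def correlation_sum(x, dim=3, delay=2, radius=0.2):
    """Fraction of pairs of time-delay-embedded points that lie within
    `radius` of each other (max-norm) -- a basic nonlinear measure."""
    pts = [tuple(x[i + k * delay] for k in range(dim))
           for i in range(len(x) - (dim - 1) * delay)]
    close = total = 0
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            total += 1
            if max(abs(a - b) for a, b in zip(pts[i], pts[j])) < radius:
                close += 1
    return close / total

def logistic(r, n, x0=0.4):
    """Digitized test signal: iterates of the logistic map."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

c_periodic = correlation_sum(logistic(3.2, 400))  # period-2 regime
c_chaotic = correlation_sum(logistic(3.9, 400))   # chaotic regime
```

Comparing the measure across windows of data, as the patent's final step describes, would flag the transition between these two regimes even though both signals occupy the same amplitude range.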

Hively, L.M.; Ng, E.G.

1998-09-29

296

Principles of Good Practice for Budget Impact Analysis: Report of the ISPOR Task Force on Good Research Practices—Budget Impact Analysis  

Microsoft Academic Search

Objectives: There is growing recognition that a comprehensive economic assessment of a new health-care intervention at the time of launch requires both a cost-effectiveness analysis (CEA) and a budget impact analysis (BIA). National regulatory agencies such as the National Institute for Health and Clinical Excellence in England and Wales and the Pharmaceutical Benefits Advisory Committee in Australia,

Josephine A. Mauskopf; Sean D. Sullivan; Lieven Annemans; Jaime Caro; C. Daniel Mullins; Mark Nuijten; Ewa Orlewska; John Watkins; Paul Trueman

2007-01-01

297

Multiple predictor smoothing methods for sensitivity analysis.  

SciTech Connect

The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
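The core idea, ranking inputs by how much output variance a nonparametric smooth explains, can be shown with a single-predictor LOESS-style fit. This is a minimal numpy sketch of that idea, not the stepwise multi-predictor procedure in the paper, and the test function is synthetic:

```python
import numpy as np

def loess(x, y, span=0.3):
    """Locally weighted linear regression with tricube weights."""
    n = len(x)
    k = max(2, int(span * n))            # neighborhood size
    fit = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]          # k nearest neighbors
        w = (1.0 - (d[idx] / d[idx].max()) ** 3) ** 3
        sw = np.sqrt(w)
        A = np.vstack([np.ones(k), x[idx]]).T
        beta, *_ = np.linalg.lstsq(A * sw[:, None], y[idx] * sw, rcond=None)
        fit[i] = beta[0] + beta[1] * x[i]
    return fit

def sensitivity(x, y):
    """Fraction of output variance explained by the smooth of y on x."""
    resid = y - loess(x, y)
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(0)
x1 = rng.uniform(-1.0, 1.0, 300)
x2 = rng.uniform(-1.0, 1.0, 300)              # inert input
y = np.sin(3.0 * x1) + 0.1 * rng.normal(size=300)

s1, s2 = sensitivity(x1, y), sensitivity(x2, y)
```

Because y depends nonlinearly on x1, a linear-regression-based sensitivity measure would understate x1's importance; the smooth recovers it, which is the paper's argument for nonparametric procedures.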

Helton, Jon Craig; Storlie, Curtis B.

2006-08-01

298

A modified method for calculating practical ethanol yield at high lignocellulosic solids content and high ethanol titer.  

PubMed

A modified method for calculating practical ethanol yield in simultaneous saccharification and fermentation (SSF) at high lignocellulosic solids content and high ethanol titer is proposed, considering the liquid volume change caused by high-titer ethanol generation and the water consumed during cellulose degradation. This modified method was applied to determine the practical ethanol yields of several practical SSF operations and the results compared to those using the conventional method. The results show that the liquid volume increase with ethanol formation during SSF was approximately five times greater than the volume decrease due to water consumption during cellulose degradation. Furthermore, the practical ethanol yields calculated using the traditional method were underestimated, and the underestimation errors increased with increasing ethanol titer. The present work may provide a convenient and accurate method for calculating practical ethanol yield in high-solids, high-ethanol-titer SSF systems. PMID:22609658
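The direction of the correction can be sketched numerically. The volume bookkeeping below is an assumption in the spirit of the abstract (ethanol adds volume, hydrolysis consumes water), not the paper's exact formula, and all numbers are hypothetical:

```python
RHO_ETHANOL = 789.0           # g/L, density of ethanol
THEORETICAL = 0.511 * 1.111   # g ethanol per g glucan
                              # (1.111 g glucose/g glucan x 0.511 g ethanol/g glucose)

def conventional_yield(titer, v0, glucan0):
    """Conventional calculation: liquid volume assumed fixed at v0 (L)."""
    return titer * v0 / (THEORETICAL * glucan0)

def corrected_yield(titer, v0, glucan0, hydrolyzed_frac):
    """Illustrative corrected calculation: volume added by the ethanol
    itself minus water consumed by hydrolysis (0.111 g water per g
    glucan hydrolyzed; water taken as ~1 g/mL)."""
    water_consumed = 0.111 * glucan0 * hydrolyzed_frac / 1000.0   # L
    # Self-consistent liquid volume: V = v0 - water + (titer * V) / rho
    v = (v0 - water_consumed) / (1.0 - titer / RHO_ETHANOL)
    return titer * v / (THEORETICAL * glucan0)

# Hypothetical high-solids run: 80 g/L titer, 1 L initial liquid,
# 200 g glucan, 80% hydrolyzed.
y_conv = conventional_yield(80.0, 1.0, 200.0)
y_corr = corrected_yield(80.0, 1.0, 200.0, 0.8)
```

Under these assumptions the ethanol-volume term dominates the water-consumption term, so the conventional figure comes out lower than the corrected one, matching the abstract's claim that the traditional method underestimates yield at high titer.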

Zhang, Jian; Bao, Jie

2012-07-01

299

Current Practice of Heart Donor Evaluation in Germany: Multivariable Risk Factor Analysis Confirms Practicability of Guidelines  

PubMed Central

Background. Organ shortage has liberalised the acceptance criteria of grafts for heart transplantation, but which donor characteristics ultimately influence the decision to perform transplantation? For the first time this was evaluated using real-time donor data from the German organ procurement organization (DSO). Observed associations are discussed with regard to international recommendations and guidelines. Methods. 5291 German donors (2006–2010) were formally eligible for heart donation. In logistic regression models 160 donor parameters were evaluated to assess their influence on using grafts for transplantation (random split of cases: 2/3 study sample, 1/3 validation sample). Results. Successful procurement was determined by low donor age (OR 0.87 per year; 95% CI [0.85–0.89], P < 0.0001), large donor height (OR 1.04 per cm; 95% CI [1.02–1.06], P < 0.0001), exclusion of impaired left ventricular function or wall motion (OR 0.01; 95% CI [0.002–0.036], P < 0.0001), arrhythmia (OR 0.05; 95% CI [0.009–0.260], P = 0.0004), and of severe coronary artery disease (OR 0.003; 95% CI [<0.001–0.01], P < 0.0001). Donor characteristics differed between cases where the procedure was aborted without and with allocation initiated via Eurotransplant.

Fischer-Froehlich, Carl-Ludwig; Polster, Frank; Fruehauf, Nils R.; Kirste, Guenter

2013-01-01

300

Deriving a practical analytical-probabilistic method to size flood routing reservoirs  

NASA Astrophysics Data System (ADS)

In engineering practice, routing reservoir sizing is commonly performed by using the design storm method, although its effectiveness has been debated for a long time. Conversely, continuous simulations and direct statistical analyses of recorded hydrographs are considered more reliable and comprehensive, but are indeed complex or seldom practicable. In this paper a handier tool is provided by the analytical-probabilistic approach to construct probability functions of peak discharges issuing from natural watersheds or routed through on-line and off-line reservoirs. A simplified routing scheme and a rainfall-runoff model based on a few essential hydrological parameters were implemented. To validate the proposed design methodology, on-line and off-line routing reservoirs were first sized by means of a conventional design storm method for a test watershed located in northern Italy. Their routing efficiencies were then estimated by both analytical-probabilistic models and benchmarking continuous simulations. Bearing practical design purposes in mind, the adopted models showed satisfactory consistency.

Balistrocchi, Matteo; Grossi, Giovanna; Bacchi, Baldassare

2013-12-01

301

New Tetrapolar Method for Complex Bioimpedance Measurement: Theoretical Analysis and Circuit Realization  

Microsoft Academic Search

This paper describes the theory and practical implementation of a new tetrapolar (four-electrode) method for measuring complex bioimpedance of local tissue. The new tetrapolar method adopts three independent voltages VIN+, VIN-, VR respectively from the two voltage electrodes and the sample resistor R, which is connected in serial with one of the two current electrodes. Theoretical analysis shows that the

Yuxiang Yang; Jue Wang

2005-01-01

302

The Analysis of Athletic Performance: Some Practical and Philosophical Considerations  

ERIC Educational Resources Information Center

This article presents a hypothetical dialogue between a notational analyst (NA) recently schooled in the positivistic assessment of athletic performance, an "old-school" traditional coach (TC) who favours subjective analysis, and a pragmatic educator (PE). The conversation opens with NA and TC debating the respective value of quantitative and…

Nelson, Lee J.; Groom, Ryan

2012-01-01

303

Practical identifiability analysis of large environmental simulation models  

Microsoft Academic Search

Large environmental simulation models are usually overparameterized with respect to given sets of observations. This results in poorly identifiable or nonidentifiable model parameters. For small models, plots of sensitivity functions have proven to be useful for the analysis of parameter identifiability. For models with many parameters, however, near-linear dependence of sensitivity functions can no longer be assessed graphically. In this

Roland Brun; Peter Reichert; Hans R. Künsch

2001-01-01

304

Suspension, Race, and Disability: Analysis of Statewide Practices and Reporting  

ERIC Educational Resources Information Center

This analysis of statewide suspension data from 1995 to 2003 in Maryland investigated disproportionate suspensions of minority students and students with disabilities. We found substantial increases in over-all rates of suspensions from 1995 to 2003, as well as disproportionate rates of suspensions for African American students, American Indian…

Krezmien, Michael P.; Leone, Peter E.; Achilles, Georgianna M.

2006-01-01

305

Digital Data Collection and Analysis: Application for Clinical Practice  

ERIC Educational Resources Information Center

Technology for digital speech recording and speech analysis is now readily available for all clinicians who use a computer. This article discusses some advantages of moving from analog to digital recordings and outlines basic recording procedures. The purpose of this article is to familiarize speech-language pathologists with computerized audio…

Ingram, Kelly; Bunta, Ferenc; Ingram, David

2004-01-01

306

A framework for service systems analysis: theory and practice  

Microsoft Academic Search

Modelling complexities associated with service systems remains a challenge for most organisations. This task is particularly made difficult as a result of markets volatility, changing customer needs and uncertainties associated with synthesising knowledge from various sources. This article, building on systems analysis and theory, presents a framework that can be used by managers and systems modellers in service organisations to

Kiran Jude Fernandes

2012-01-01

307

A framework for service systems analysis: theory and practice  

Microsoft Academic Search

Modelling complexities associated with service systems remains a challenge for most organisations. This task is particularly made difficult as a result of markets volatility, changing customer needs and uncertainties associated with synthesising knowledge from various sources. This article, building on systems analysis and theory, presents a framework that can be used by managers and systems modellers in service organisations to

Kiran Jude Fernandes

2011-01-01

308

A practical method to avoid zero-point leak in molecular dynamics calculations: Application to the water dimer  

NASA Astrophysics Data System (ADS)

We report the implementation of a previously suggested method to constrain a molecular system to have mode-specific vibrational energy greater than or equal to the zero-point energy in quasiclassical trajectory calculations [J. M. Bowman et al., J. Chem. Phys. 91, 2859 (1989); W. H. Miller et al., J. Chem. Phys. 91, 2863 (1989)]. The implementation is made practical by using a technique described recently [G. Czakó and J. M. Bowman, J. Chem. Phys. 131, 244302 (2009)], where a normal-mode analysis is performed during the course of a trajectory and which gives only real-valued frequencies. The method is applied to the water dimer, where its effectiveness is shown by computing mode energies as a function of integration time. Radial distribution functions are also calculated using constrained quasiclassical and standard classical molecular dynamics at low temperature and at 300 K and compared to rigorous quantum path integral calculations.

Czakó, Gábor; Kaledin, Alexey L.; Bowman, Joel M.

2010-04-01

309

Image, measure, figure: a critical discourse analysis of nursing practices that develop children.  

PubMed

Motivated by discourses that link early child development and health, nurses engage in seemingly benign surveillance of children. These practices are based on knowledge claims and technologies of developmental science, which remain anchored in assumptions of the child body as an incomplete form with a universal developmental trajectory and inherent potentiality. This paper engages in a critical discursive analysis, drawing on Donna Haraway's conceptualizations of technoscience and figuration. Using a contemporary developmental screening tool from nursing practice, this analysis traces the effects of this tool through production, transformation, distribution, and consumption. It reveals how the techniques of imaging, abstraction, and measurement collide to fix the open, transformative child body in a figuration of the developing child. This analysis also demonstrates how technobiopower infuses nurses' understandings of children and structures developmentally appropriate expectations for children, parents, and nurses. Furthermore, it describes how practices that claim to facilitate healthy child development may inversely deprive children of agency and foster the production of normal or ideal children. An alternative ontological perspective is offered as a challenge to the individualism of developmental models and other dominant ideologies of development, as well as practices associated with these ideologies. In summary, this analysis argues that nurses must pay closer attention to how technobiopower infuses practices that monitor and promote child development. Fostering a critical understanding of the harmful implications of these practices is warranted and offers the space to conceive of human development in alternate and exciting ways. PMID:23745662

Einboden, Rochelle; Rudge, Trudy; Varcoe, Colleen

2013-07-01

310

A Fourier method for the analysis of exponential decay curves.  

PubMed Central

A method based on the Fourier convolution theorem is developed for the analysis of data composed of random noise, plus an unknown constant "base line," plus a sum of (or an integral over a continuous spectrum of) exponential decay functions. The Fourier method's usual serious practical limitation of needing high accuracy data over a very wide range is eliminated by the introduction of convergence parameters and a Gaussian taper window. A computer program is described for the analysis of discrete spectra, where the data involves only a sum of exponentials. The program is completely automatic in that the only necessary inputs are the raw data (not necessarily in equal intervals of time); no potentially biased initial guesses concerning either the number or the values of the components are needed. The outputs include the number of components, the amplitudes and time constants together with their estimated errors, and a spectral plot of the solution. The limiting resolving power of the method is studied by analyzing a wide range of simulated two-, three-, and four-component data. The results seem to indicate that the method is applicable over a considerably wider range of conditions than nonlinear least squares or the method of moments.
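The class of data the method targets can be written as y(t) = b + Σₖ aₖ exp(-t/τₖ) + noise. As a simple point of comparison (this sketch is a plain separable least-squares fit, not Provencher's Fourier algorithm, and the data are synthetic), the nonlinear time constants can be grid-searched while the amplitudes and baseline are solved linearly at each candidate pair:

```python
import numpy as np

def fit_two_exponentials(t, y, taus):
    """Grid-search tau1 < tau2; solve baseline and amplitudes by linear
    least squares at each candidate pair, keep the best fit."""
    best = None
    for i, t1 in enumerate(taus):
        for t2 in taus[i + 1:]:
            A = np.vstack([np.ones_like(t), np.exp(-t / t1), np.exp(-t / t2)]).T
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            sse = float(np.sum((y - A @ coef) ** 2))
            if best is None or sse < best[0]:
                best = (sse, t1, t2, coef)
    return best[1], best[2], best[3]   # tau1, tau2, (baseline, a1, a2)

rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 400)
y = 0.5 + 3.0 * np.exp(-t / 0.8) + 1.5 * np.exp(-t / 4.0)
y = y + 0.01 * rng.normal(size=t.size)          # small measurement noise

tau1, tau2, coef = fit_two_exponentials(t, y, np.linspace(0.2, 6.0, 30))
```

With low noise and well-separated time constants this works; the paper's point is that as components crowd together or noise grows, such fits need good initial guesses, whereas the Fourier approach needs no starting estimates of the number or values of the components.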

Provencher, S W

1976-01-01

311

Practical guidance for statistical analysis of operational event data  

SciTech Connect

This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.

Atwood, C.L.

1995-10-01

312

Text analysis devices, articles of manufacture, and text analysis methods  

SciTech Connect

Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.

Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

2013-05-28

313

Using Mixed-Methods Sequential Explanatory Design: From Theory to Practice  

Microsoft Academic Search

This article discusses some procedural issues related to the mixed-methods sequential explanatory design, which implies collecting and analyzing quantitative and then qualitative data in two consecutive phases within one study. Such issues include deciding on the priority or weight given to the quantitative and qualitative data collection and analysis in the study, the sequence of the data collection and analysis,

Nataliya V. Ivankova; John W. Creswell; Sheldon L. Stick

2006-01-01

314

ANALYSIS OF PRACTICAL GROUND CONTROL ISSUES IN HIGHWALL MINING  

Microsoft Academic Search

Highwall mining is an important coal mining method. It appears that upwards of 60 highwall miners are presently in operation, and they may account for approximately 4% of total U.S. coal production. A review of the Mine Safety and Health Administration (MSHA) data over the 20-year period from 1983 to 2002 identified 9 fatalities attributable to auger and highwall

R. Karl Zipf

315

Analysis of Practical Ground Control Issues in Highwall Mining.  

National Technical Information Service (NTIS)

Highwall mining is an important coal mining method. It appears that upwards of 60 highwall miners are presently in operation, and they may account for approximately 4% of total U.S. coal production. A review of the Mine Safety and Health Administration (...

R. K. Zipf S. Bhatt

2008-01-01

316

Structural sensitivity analysis: Methods, applications and needs  

NASA Technical Reports Server (NTRS)

Innovative techniques applicable to sensitivity analysis of discretized structural systems are reviewed. The techniques include a finite difference step size selection algorithm, a method for derivatives of iterative solutions, a Green's function technique for derivatives of transient response, simultaneous calculation of temperatures and their derivatives, derivatives with respect to shape, and derivatives of optimum designs with respect to problem parameters. Computerized implementations of sensitivity analysis and applications of sensitivity derivatives are also discussed. Some of the critical needs in the structural sensitivity area are indicated along with plans for dealing with some of those needs.

Adelman, H. M.; Haftka, R. T.; Camarda, C. J.; Walsh, J. L.

1984-01-01

317

Structural sensitivity analysis: Methods, applications, and needs  

NASA Technical Reports Server (NTRS)

Some innovative techniques applicable to sensitivity analysis of discretized structural systems are reviewed. These techniques include a finite-difference step-size selection algorithm, a method for derivatives of iterative solutions, a Green's function technique for derivatives of transient response, a simultaneous calculation of temperatures and their derivatives, derivatives with respect to shape, and derivatives of optimum designs with respect to problem parameters. Computerized implementations of sensitivity analysis and applications of sensitivity derivatives are also discussed. Finally, some of the critical needs in the structural sensitivity area are indicated along with Langley plans for dealing with some of these needs.
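A minimal illustration of why the finite-difference step-size selection reviewed above matters. The response function below (a hypothetical 1/t^3 deflection-versus-thickness relation, not from the paper) shows the characteristic trade-off: the central-difference error shrinks with the step as truncation error falls, then grows again once round-off dominates.

```python
def response(t):
    """Hypothetical structural response: deflection proportional to
    1/t**3 for a member of thickness t (illustrative only)."""
    return 1.0 / t**3

def central_diff(f, x, h):
    """Central finite-difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

t0 = 0.1
exact = -3.0 / t0**4                      # analytic derivative of 1/t**3

# Truncation error falls as h shrinks, until round-off in the function
# values takes over -- the trade-off an automatic step-size selection
# algorithm has to balance.
for h in (1e-2, 1e-4, 1e-6, 1e-12):
    rel_err = abs(central_diff(response, t0, h) - exact) / abs(exact)
    print(f"h = {h:8.0e}   relative error = {rel_err:.2e}")
```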

Adelman, H. M.; Haftka, R. T.; Camarda, C. J.; Walsh, J. L.

1984-01-01

318

Advanced Analysis Methods in High Energy Physics  

SciTech Connect

During the coming decade, high energy physics experiments at the Fermilab Tevatron and around the globe will use very sophisticated equipment to record unprecedented amounts of data in the hope of making major discoveries that may unravel some of Nature's deepest mysteries. The discovery of the Higgs boson and signals of new physics may be around the corner. The use of advanced analysis techniques will be crucial in achieving these goals. The author discusses some of the novel methods of analysis that could prove to be particularly valuable for finding evidence of any new physics, for improving precision measurements and for exploring parameter spaces of theoretical models.

Pushpalatha C. Bhat

2001-10-03

319

Spectroscopic Chemical Analysis Methods and Apparatus  

NASA Technical Reports Server (NTRS)

This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and the ideal wavelengths and sources needed for this analysis. It employs deep ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three achieved goals for this innovation are to reduce the size (under 20 L), reduce the weight [under 100 lb (45 kg)], and reduce the power consumption (under 100 W). This method can be used in microscope or macroscope configurations to provide measurement of Raman and/or native fluorescence emission spectra either by point-by-point measurement, or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations, the method is to produce incoherent radiation, while in other implementations it produces laser radiation. In some variations, this object is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. This instrument irradiates a sample with deep UV radiation, and then uses an improved filter for separating wavelengths to be detected. This provides a multi-stage analysis of the sample.
To avoid the difficulties related to producing deep UV semiconductor sources, a pumping approach has been developed that uses ballistic electron beam injection directly into the active region of a wide bandgap semiconductor material.

Hug, William F.; Reid, Ray D.

2012-01-01

320

The influence of deliberate practice on musical achievement: a meta-analysis  

PubMed Central

Deliberate practice (DP) is a task-specific structured training activity that plays a key role in understanding skill acquisition and explaining individual differences in expert performance. Relevant activities that qualify as DP have to be identified in every domain. For example, for training in classical music, solitary practice is a typical training activity during skill acquisition. To date, no meta-analysis on the quantifiable effect size of deliberate practice on attained performance in music has been conducted. Yet the identification of a quantifiable effect size could be relevant for the current discussion on the role of various factors on individual difference in musical achievement. Furthermore, a research synthesis might enable new computational approaches to musical development. Here we present the first meta-analysis on the role of deliberate practice in the domain of musical performance. A final sample size of 13 studies (total N = 788) was carefully extracted to satisfy the following criteria: reported durations of task-specific accumulated practice as predictor variables and objectively assessed musical achievement as the target variable. We identified an aggregated effect size of rc = 0.61; 95% CI [0.54, 0.67] for the relationship between task-relevant practice (which by definition includes DP) and musical achievement. Our results corroborate the central role of long-term (deliberate) practice for explaining expert performance in music.
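The pooled effect size reported above can be reproduced in spirit with a standard inverse-variance synthesis of correlations. The sketch below is a generic fixed-effect Fisher-z pooling, not the exact procedure of Platz et al., and the three (r, n) pairs are hypothetical.

```python
import math

def pool_correlations(studies):
    """Fixed-effect inverse-variance pooling of correlation coefficients
    via the Fisher z transform; `studies` is a list of (r, n) pairs."""
    zs = [math.atanh(r) for r, n in studies]
    ws = [n - 3 for r, n in studies]          # 1/Var(z) for a correlation
    z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    se = 1.0 / math.sqrt(sum(ws))
    lo, hi = math.tanh(z_bar - 1.96 * se), math.tanh(z_bar + 1.96 * se)
    return math.tanh(z_bar), lo, hi

# Three hypothetical practice/achievement correlations (r, sample size)
r_pooled, lo, hi = pool_correlations([(0.55, 60), (0.65, 120), (0.60, 80)])
print(round(r_pooled, 3), round(lo, 3), round(hi, 3))
```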

Platz, Friedrich; Kopiez, Reinhard; Lehmann, Andreas C.; Wolf, Anna

2014-01-01

321

Measurement methods for human exposure analysis.  

PubMed Central

The general methods used to complete measurements of human exposures are identified and illustrations are provided for the cases of indirect and direct methods used for exposure analysis. The application of the techniques for external measurements of exposure, microenvironmental and personal monitors, are placed in the context of the need to test hypotheses concerning the biological effects of concern. The linkage of external measurements to measurements made in biological fluids is explored for a suite of contaminants. This information is placed in the context of the scientific framework used to conduct exposure assessment. Examples are taken from research on volatile organics and for a large scale problem: hazardous waste sites.

Lioy, P J

1995-01-01

322

Forum discussion on probabilistic structural analysis methods  

SciTech Connect

The use of Probabilistic Structural Analysis Methods (PSAM) has received much attention over the past several decades due in part to enhanced reliability theories, computational capabilities, and efficient algorithms. The need for this development was already present and waiting at the doorstep. Automotive design and manufacturing has been greatly enhanced because of PSAM and reliability methods, including reliability-based optimization. This demand was also present in the US Department of Energy (DOE) weapons laboratories in support of the overarching national security responsibility of maintaining the nation's nuclear stockpile in a safe and reliable state.

Rodriguez, E.A.; Girrens, S.P.

2000-10-01

323

Homotopy analysis method applied to electrohydrodynamic flow  

NASA Astrophysics Data System (ADS)

In this paper, we consider the nonlinear boundary value problem (BVP) for the electrohydrodynamic flow of a fluid in an ion drag configuration in a circular cylindrical conduit. We present analytical solutions based on the homotopy analysis method (HAM) for various values of the relevant parameters and discuss the convergence of these solutions. We also compare our results with numerical solutions. The results provide another example of a highly nonlinear problem in which HAM is the only known analytical method that yields convergent solutions for all values of the relevant parameters.

Mastroberardino, Antonio

2011-07-01

324

Genes, patients, families, doctors-mutation analysis in clinical practice.  

PubMed

Developments in mutation analysis have led to significant benefits for patients with inherited metabolic disorders and their families. This is particularly the case where new methodologies have prevented the need for invasive tissue biopsies or have allowed carrier detection or first trimester prenatal testing to be undertaken. Whereas in the past it may have only been possible to identify specific 'common' mutations, the availability of techniques, such as automated sequencing, and novel technologies including mutation scanning techniques, multiplex ligation dependent probe amplification, and array technologies, have vastly improved the diagnostic efficiency of molecular testing. PMID:19306072

Walter, J H

2009-06-01

325

Design and Analysis of a Practical E-Voting Protocol  

NASA Astrophysics Data System (ADS)

In this paper we design an e-voting protocol for an academic voting system which should be independent from other university applications. We briefly discuss security requirements for e-voting schemes, focusing on our proposed scheme. We design a receipt-free e-voting protocol which requires neither an anonymous channel nor other physical assumptions. We give a short survey on formal analysis of e-voting protocols. Using the applied pi-calculus we model and analyze some security properties of the proposed scheme.

Novotný, Marián

326

The uniform asymptotic swallowtail approximation - Practical methods for oscillating integrals with four coalescing saddle points  

NASA Technical Reports Server (NTRS)

Methods that can be used in the numerical implementation of the uniform swallowtail approximation are described. An explicit expression for that approximation is presented to the lowest order, showing that there are three problems which must be overcome in practice before the approximation can be applied to any given problem. It is shown that a recently developed quadrature method can be used for the accurate numerical evaluation of the swallowtail canonical integral and its partial derivatives. Isometric plots of these are presented to illustrate some of their properties. The problem of obtaining the arguments of the swallowtail integral from an analytical function of its argument is considered, describing two methods of solving this problem. The asymptotic evaluation of the butterfly canonical integral is addressed.

Connor, J. N. L.; Curtis, P. R.; Farrelly, D.

1984-01-01

327

Degradation of learned skills. Effectiveness of practice methods on simulated space flight skill retention  

NASA Technical Reports Server (NTRS)

Manual flight control and emergency procedure task skill degradation was evaluated after time intervals of from 1 to 6 months. The tasks were associated with a simulated launch through the orbit insertion flight phase of a space vehicle. The results showed that acceptable flight control performance was retained for 2 months, rapidly deteriorating thereafter by a factor of 1.7 to 3.1 depending on the performance measure used. Procedural task performance showed unacceptable degradation after only 1 month, and exceeded an order of magnitude after 4 months. The effectiveness of static rehearsal (checklists and briefings) and dynamic warmup (simulator practice) retraining methods was compared for the two tasks. Static rehearsal effectively countered procedural skill degradation, while some combination of dynamic warmup appeared necessary for flight control skill retention. It was apparent that these differences between methods were not solely a function of task type or retraining method, but were a function of the performance measures used for each task.

Sitterley, T. E.; Berge, W. A.

1972-01-01

328

Clustering Methods in Symbolic Data Analysis  

Microsoft Academic Search

We present an overview of the clustering methods developed in Symbolic Data Analysis to partition a set of conceptual data into a fixed number of classes. The proposed algorithms are based on a generalization of the classical Dynamical Clustering Algorithm (DCA) (Nuées Dynamiques méthode). The criterion optimized in DCA is a measure of the fit between the partition and the

Rosanna Verde

329

Probabilistic structural analysis methods and applications  

NASA Technical Reports Server (NTRS)

An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.
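The core idea, propagating input randomness through a structural response, can be sketched with plain Monte Carlo (the paper's fast probability integration is a more efficient alternative for the same question). The beam formula and all distribution parameters below are illustrative assumptions, not values from the turbine-blade application.

```python
import numpy as np

# Plain Monte Carlo propagation of input randomness through a response:
# bending stress sigma = 6*F*L / (b*h**2) at the root of a rectangular
# cantilever, with load and section dimensions as random variables.
rng = np.random.default_rng(1)
n = 100_000
F = rng.normal(1000.0, 100.0, n)      # tip load, N
L = 2.0                               # span, m (deterministic here)
b = rng.normal(0.05, 0.002, n)        # section width, m
h = rng.normal(0.10, 0.002, n)        # section height, m

stress = 6.0 * F * L / (b * h**2)     # root bending stress, Pa

allowable = 30.0e6                    # assumed allowable stress, Pa
p_exceed = np.mean(stress > allowable)
print(stress.mean(), p_exceed)
```

Direct sampling like this needs many realizations to resolve small exceedance probabilities, which is precisely why fast probability integration methods were developed.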

Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.

1988-01-01

330

Preventing childhood obesity during infancy in UK primary care: a mixed-methods study of HCPs' knowledge, beliefs and practice  

PubMed Central

Background There is a strong rationale for intervening in early childhood to prevent obesity. Over a quarter of infants gain weight more rapidly than desirable during the first six months of life, putting them at greater risk of obesity in childhood. However, little is known about UK healthcare professionals' (HCPs) approach to primary prevention. This study explored obesity-related knowledge of UK HCPs and the beliefs and current practice of general practitioners (GPs) and practice nurses in relation to identifying infants at risk of developing childhood obesity. Method Survey of UK HCPs (GPs, practice nurses, health visitors, nursery, community and children's nurses). HCPs (n = 116) rated their confidence in providing infant feeding advice and completed the Obesity Risk Knowledge Scale (ORK-10). Semi-structured interviews with a subset of 12 GPs and 6 practice nurses were audio recorded and transcribed verbatim. Thematic analysis was applied using an interpretative, inductive approach. Results GPs were less confident about giving advice about infant feeding than health visitors (p = 0.001) and nursery nurses (p = 0.009) but more knowledgeable about the health risks of obesity (p < 0.001) than nurses (p = 0.009). HCPs who were consulted more often about feeding were less knowledgeable about the risks associated with obesity (r = -0.34, n = 114, p < 0.001). There was no relationship between HCPs' ratings of confidence in their advice and their knowledge of the obesity risk. 
Six main themes emerged from the interviews: 1) Attribution of childhood obesity to family environment, 2) Infant feeding advice as the health visitor's role, 3) Professional reliance on anecdotal or experiential knowledge about infant feeding, 4) Difficulties with recognition of, or lack of concern for, infants "at risk" of becoming obese, 5) Prioritising relationship with parent over best practice in infant feeding and 6) Lack of shared understanding for dealing with early years' obesity. Conclusions Intervention is needed to improve health visitors and nursery nurses' knowledge of obesity risk and GPs and practice nurses' capacity to identify and manage infants' at risk of developing childhood obesity. GPs value strategies that maintain relationships with vulnerable families and interventions to improve their advice-giving around infant feeding need to take account of this. Further research is needed to determine optimal ways of intervening with infants at risk of obesity in primary care.

2011-01-01

331

Primary prevention in general practice - views of German general practitioners: a mixed-methods study  

PubMed Central

Background Policy efforts focus on a reorientation of health care systems towards primary prevention. To guide such efforts, we analyzed the role of primary prevention in general practice and general practitioners’ (GPs) attitudes toward primary prevention. Methods Mixed-method study including a cross-sectional survey of all community-based GPs and focus groups in a sample of GPs who collaborated with the Institute of General Practice in Berlin, Germany in 2011. Of 1168 GPs, 474 returned the mail survey. Fifteen GPs participated in focus group discussions. Survey and interview guidelines were developed and tested to assess and discuss beliefs, attitudes, and practices regarding primary prevention. Results Most respondents considered primary prevention within their realm of responsibility (70%). Primary prevention, especially physical activity, healthy eating, and smoking cessation, was part of the GPs’ health care recommendations if they thought it was indicated. Still, a quarter of survey respondents discussed reduction of alcohol consumption with their patients infrequently even when they thought it was indicated. Similarly, 18% claimed that they discuss smoking cessation only sometimes. The focus groups revealed that GPs were concerned about the detrimental effects an uninvited health behavior suggestion could have on patients and were hesitant to take on the role of “health policing”. GPs saw primary prevention as the responsibility of multiple actors in a network of societal and municipal institutions. Conclusions The mixed-method study showed that primary prevention approaches such as lifestyle counseling are not well established in primary care. GPs used a selective approach to offer preventive advice based upon indication. GPs had a strong sense that a universal prevention approach carried the potential to destroy a good patient-physician relationship. Other approaches to public health may be warranted, such as a multisectoral approach to population health. 
This type of restructuring of the health care sector may benefit patients who are unable to afford specific prevention programmes and who have competing demands that hinder their ability to focus on behavior change.

2014-01-01

332

78 FR 17142 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...  

Federal Register 2010, 2011, 2012, 2013

...FDA-2011-N-0920] RIN 0910-AG36 Current Good Manufacturing Practice and Hazard Analysis and Risk- Based...would amend our regulation for current good manufacturing practice in manufacturing, packing, or holding human food...

2013-03-20

333

Finite Volume Methods: Foundation and Analysis  

NASA Technical Reports Server (NTRS)

Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semiconductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the attainment of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
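A minimal instance of the schemes discussed: a conservative finite-volume update for the scalar Burgers equation using the Lax-Friedrichs (Rusanov-type) monotone flux, one of the building blocks listed in the abstract. Grid size and CFL number are arbitrary choices for the sketch.

```python
import numpy as np

def lax_friedrichs_step(u, dx, dt):
    """One conservative finite-volume update for u_t + f(u)_x = 0 with
    f(u) = u**2 / 2 (Burgers' equation), using the Lax-Friedrichs
    (Rusanov-type) monotone numerical flux and periodic boundaries."""
    f = 0.5 * u**2
    u_right = np.roll(u, -1)                  # neighbor cell averages
    f_right = np.roll(f, -1)
    alpha = np.max(np.abs(u))                 # bound on wave speed |f'(u)|
    flux = 0.5 * (f + f_right) - 0.5 * alpha * (u_right - u)  # F at i+1/2
    return u - dt / dx * (flux - np.roll(flux, 1))

nx = 200
x = np.linspace(0.0, 1.0, nx, endpoint=False)
dx = 1.0 / nx
u = 1.5 + np.sin(2.0 * np.pi * x)             # smooth initial cell averages
dt = 0.4 * dx / np.max(np.abs(u))             # CFL-limited time step

total0 = u.sum() * dx                         # integral of u, conserved
for _ in range(100):
    u = lax_friedrichs_step(u, dx, dt)

print(abs(u.sum() * dx - total0))             # local conservation => ~0
```

Because the update is written in flux-difference form, the cell-average sum is conserved to round-off, and the monotone flux enforces the discrete maximum principle the abstract emphasizes.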

Barth, Timothy; Ohlberger, Mario

2003-01-01

334

Spectroscopic chemical analysis methods and apparatus  

NASA Technical Reports Server (NTRS)

Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)

2013-01-01

335

A novel and practical cardiovascular magnetic resonance method to quantify mitral annular excursion and recoil applied to hypertrophic cardiomyopathy  

PubMed Central

Background We have developed a novel and practical cardiovascular magnetic resonance (CMR) technique to evaluate left ventricular (LV) mitral annular motion by tracking the atrioventricular junction (AVJ). To test AVJ motion analysis as a metric for LV function, we compared AVJ motion variables between patients with hypertrophic cardiomyopathy (HCM), a group with recognized systolic and diastolic dysfunction, and healthy volunteers. Methods We retrospectively evaluated 24 HCM patients with normal ejection fractions (EF) and 14 healthy volunteers. Using the 4-chamber view cine images, we tracked the longitudinal motion of the lateral and septal AVJ at 25 time points during the cardiac cycle. Based on AVJ displacement versus time, we calculated maximum AVJ displacement (MD) and velocity in early diastole (MVED), velocity in diastasis (VDS) and the composite index VDS/MVED. Results Patients with HCM showed significantly slower median lateral and septal AVJ recoil velocities during early diastole, but faster velocities in diastasis. We observed a 16-fold difference in VDS/MVED at the lateral AVJ [median 0.141, interquartile range (IQR) 0.073-0.166, versus 0.009, IQR -0.006-0.037]. Analysis took approximately 10 minutes per subject. Conclusions Atrioventricular junction motion analysis provides a practical and novel CMR method to assess mitral annular motion. In this proof of concept study we found highly statistically significant differences in mitral annular excursion and recoil between HCM patients and healthy volunteers.
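The annular-motion indices described above reduce to simple operations on a displacement-time curve. The sketch below is a schematic reconstruction, not the authors' code: the displacement samples and the frame windows assigned to early diastole and diastasis are invented for illustration.

```python
import numpy as np

def avj_metrics(disp, dt, early_diastole, diastasis):
    """Annular-motion indices from an AVJ displacement-time curve sampled
    over one cardiac cycle: maximum displacement (MD), peak recoil
    velocity in early diastole (MVED), velocity in diastasis (VDS), and
    the composite index VDS/MVED. The caller supplies the frame windows
    for the two cardiac phases."""
    vel = np.gradient(disp, dt)               # velocity by finite differences
    md = np.max(np.abs(disp))
    mved = np.max(np.abs(vel[early_diastole]))
    vds = np.mean(np.abs(vel[diastasis]))
    return md, mved, vds, vds / mved

# Schematic displacement curve (mm, toward the apex), 25 frames / 800 ms:
# systolic descent, rapid early-diastolic recoil, near-flat diastasis,
# then atrial contraction.  All numbers are invented for illustration.
disp = np.concatenate([
    np.linspace(0.0, -12.0, 10, endpoint=False),   # systolic descent
    np.linspace(-12.0, -3.0, 6, endpoint=False),   # early-diastolic recoil
    np.linspace(-3.0, -2.5, 5, endpoint=False),    # diastasis plateau
    np.linspace(-2.5, 0.0, 4),                     # atrial contraction
])
dt = 0.8 / 24.0                                    # seconds per frame
md, mved, vds, ratio = avj_metrics(disp, dt,
                                   early_diastole=slice(10, 16),
                                   diastasis=slice(17, 21))
print(round(md, 1), round(mved, 1), round(vds, 1), round(ratio, 3))
```

A small VDS/MVED, as in this synthetic healthy-like curve, reflects a sharp early recoil followed by a quiet diastasis; the HCM pattern reported above corresponds to slower recoil and more residual motion in diastasis.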

2014-01-01

336

International Commercial Remote Sensing Practices and Policies: A Comparative Analysis  

NASA Astrophysics Data System (ADS)

In recent years, there has been much discussion about U.S. commercial remote sensing policies and how effectively they address U.S. national security, foreign policy, commercial, and public interests. This paper will provide an overview of U.S. commercial remote sensing laws, regulations, and policies, and describe recent NOAA initiatives. It will also address related foreign practices, and the overall legal context for trade and investment in this critical industry. Licensing and Regulation The 1992 Land Remote Sensing Policy Act ("the Act"), and the 1994 policy on Foreign Access to Remote Sensing Space Capabilities (known as Presidential Decision Directive-23, or PDD-23), put into place an ambitious legal and policy framework for the U.S. Government's licensing of privately-owned, high-resolution satellite systems. Under the Act, the Secretary of Commerce licenses the operations of private U.S. remote sensing satellite systems, in consultation with the Secretaries of Defense, State, and Interior. PDD-23 provided further details concerning the operation of advanced systems, as well as criteria for the export of turnkey systems and/or components. In July 2000, pursuant to the authority delegated to it by the Secretary of Commerce, NOAA issued new regulations for the industry. Among other requirements, licensees must: observe the international obligations of the United States; maintain positive control of spacecraft operations; maintain a tasking record in conjunction with other record-keeping requirements; provide U.S. Government access to and use of data when required for national security or foreign policy purposes; provide for U.S. Government review of all significant foreign agreements; obtain U.S. 
Government approval for any encryption devices used; make available unenhanced data to a "sensed state" as soon as such data are available and on reasonable cost terms and conditions; make available unenhanced data as requested by the U.S. Government Archive; and obtain a priori U.S. Government approval of all plans and procedures to deal with safe disposition of the satellite. Further information on NOAA's regulations and NOAA's licensing program is available at www.licensing.noaa.gov. Monitoring and Enforcement NOAA's enforcement mission is focused on the legislative mandate which states that the Secretary of Commerce has a continuing obligation to ensure that licensed imaging systems are operated lawfully to preserve the national security and foreign policies of the United States. NOAA has constructed an end-to-end monitoring and compliance program to review the activities of licensed companies. This program includes a pre-launch review, an operational baseline audit, and an annual comprehensive national security audit. If at any time there is suspicion or concern that a system is being operated unlawfully, a no-notice inspection may be initiated. Despite setbacks, three U.S. companies are now operational, with more firms expected to become so in the future. While NOAA does not disclose specific systems capabilities for proprietary reasons, its current licensing resolution thresholds for general commercial availability are as follows: 0.5 meter Ground Sample Distance (GSD) for panchromatic systems, 2 meter GSD for multi-spectral systems, 3 meter Impulse Response (IPR) for Synthetic Aperture Radar systems, and 20 meter GSD for hyperspectral systems (with certain 8-meter hyperspectral derived products also licensed for commercial distribution). These thresholds are subject to change based upon foreign availability and other considerations. 
It should also be noted that license applications are reviewed and granted on a case-by-case basis, pursuant to each system's technology and concept of operations. In 2001, NOAA, along with the Department of Commerce's International Trade Administration, commissioned a study by the RAND Corporation to assess the risks faced by the U.S. commercial remote sensing satellite industry. In commissioning this study, NOAA's goal was to bette

Stryker, Timothy

337

A survey of castration methods and associated livestock management practices performed by bovine veterinarians in the United States  

PubMed Central

Background Castration of male calves destined for beef production is a common management practice performed in the United States amounting to approximately 15 million procedures per year. Societal concern about the moral and ethical treatment of animals is increasing. Therefore, production agriculture is faced with the challenge of formulating animal welfare policies relating to routine management practices such as castration. To enable the livestock industry to effectively respond to these challenges there is a need for more data on management practices that are commonly used in cattle production systems. The objective of this survey was to describe castration methods, adverse events and husbandry procedures performed by U.S. veterinarians at the time of castration. Invitations to participate in the survey were sent to email addresses of 1,669 members of the American Association of Bovine Practitioners and 303 members of the Academy of Veterinary Consultants. Results After partially completed surveys and missing data were omitted, 189 responses were included in the analysis. Surgical castration with a scalpel followed by testicular removal by twisting (calves <90 kg) or an emasculator (calves >90 kg) was the most common method of castration used. The potential risk of injury to the operator, size of the calf, handling facilities and experience with the technique were the most important considerations used to determine the method of castration used. Swelling, stiffness and increased lying time were the most prevalent adverse events observed following castration. One in five practitioners report using an analgesic or local anesthetic at the time of castration. Approximately 90% of respondents indicated that they vaccinate and dehorn calves at the time of castration. Over half the respondents use disinfectants, prophylactic antimicrobials and tetanus toxoid to reduce complications following castration. 
Conclusions The results of this survey describe current methods of castration and associated management practices employed by bovine veterinarians in the U.S. Such data are needed to guide future animal well-being research, the outcomes of which can be used to develop industry-relevant welfare guidelines.

2010-01-01

338

A Simplified Mesh Deformation Method Using Commercial Structural Analysis Software  

NASA Technical Reports Server (NTRS)

Mesh deformation in response to redefined or moving aerodynamic surface geometries is a frequently encountered task in many applications. Most existing methods are either mathematically too complex or computationally too expensive for use in practical design and optimization. We propose a simplified mesh deformation method based on linear elastic finite element analyses that can be easily implemented by using commercially available structural analysis software. Using a prescribed displacement at the mesh boundaries, a simple structural analysis is constructed based on a spatially varying Young's modulus to move the entire mesh in accordance with the surface geometry redefinitions. A variety of surface movements, such as translation, rotation, or incremental surface reshaping that often takes place in an optimization procedure, may be handled by the present method. We describe the numerical formulation and implementation using the NASTRAN software in this paper. The use of commercial software bypasses tedious reimplementation and takes advantage of the computational efficiency offered by the vendor. A two-dimensional airfoil mesh and a three-dimensional aircraft mesh were used as test cases to demonstrate the effectiveness of the proposed method. Euler and Navier-Stokes calculations were performed for the deformed two-dimensional meshes.
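A hypothetical 1-D miniature of the idea in this record (not the actual NASTRAN workflow): treat mesh segments as elastic bars whose modulus varies in space, prescribe the boundary displacement, and solve K u = f for the interior node motion. The node positions and the stiffness law below are invented for illustration.

```python
import numpy as np

# Original 1-D mesh nodes; the "surface" that moves is at x = 0.
x = np.linspace(0.0, 1.0, 11)
mid = 0.5 * (x[:-1] + x[1:])
E = 1.0 / (0.1 + mid)          # spatially varying modulus: stiffer near the moving surface

# Assemble the 1-D bar stiffness matrix from 2x2 element matrices.
n = len(x)
K = np.zeros((n, n))
for i, k in enumerate(E):
    K[i:i + 2, i:i + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])

u = np.zeros(n)
u[0] = 0.05                                  # prescribed surface displacement
free = np.arange(1, n - 1)                   # node 0 moves, node n-1 is held fixed
rhs = -K[np.ix_(free, [0, n - 1])] @ u[[0, n - 1]]
u[free] = np.linalg.solve(K[np.ix_(free, free)], rhs)

x_new = x + u                                # deformed mesh
```

Because stiffness is larger near the displaced boundary, elements there deform less, which is the cell-quality preservation the spatially varying modulus is meant to provide.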

Hsu, Su-Yuen; Chang, Chau-Lyan; Samareh, Jamshid

2004-01-01

339

Stirling Analysis Comparison of Commercial vs. High-Order Methods  

NASA Technical Reports Server (NTRS)

Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time-accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model, although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.

Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako

2007-01-01

340

Stirling Analysis Comparison of Commercial Versus High-Order Methods  

NASA Technical Reports Server (NTRS)

Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time-accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model, although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.

Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako

2005-01-01

341

Review of Computational Stirling Analysis Methods  

NASA Technical Reports Server (NTRS)

Nuclear thermal to electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in its current designs could be better understood. However, they are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of Stirling loss understanding may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra Hi-Fi technique, is presented in detail.

Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

2004-01-01

342

Data Analysis Methods for Library Marketing  

NASA Astrophysics Data System (ADS)

Our society is rapidly changing into an information society, in which people's needs and requests regarding information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information. Libraries have to know the profiles of their patrons in order to achieve such a role. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. We then demonstrate its usefulness through examples of analysis methods applied to the circulation records of Kyushu University and Guacheon Library, along with some implications obtained from the results of these methods. Our research is a first step towards a future in which library marketing is an indispensable tool.

Minami, Toshiro; Kim, Eunja

343

Practical considerations in the analysis of complex sample survey data.  

PubMed

Large-scale sample surveys often provide fertile ground for analyses by epidemiologists. Recently, survey organizations such as the National Center for Health Statistics and the United States Bureau of the Census have distributed data from large surveys to interested investigators via CD-ROM. Confronted by the richness of such databases and the historic relative lack of availability of suitable software to appropriately account for the survey design, researchers have often simply ignored the complexities of the survey and analyzed the data as if they resulted from a simple random sample. The availability of modern programs such as STATA and SUDAAN provides data analysts with new analytical capabilities to perform design-based analyses whenever appropriate. We used data from the NHANES III and the PAQUID study to illustrate the ease of performing design-based analyses. We also compared results of analyses under both model-based and design-based scenarios. When data from complex sample surveys were analyzed using both model-based and design-based strategies, differences in point estimates and standard errors of means, regression coefficients and odds ratios were found. The differences in regression coefficients and odds ratios between the two strategies were not as great as the differences in means. The potential for differences and the availability of survey analysis software should encourage researchers to use design-based techniques to analyze data from complex sample surveys more appropriately. PMID:10587999
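A toy contrast of the two strategies discussed in this record: treating the data as a simple random sample versus using the sampling weights (a design-based, Horvitz-Thompson-style estimate). The observations and weights are invented; a real analysis would use survey software such as STATA's svy commands or SUDAAN.

```python
import numpy as np

y = np.array([1.0, 2.0, 3.0, 10.0])          # outcomes for four respondents
w = np.array([100.0, 100.0, 100.0, 10.0])    # sampling weights: rare stratum oversampled

mean_srs = y.mean()                          # ignores the design entirely
mean_design = np.average(y, weights=w)       # weight-aware population mean estimate
```

Because the oversampled respondent with the extreme outcome carries a small weight, the design-based mean (about 2.26 here) differs sharply from the unweighted mean (4.0), which is the kind of discrepancy the abstract describes.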

Lemeshow, S; Cook, E D

1999-10-01

344

Best Practices for Finite Element Analysis of Spent Nuclear Fuel Transfer, Storage, and Transportation Systems  

SciTech Connect

Storage casks and transportation packages for spent nuclear fuel (SNF) are designed to confine SNF in sealed canisters or casks, provide structural integrity during accidents, and remove decay heat through a storage or transportation overpack. The transfer and storage of SNF in dry storage casks is regulated under 10 CFR Part 72, and its transportation in transport packages under 10 CFR Part 71. Finite Element Analysis (FEA) is used with increasing frequency in Safety Analysis Reports and other regulatory technical evaluations related to SNF casks and packages and their associated systems. Advances in computing power have made increasingly sophisticated FEA models more feasible, and as a result, the need for careful review of such models has also increased. This paper identifies best practice recommendations that stem from recent NRC review experience. The scope covers issues common to all commercially available FEA software, and the recommendations are applicable to any FEA software package. Three specific topics are addressed: general FEA practices, issues specific to thermal analyses, and issues specific to structural analyses. General FEA practices cover appropriate documentation of the model and results, which is important for an efficient review process. The thermal analysis best practices are related to cask analysis for steady state conditions and transient scenarios. The structural analysis best practices are related to the analysis of casks and associated payload during standard handling and drop scenarios. The best practices described in this paper are intended to identify FEA modeling issues and provide insights that can help minimize associated uncertainties and errors, in order to facilitate the NRC licensing review process.

Bajwa, Christopher S.; Piotter, Jason; Cuta, Judith M.; Adkins, Harold E.; Klymyshyn, Nicholas A.; Fort, James A.; Suffield, Sarah R.

2010-08-11

345

Computational method for analysis of polyethylene biodegradation  

NASA Astrophysics Data System (ADS)

In a previous study concerning the biodegradation of polyethylene, we proposed a mathematical model based on two primary factors: the direct consumption or absorption of small molecules and the successive weight loss of large molecules due to β-oxidation. Our model is an initial value problem consisting of a differential equation whose independent variable is time. Its unknown variable represents the total weight of all the polyethylene molecules that belong to a molecular-weight class specified by a parameter. In this paper, we describe a numerical technique to introduce experimental results into analysis of our model. We first establish its mathematical foundation in order to guarantee its validity, by showing that the initial value problem associated with the differential equation has a unique solution. Our computational technique is based on a linear system of differential equations derived from the original problem. We introduce some numerical results to illustrate our technique as a practical application of the linear approximation. In particular, we show how to solve the inverse problem to determine the consumption rate and the β-oxidation rate numerically, and illustrate our numerical technique by analyzing the GPC patterns of polyethylene wax obtained before and after 5 weeks' cultivation of a fungus, Aspergillus sp. AK-3. A numerical simulation based on these degradation rates confirms that the primary factors of the polyethylene biodegradation posed in modeling are indeed appropriate.
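An illustrative sketch of the kind of linear system this record describes: each molecular-weight class loses mass by direct consumption (rho) and by β-oxidation (beta), which here simply transfers the oxidised mass to the next-lower class. The class count and rate values are invented, not the paper's fitted rates.

```python
import numpy as np
from scipy.integrate import solve_ivp

n = 5                                   # weight classes, index 0 = heaviest
rho = np.full(n, 0.1)                   # hypothetical consumption rates per class
beta = np.full(n, 0.3)                  # hypothetical beta-oxidation rates per class

# Linear system dw/dt = A w: diagonal outflow, subdiagonal inflow from the class above.
A = np.diag(-(rho + beta))
A += np.diag(beta[:-1], k=-1)           # mass oxidised out of class i enters class i+1

def rhs(t, w):
    return A @ w

sol = solve_ivp(rhs, (0.0, 5.0), np.ones(n), rtol=1e-10, atol=1e-12)
w_end = sol.y[:, -1]                    # weight distribution after 5 time units
```

The heaviest class has no inflow, so its weight decays as exp(-(rho+beta)t), while lighter classes first gain and then lose mass; the inverse problem in the paper fits rho and beta so that this evolution matches measured GPC patterns.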

Watanabe, Masaji; Kawai, Fusako; Shibata, Masaru; Yokoyama, Shigeo; Sudate, Yasuhiro

2003-12-01

346

Knowledge-attitude-practice survey among Portuguese gynaecologists regarding combined hormonal contraceptives methods.  

PubMed

Objectives To evaluate knowledge, attitude and practices of Portuguese gynaecologists regarding combined hormonal contraceptives. Methods A cross-sectional survey was conducted among 303 gynaecologists. Results Ninety percent of the gynaecologists considered that deciding on contraceptive methods is a process wherein the woman has her say. Efficacy, safety and the woman's preference were the major factors influencing gynaecologists, while efficacy, tolerability and ease of use were the major factors perceived by the specialists to influence the women's choice. Gynaecologists believed that only 2% of women taking the pill were 100% compliant compared to 48% of those using the patch and 75% of those using the ring. The lower risk of omission was the strong point for the latter methods. Side effects were the main reason to change to another method. Vaginal manipulation was the most difficult topic to discuss. Conclusions Most gynaecologists decided with the woman on the contraceptive method. The main reasons for the gynaecologist's recommendation of a given contraceptive method and the women's choice were different. Counselling implies an open discussion and topics related to sexuality were considered difficult to discuss. Improving communication skills and understanding women's requirements are critical for contraceptive counselling. PMID:22200109

Bombas, Teresa; Costa, Ana Rosa; Palma, Fátima; Vicente, Lisa; Sá, José Luís; Nogueira, Ana Maria; Andrade, Sofia

2012-04-01

347

A new synthesis analysis method for building logistic regression prediction models.  

PubMed

Synthesis analysis refers to a statistical method that integrates multiple univariate regression models and the correlation between each pair of predictors into a single multivariate regression model. The practical application of such a method could be developing a multivariate disease prediction model where a dataset containing the disease outcome and every predictor of interest is not available. In this study, we propose a new version of synthesis analysis that is specific to binary outcomes. We show that our proposed method possesses desirable statistical properties. We also conduct a simulation study to assess the robustness of the proposed method and compare it to a competing method. Copyright © 2014 John Wiley & Sons, Ltd. PMID:24634227
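A linear-regression analogue of the synthesis idea in this record (the paper's method targets binary outcomes; this toy shows only the linear case): with standardized variables, each univariate slope equals corr(x_i, y), and the multivariate slopes are R⁻¹r, where R is the predictor correlation matrix. So univariate fits plus pairwise correlations suffice to rebuild the joint model, even without a dataset containing all variables at once. Data below are simulated.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20000
x1 = rng.standard_normal(n)
x2 = 0.6 * x1 + 0.8 * rng.standard_normal(n)     # correlated predictors
y = 1.0 * x1 + 0.5 * x2 + rng.standard_normal(n)

# "Univariate" information: one correlation per predictor, plus the predictor correlations.
r = np.array([np.corrcoef(x1, y)[0, 1], np.corrcoef(x2, y)[0, 1]])
R = np.corrcoef(np.vstack([x1, x2]))
b_synth = np.linalg.solve(R, r)                  # synthesized joint (standardized) slopes

# Reference: ordinary least squares on the standardized data.
Xs = np.vstack([(x1 - x1.mean()) / x1.std(), (x2 - x2.mean()) / x2.std()]).T
ys = (y - y.mean()) / y.std()
b_joint, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
```

The synthesized slopes match the joint fit exactly in the linear case; the paper's contribution is an analogous construction for logistic regression, where no such closed-form identity holds.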

Sheng, Elisa; Zhou, Xiao Hua; Chen, Hua; Hu, Guizhou; Duncan, Ashlee

2014-07-10

348

A high-efficiency aerothermoelastic analysis method  

NASA Astrophysics Data System (ADS)

In this paper, a high-efficiency aerothermoelastic analysis method based on unified hypersonic lifting surface theory is established. The method adopts a two-way coupling form that couples the structure, aerodynamic force, aerodynamic heating, and heat conduction. The aerodynamic force is first calculated based on unified hypersonic lifting surface theory, and then the Eckert reference temperature method is used to solve the temperature field, where the transient heat conduction is solved using Fourier's law, and the modal method is used for the aeroelastic correction. Finally, flutter is analyzed based on the p-k method. The aerothermoelastic behavior of a typical hypersonic low-aspect-ratio wing is then analyzed, and the results indicate the following: (1) the combined effects of the aerodynamic load and thermal load both deform the wing, which would increase if the flexibility, size, and flight time of the hypersonic aircraft increase; (2) the effect of heat accumulation should be noted, and therefore, the trajectory parameters should be considered in the design of hypersonic flight vehicles to avoid hazardous conditions, such as flutter.

Wan, ZhiQiang; Wang, YaoKun; Liu, YunZhen; Yang, Chao

2014-03-01

349

A high-efficiency aerothermoelastic analysis method  

NASA Astrophysics Data System (ADS)

In this paper, a high-efficiency aerothermoelastic analysis method based on unified hypersonic lifting surface theory is established. The method adopts a two-way coupling form that couples the structure, aerodynamic force, aerodynamic heating, and heat conduction. The aerodynamic force is first calculated based on unified hypersonic lifting surface theory, and then the Eckert reference temperature method is used to solve the temperature field, where the transient heat conduction is solved using Fourier's law, and the modal method is used for the aeroelastic correction. Finally, flutter is analyzed based on the p-k method. The aerothermoelastic behavior of a typical hypersonic low-aspect-ratio wing is then analyzed, and the results indicate the following: (1) the combined effects of the aerodynamic load and thermal load both deform the wing, which would increase if the flexibility, size, and flight time of the hypersonic aircraft increase; (2) the effect of heat accumulation should be noted, and therefore, the trajectory parameters should be considered in the design of hypersonic flight vehicles to avoid hazardous conditions, such as flutter.

Wan, ZhiQiang; Wang, YaoKun; Liu, YunZhen; Yang, Chao

2014-06-01

350

A method for obtaining practical flutter-suppression control laws using results of optimal control theory  

NASA Technical Reports Server (NTRS)

The results of optimal control theory are used to synthesize a feedback filter. The feedback filter is used to force the output of the filtered frequency response to match that of a desired optimal frequency response over a finite frequency range. This matching is accomplished by employing a nonlinear programing algorithm to search for the coefficients of the feedback filter that minimize the error between the optimal frequency response and the filtered frequency response. The method is applied to the synthesis of an active flutter-suppression control law for an aeroelastic wind-tunnel model. It is shown that the resulting control law suppresses flutter over a wide range of subsonic Mach numbers. This is a promising method for synthesizing practical control laws using the results of optimal control theory.
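A sketch of the matching step this record describes: a nonlinear-programming search for the coefficients of a low-order feedback filter so that its frequency response tracks a target "optimal" response over a finite frequency band. The target here is a made-up first-order lag, not a flutter-suppression law, and the filter structure is an assumption for illustration.

```python
import numpy as np
from scipy.optimize import minimize

w = np.linspace(0.1, 10.0, 200)               # finite frequency band (rad/s)
H_target = 2.0 / (1j * w + 2.0)               # hypothetical "optimal" response

def H_filter(p, w):
    b0, b1, a1 = p                            # low-order filter (b0 + b1*s) / (s + a1)
    s = 1j * w
    return (b0 + b1 * s) / (s + a1)

def cost(p):
    # squared error between the filtered and target frequency responses
    return np.sum(np.abs(H_filter(p, w) - H_target) ** 2)

res = minimize(cost, x0=[1.0, 0.0, 1.0], method="Nelder-Mead",
               options={"xatol": 1e-12, "fatol": 1e-14, "maxiter": 10000})
```

Since the target is exactly representable by the filter (b0 = 2, b1 = 0, a1 = 2), the search drives the error essentially to zero; in the paper's setting the target comes from optimal control theory and the residual error measures how well the practical law approximates it.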

Newson, J. R.

1979-01-01

351

The development of a 3D risk analysis method.  

PubMed

Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to the complexity of the calculations, very few software packages, such as SAFETI, can make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on flat terrain with no obstructions or concentration fluctuations, which is quite different from the real situation of a chemical process plant. These models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve the previous limitations; however, it cannot resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed via the combination of the results of a series of CFD simulations with post-processing procedures to obtain 3D individual-risk iso-surfaces. It is believed that such a technique will not be limited to risk analysis at ground level, but can also be extended to aerial, submarine, or space risk analyses in the near future. PMID:17942221
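A conceptual sketch of the combination step this record describes: individual risk on a 3-D grid as the frequency-weighted sum of per-scenario fatality probabilities, each of which would in practice come from a CFD consequence simulation. The scenario frequencies, release points, and the Gaussian fatality surrogate below are all invented stand-ins for CFD output.

```python
import numpy as np

nx = ny = nz = 20
x, y, z = np.meshgrid(np.linspace(-100.0, 100.0, nx),
                      np.linspace(-100.0, 100.0, ny),
                      np.linspace(0.0, 50.0, nz), indexing="ij")

scenarios = [                   # (frequency per year, release point, hazard radius in m)
    (1e-4, (0.0, 0.0, 2.0), 30.0),
    (5e-5, (40.0, 10.0, 2.0), 50.0),
]

risk = np.zeros_like(x)
for freq, (cx, cy, cz), radius in scenarios:
    r = np.sqrt((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2)
    p_fatal = np.exp(-((r / radius) ** 2))    # toy consequence model per scenario
    risk += freq * p_fatal                    # frequency-weighted accumulation

# 3-D individual-risk iso-surfaces would then be level sets of `risk`,
# extracted in post-processing (e.g. with marching cubes).
```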

I, Yet-Pole; Cheng, Te-Lung

2008-05-01

352

Cumulative radiation effect. Part VI: simple nomographic and tabular methods for the solution of practical problems.  

PubMed

In five previous papers, the concept of the Cumulative Radiation Effect (CRE) has been presented as a scale of accumulative sub-tolerance radiation damage. The biological effect generated in normal connective tissue by fractionated or continuous radiation therapy given in any temporal arrangement is described by the CRE on a unified scale of assessment, so that a unique value of the CRE describes a specific level of radiation effect. The basic methods of evaluating CREs were shown in these papers to facilitate a full understanding of the fundamental aspects of the CRE-system, but these methods can be time-consuming and tedious for complex situations. In this paper, simple nomographic and tabular methods for the solution of practical problems are presented. An essential feature of solving a CRE problem is firstly to present it in a concise and readily appreciated form, and, to do this, nomenclature is introduced to describe schedules and regimes as compactly as possible. Simple algebraic equations are derived to describe the CRE achieved by multi-schedule regimes. In these equations, the equivalence conditions existing at the junctions between schedules are not explicit, and the equations are based on the CREs of the constituent schedules assessed individually without reference to their context in the regime as a whole. This independent evaluation of CREs for each schedule results in a considerable simplification in the calculation of complex problems. The calculations are further simplified by the use of suitable tables and nomograms, so that the mathematics involved is reduced to simple arithmetical operations which require at most the use of a slide rule but can be done by hand. The order of procedure in the presentation and calculation of CRE problems can be summarised in an evaluation procedure sheet. The resulting simple methods for solving practical problems of any complexity on the CRE-system are demonstrated by a number of examples. PMID:856533

Kirk, J; Gray, W M; Watson, E R

1977-01-01

353

Systematic monitoring of heathy woodlands in a Mediterranean climate--a practical assessment of methods.  

PubMed

Practical and useful vegetation monitoring methods are needed, and data compatibility and validation of remotely sensed data are desirable. Methods have not been adequately tested for heathy woodlands. We tested the feasibility of detecting species composition shifts in remnant woodland in South Australia, comparing historical (1986) plot data with temporal replicates (2010). We compared the uniformity of species composition among spatially scattered versus spatially clustered plots. At two sites, we compared visual and point-intercept estimation of cover and species diversity. Species composition (presence/absence) shifted between 1986 and 2010. Species that shifted significantly in frequency had low cover. Observations of decreasing species were consistent with predictions from temperature response curves (generalised additive models) for climate change over the period. However, long-term trends could not be distinguished from medium-term dynamics or short-term changes in visibility from this dataset. Difficulties were highlighted in assessing compositional change using historical baselines established for a different purpose, in terms of spatial sampling and accuracy of replicate plots, differences in standard plot methods, and verification of species identifications. Spatially clustered replicate plots were more similar in species composition than spatially scattered plots, improving change-detection potential but decreasing the area of inference. Visual surveys detected more species than point-intercepts. Visual cover estimates differed little from point-intercepts, although they underestimated cover in some instances relative to intercepts. Point-intercepts provide more precise cover estimates of dominant species but took longer and were difficult in steep, heathy terrain. A decision tree based on costs and benefits is presented for assessing monitoring options based on the data presented. 
The appropriate method is a function of available resources, the need for precise cover estimates versus adequate species detection, replication and practical considerations such as access and terrain. PMID:22993028

Guerin, Greg R; Lowe, Andrew J

2013-05-01

354

An Evaluation of EPIC's Analysis of School Practice & Knowledge System. The Effective Practice Incentive Community (EPIC). Research Brief  

ERIC Educational Resources Information Center

Established in 2006 by New Leaders for New Schools[TM], the Effective Practice Incentive Community (EPIC) initiative rewards high-need urban schools showing significant gains in student achievement. In exchange, schools agree to share the practices helping to drive those gains, which they do through an in-depth study of practice, aided by the EPIC…

Sloan, Kay; Pereira-Leon, Maura; Honeyford, Michelle

2012-01-01

355

Skinner Meets Piaget on the Reggio Playground: Practical Synthesis of Applied Behavior Analysis and Developmentally Appropriate Practice Orientations  

ERIC Educational Resources Information Center

We focus on integrating developmentally appropriate practices, the project approach of Reggio Emilia, and a behavior analytic model to support a quality preschool environment. While the above practices often are considered incompatible, we have found substantial overlap and room for integration of these perspectives in practical application. With…

Warash, Bobbie; Curtis, Reagan; Hursh, Dan; Tucci, Vicci

2008-01-01

356

Methods for analysis of urinary glycosaminoglycans.  

PubMed

Methods for the analysis of urinary GAGs that are suitable for routine assays are described. The most popular method for isolation of GAGs from a urine sample is CPC precipitation, even though it is time-consuming. To identify the different types of GAGs excreted, separation by one-dimensional cellulose acetate electrophoresis followed by staining with alcian blue or toluidine blue may suffice for routine purposes. Solvents such as barium acetate, calcium acetate, barbital buffer and pyridine-formic acid are used for the separation. However, the separation of the seven types of GAGs by conventional one-dimensional electrophoresis is difficult, and a discontinuous electrophoretic method with barium acetate buffer and barium acetate buffer containing ethanol has proved effective for the separation. HPLC separation methods are used for assaying the profiles of enzymatic digestion products of GAGs. Advanced HPLC methods for separating intact GAGs of different types are currently unavailable. Unsaturated disaccharides produced with heparitinase and/or heparinase from heparan sulphate and oligosaccharides produced by hyaluronidase digestion of hyaluronic acid can be separated by HPLC. For chondroitin sulphate isomers, unsaturated disaccharides produced by digestion of the samples with chondroitinase ABC or chondroitinase AC are separated by HPLC and determined by their UV absorbance or by fluorescence labelling. Highly sensitive quantitation of chondroitin sulphate isomers is possible by these methods, which are also efficient for the investigation of the constituents of GAG polymers. Some of these methods have been applied to urine samples from patients with, e.g., mucopolysaccharidoses. PMID:3062022

Kodama, C; Kodama, T; Yosizawa, Z

1988-07-29

357

New method for analysis of nonstationary signals  

PubMed Central

Background Analysis of signals by means of symbolic methods consists in calculating a measure of signal complexity, for example informational entropy or Lempel-Ziv algorithmic complexity. For the construction of these entropic measures one uses distributions of symbols representing the analyzed signal. Results We introduce a new signal characteristic named the sequential spectrum that is suitable for the analysis of a wide group of signals, including biosignals. The paper contains a brief review of analyses of artificial signals showing features similar to those of biosignals. An example of using the sequential spectrum to analyze EEG signals registered during different stages of sleep is also presented. Conclusions The sequential spectrum is an effective tool for the general description of nonstationary signals and it has an advantage over the Fourier spectrum. The sequential spectrum enables assessment of pathological changes in EEG signals recorded in persons with epilepsy.
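A minimal sketch of the symbolic step behind such entropic measures: map a signal to binary symbols (here, thresholded at the median) and compute the Shannon entropy of fixed-length symbol words. The sequential spectrum itself is a richer construction in the paper; this only illustrates the entropy-of-symbols idea, with an invented word length and thresholding rule.

```python
import numpy as np

def word_entropy(signal, word_len=3):
    # Symbolize the signal, collect overlapping words, and take the
    # Shannon entropy (in bits) of the empirical word distribution.
    symbols = (np.asarray(signal) > np.median(signal)).astype(int)
    words = np.array([symbols[i:i + word_len]
                      for i in range(len(symbols) - word_len + 1)])
    _, counts = np.unique(words, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
h_noise = word_entropy(rng.standard_normal(5000))                    # irregular signal
h_periodic = word_entropy(np.sin(np.linspace(0, 20 * np.pi, 5000)))  # regular signal
```

An irregular signal uses nearly all 8 three-symbol words uniformly (entropy near 3 bits), while a slowly oscillating signal is dominated by the runs 000 and 111, so its word entropy is far lower; this contrast is what symbolic complexity measures exploit.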

2011-01-01

358

Novel BER analysis method for M-QAM signal in analog/digital hybrid optical transmission  

NASA Astrophysics Data System (ADS)

In practical analog/digital hybrid optical transmission, carriers are modulated by video signals or digital data, and the amplitude of a multiplexed signal composed of these modulation signals is more compressed than that of the carriers. This causes a decrease in the frequency of clipping of the multiplexed signal at the laser threshold. Consequently, the BER of the M-QAM signal in a practical hybrid transmission is lower than that of experimental results for the same optical modulation index (OMI). However, it is difficult to prepare many practical modulation signals for experiments in a laboratory. Therefore, a BER analysis method for a multiplexed signal that includes modulation signals is needed to evaluate the BER and determine the optimum OMI in a practical hybrid transmission. In this paper, we describe such a BER analysis method, which can effectively estimate the BER in a practical hybrid transmission by using representative profiles of the modulation signals. In practical systems, a black picture gives the largest averaged amplitude for the AM-VSB signal, and the most severe conditions for clipping occurrences. However, in such systems, the BER was greatly improved over the BER of a multiplexed signal of carriers. Furthermore, BER degradations due to clipping can be neglected for the AM-VSB signals when setting a practical OMI range.
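A Monte-Carlo illustration of the clipping argument in this record: a multiplex of unmodulated carriers crosses the laser threshold (zero, after normalization) more often than the same carriers amplitude-modulated by synthetic baseband envelopes, because modulation compresses the multiplexed amplitude. The channel count, OMI, and uniform envelope statistics below are invented, not the paper's signal profiles.

```python
import numpy as np

rng = np.random.default_rng(3)
n_ch, n_samp = 40, 200_000
t = np.arange(n_samp)
phases = rng.uniform(0.0, 2.0 * np.pi, n_ch)
freqs = 0.01 + 0.002 * np.arange(n_ch)               # normalized carrier frequencies
carriers = np.cos(2.0 * np.pi * freqs[:, None] * t + phases[:, None])

m = 0.15                                             # per-channel OMI
env = rng.uniform(0.0, 1.0, (n_ch, n_samp))          # crude stand-in for video modulation

sig_cw = 1.0 + m * carriers.sum(axis=0)              # unmodulated carrier multiplex
sig_mod = 1.0 + m * (env * carriers).sum(axis=0)     # modulated multiplex (compressed)

clip_cw = float(np.mean(sig_cw < 0.0))               # clipping frequency, carriers only
clip_mod = float(np.mean(sig_mod < 0.0))             # clipping frequency, modulated
```

The modulated multiplex clips an order of magnitude less often here, which is the mechanism by which the practical BER improves over carrier-only experiments at the same OMI.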

Maeda, Kazuki; Morikura, Susumu

1998-06-01

359

Communication: Quantum polarized fluctuating charge model: A practical method to include ligand polarizability in biomolecular simulations  

NASA Astrophysics Data System (ADS)

We present a simple and practical method to include ligand electronic polarization in molecular dynamics (MD) simulation of biomolecular systems. The method involves periodically spawning quantum mechanical (QM) electrostatic potential (ESP) calculations on an extra set of computer processors using molecular coordinate snapshots from a running parallel MD simulation. The QM ESPs are evaluated for the small-molecule ligand in the presence of the electric field induced by the protein, solvent, and ion charges within the MD snapshot. Partial charges on ligand atom centers are fit through the multi-conformer restrained electrostatic potential (RESP) fit method on several successive ESPs. The RESP method was selected since it produces charges consistent with the AMBER/GAFF force-field used in the simulations. The updated charges are introduced back into the running simulation when the next snapshot is saved. The result is a simulation whose ligand partial charges continuously respond in real-time to the short-term mean electrostatic field of the evolving environment without incurring additional wall-clock time. We show that (1) by incorporating the cost of polarization back into the potential energy of the MD simulation, the algorithm conserves energy when run in the microcanonical ensemble and (2) the mean solvation free energies for 15 neutral amino acid side chains calculated with the quantum polarized fluctuating charge method and thermodynamic integration agree better with experiment relative to the Amber fixed charge force-field.
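A sketch of the charge-fitting step this record builds on: least-squares point charges q that reproduce ESP values at grid points, subject to a fixed total charge, via one Lagrange multiplier. The diatomic geometry, grid, and "QM" ESP values are synthetic, and the hyperbolic restraints that distinguish full RESP from plain ESP fitting are omitted for brevity.

```python
import numpy as np

atoms = np.array([[0.0, 0.0, 0.0], [1.1, 0.0, 0.0]])   # hypothetical diatomic (angstroms)
q_true = np.array([0.3, -0.3])                          # charges used to fake the QM ESP

rng = np.random.default_rng(1)
grid = rng.uniform(-3.0, 3.0, size=(400, 3))
dists = np.linalg.norm(grid[:, None, :] - atoms[None, :, :], axis=2)
keep = dists.min(axis=1) > 1.5                          # drop grid points too close to nuclei

A = 1.0 / dists[keep]                                   # A[j, i]: potential at point j from unit charge i
esp = A @ q_true                                        # synthetic "QM" ESP values

# Constrained normal equations: [[A^T A, 1], [1^T, 0]] [q, lam] = [A^T esp, Q]
n = len(atoms)
M = np.zeros((n + 1, n + 1))
M[:n, :n] = A.T @ A
M[:n, n] = 1.0
M[n, :n] = 1.0
rhs = np.append(A.T @ esp, 0.0)                         # total charge Q = 0
q_fit = np.linalg.solve(M, rhs)[:n]
```

Because the synthetic ESP comes exactly from point charges summing to zero, the constrained fit recovers them; in the paper this fit is redone periodically against fresh QM ESPs so the charges track the evolving electrostatic environment.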

Roy Kimura, S.; Rajamani, Ramkumar; Langley, David R.

2011-12-01

360

A practical method of predicting the loudness of complex electrical stimuli  

NASA Astrophysics Data System (ADS)

The output of speech processors for multiple-electrode cochlear implants consists of current waveforms with complex temporal and spatial patterns. The majority of existing processors output sequential biphasic current pulses. This paper describes a practical method of calculating loudness estimates for such stimuli, in addition to the relative loudness contributions from different cochlear regions. The method can be used either to manipulate the loudness or levels in existing processing strategies, or to control intensity cues in novel sound processing strategies. The method is based on a loudness model described by McKay et al. [J. Acoust. Soc. Am. 110, 1514-1524 (2001)] with the addition of the simplifying approximation that current pulses falling within a temporal integration window of several milliseconds' duration contribute independently to the overall loudness of the stimulus. Three experiments were carried out with six implantees who use the CI24M device manufactured by Cochlear Ltd. The first experiment validated the simplifying assumption, and allowed loudness growth functions to be calculated for use in the loudness prediction method. The following experiments confirmed the accuracy of the method using multiple-electrode stimuli with various patterns of electrode locations and current levels.
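Under the simplifying assumption stated above (pulses inside the temporal integration window contribute independently), a loudness prediction reduces to summing per-pulse contributions from a loudness growth function. The power-law growth function and pulse train below are hypothetical, not the per-subject functions measured in the experiments.

```python
def predict_loudness(pulses, growth, window_ms=4.0):
    """Sum per-pulse loudness contributions inside a temporal integration
    window of a few milliseconds, treating pulses as independent (the
    simplifying assumption of the abstract). Also returns the contribution
    per electrode, i.e. per cochlear region. `pulses` holds
    (time_ms, electrode, current) triples; `growth` maps
    (electrode, current) to that pulse's loudness contribution."""
    if not pulses:
        return 0.0, {}
    t_end = max(t for t, _, _ in pulses)
    per_electrode = {}
    for t, e, i in pulses:
        if t > t_end - window_ms:        # pulse falls inside the window
            per_electrode[e] = per_electrode.get(e, 0.0) + growth(e, i)
    return sum(per_electrode.values()), per_electrode

growth = lambda e, i: (i / 100.0) ** 2.0   # hypothetical power-law growth
total, by_site = predict_loudness(
    [(0.0, 1, 100), (1.0, 2, 100), (2.0, 1, 100)], growth)
```

The per-electrode dictionary corresponds to the "relative loudness contributions from different cochlear regions" mentioned in the abstract.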

McKay, Colette M.; Henshall, Katherine R.; Farrell, Rebecca J.; McDermott, Hugh J.

2003-04-01

361

Chromium speciation by different methods of practical use for routine in situ measurement  

NASA Astrophysics Data System (ADS)

Simple, sensitive, low-cost, and relatively rapid methods for the detection of Cr (III) and Cr (VI) species in natural waters are needed for monitoring and regulatory purposes. Conventional acidification and storage of filtered samples can be a major cause of chromium losses from the 'dissolved' phase. In situ monitoring is thus of paramount importance. The practical usefulness of selected chromium speciation methods was assessed in the laboratory and in the field. Significant discrepancies were found in the Cr (VI) detection efficiency of a selective ion meter based on the diphenylcarbazide method when compared with conventional Zeeman graphite furnace AAS. The efficiency of the DGT (diffusive gradients in thin films) method, based on the in situ deployment of gel/resin units capable of separating labile species of Cr (III) and Cr (VI), looks promising, but is limited by cost considerations and by potential complications in the presence of complexing substances. The method based on Sephadex DEAE A-25 ion exchange resins is quite effective in the separation of Cr species, though it requires on-site facilities, is relatively time-consuming, and is potentially affected by complexing substances.

Barakat, S.; Giusti, L.

2003-05-01

362

Thermal Analysis Methods for Aerobraking Heating  

NASA Technical Reports Server (NTRS)

As NASA begins exploration of other planets, a method of non-propulsively slowing vehicles at the planet, aerobraking, may become a valuable technique for managing vehicle design mass and propellant. An example of this is Mars Reconnaissance Orbiter (MRO), which will launch in late 2005 and reach Mars in March of 2006. In order to save propellant, MRO will use aerobraking to modify the initial orbit at Mars. The spacecraft will dip into the atmosphere briefly on each orbit, and during the drag pass, the atmospheric drag on the spacecraft will slow it, thus lowering the orbit apoapsis. The largest area on the spacecraft, and that most affected by the heat generated during the aerobraking process, is the solar arrays. A thermal analysis of the solar arrays was conducted at NASA Langley, to simulate their performance throughout the entire roughly 6-month period of aerobraking. Several interesting methods were used to make this analysis more rapid and robust. Two separate models were built for this analysis, one in Thermal Desktop for radiation and orbital heating analysis, and one in MSC.Patran for thermal analysis. The results from the radiation model were mapped in an automated fashion to the Patran thermal model that was used to analyze the thermal behavior during the drag pass. A high degree of automation in file manipulation as well as other methods for reducing run time were employed, since toward the end of the aerobraking period the orbit period is short, and in order to support flight operations the runs must be computed rapidly. All heating within the Patran Thermal model was combined in one section of logic, such that data mapped from the radiation model and aeroheating model, as well as skin temperature effects on the aeroheating and surface radiation, could be incorporated easily. This approach calculates the aeroheating at any given node, based on its position and temperature as well as the density and velocity at that trajectory point. Run times on several different processors, computer hard drives, and operating systems (Windows versus Linux) were evaluated.

Amundsen, Ruth M.; Gasbarre, Joseph F.; Dec, John A.

2005-01-01

363

Numerical analysis method for linear induction machines.  

NASA Technical Reports Server (NTRS)

A numerical analysis method has been developed for linear induction machines such as liquid metal MHD pumps and generators and linear motors. Arbitrary phase currents or voltages can be specified and the moving conductor can have arbitrary velocity and conductivity variations from point to point. The moving conductor is divided into a mesh and coefficients are calculated for the voltage induced at each mesh point by unit current at every other mesh point. Combining the coefficients with the mesh resistances yields a set of simultaneous equations which are solved for the unknown currents.
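The equation assembly described above, in which induced-voltage coefficients between mesh points are combined with mesh resistances into simultaneous equations for the unknown currents, can be sketched as a dense linear solve. The 2-point mesh values below are illustrative, not from the paper.

```python
def solve_mesh_currents(resistances, coupling, emf):
    """Assemble and solve the mesh equations Z @ I = emf, where the
    impedance matrix is the mesh resistances on the diagonal plus the
    voltage induced at point j per unit current at point k
    (`coupling[j][k]`). Solved by plain Gaussian elimination with
    partial pivoting."""
    n = len(resistances)
    Z = [[coupling[j][k] + (resistances[j] if j == k else 0.0)
          for k in range(n)] for j in range(n)]
    b = list(emf)
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(Z[r][col]))
        Z[col], Z[piv] = Z[piv], Z[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = Z[r][col] / Z[col][col]
            for k in range(col, n):
                Z[r][k] -= f * Z[col][k]
            b[r] -= f * b[col]
    I = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(Z[r][k] * I[k] for k in range(r + 1, n))
        I[r] = (b[r] - s) / Z[r][r]
    return I

# Hypothetical 2-point mesh: equal resistances, symmetric coupling,
# unit EMF applied at the first point only
I = solve_mesh_currents([2.0, 2.0], [[0.0, 0.5], [0.5, 0.0]], [1.0, 0.0])
```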

Elliott, D. G.

1972-01-01

364

Pseudophosphatases: methods of analysis and physiological functions.  

PubMed

Protein tyrosine phosphatases (PTPs) are key enzymes in the regulation of cellular homeostasis and signaling pathways. Strikingly, not all PTPs bear enzymatic activity. A considerable fraction of PTPs are enzymatically inactive and are known as pseudophosphatases. Despite the lack of activity they execute pivotal roles in development, cell biology and human disease. The present review is focused on the methods used to identify pseudophosphatases, their targets, and physiological roles. We present a strategy for detailed enzymatic analysis of inactive PTPs, regulation of inactive PTP domains and identification of binding partners. Furthermore, we provide a detailed overview of human pseudophosphatases and discuss their regulation of cellular processes and functions in human pathologies. PMID:24064037

Kharitidi, Dmitri; Manteghi, Sanaz; Pause, Arnim

2014-01-15

365

Method and apparatus for chromatographic quantitative analysis  

DOEpatents

An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

Fritz, James S. (Ames, IA); Gjerde, Douglas T. (Ames, IA); Schmuckler, Gabriella (Haifa, IL)

1981-06-09

366

Silent boundary methods for transient wave analysis  

NASA Astrophysics Data System (ADS)

A dynamic model is developed that is designed to absorb infinitely radiating waves in a finite computational grid. The analysis studies the problem of soil-structure interaction, where energy propagates outward, from a region near a structure, toward the boundaries. The proposed method, the extended paraxial boundary, is derived from one-directional wave theories. The extended paraxial boundary is compared, both analytically and numerically, with two other transmitting boundaries, the standard viscous and unified viscous methods. Analytical results indicate that the extended paraxial boundary enjoys a distinct advantage in cancelling wave reflections. Numerical tests reveal a small superiority over the viscous approaches. The relationships between the silent boundaries and Rayleigh waves, spherically symmetric and axially symmetric waves, nonlinear waves, anisotropic media, and numerical stability are discussed.

Cohen, M.

1980-09-01

367

Knowledge, Attitude, Practice and Preferences of Contraceptive Methods in Udupi District, Karnataka  

PubMed Central

Objective To assess the knowledge, attitude, practice and preferences regarding contraceptive methods among the female population, and to determine the association between knowledge and attitude on contraceptive methods and selected variables. Materials and methods A descriptive survey of 136 females between 18-45 years of age was done using a structured knowledge questionnaire, a structured attitude scale and an opinionnaire on practice and preference during January 2012 to February 2012 at Moodu Alevoor village, Udupi district, Karnataka. Simple random sampling was used to select the village and a purposive sampling technique was used to select the sample. Results It was shown that 48.5% were 26-35 years of age, 92% were Hindus, 45.6% had higher secondary education, 41.2% were housewives, 55.9% had a family monthly income below 5000 rupees, 49.3% were from nuclear families, 64% were married between 19-25 years, 43.3% had 2-3 years of married life and 52.2% had one pregnancy. A majority (55.9%) had one living child and 98.5% got information through health personnel. A majority (67.60%) had moderate knowledge of contraceptive methods and 17.60% had high knowledge. A majority (87.50%) had a favourable attitude and 12.50% had an unfavourable attitude towards contraceptive methods. Of the studied women, 38.23% did not use any contraceptive method, 19.85% used OCPs and a minimum of 1.47% used injection as a contraceptive method. In this study 37.5% preferred OCPs as Rank 1, the male condom (22.1%) as Rank 2 and injection (16.3%) as Rank 3. There was an association between knowledge and educational status (χ2 = 47.14, p = 0.001), occupation (χ2 = 15.81, p = 0.044), family monthly income (χ2 = 6.473, p = 0.039) and duration of marriage (χ2 = 6.721, p = 0.035). There was no association between attitude and the studied variables. Conclusion The study showed that the majority of the females had moderate knowledge and a favourable attitude.

Sheilini, Melita; Nayak, Asha

2013-01-01

368

A comprehensive evaluation of normalization methods for Illumina high-throughput RNA sequencing data analysis.  

PubMed

During the last 3 years, a number of approaches for the normalization of RNA sequencing data have emerged in the literature, differing both in the type of bias adjustment and in the statistical strategy adopted. However, as data continue to accumulate, there has been no clear consensus on the appropriate normalization method to be used or the impact of a chosen method on the downstream analysis. In this work, we focus on a comprehensive comparison of seven recently proposed normalization methods for the differential analysis of RNA-seq data, with an emphasis on the use of varied real and simulated datasets involving different species and experimental designs to represent data characteristics commonly observed in practice. Based on this comparison study, we propose practical recommendations on the appropriate normalization method to be used and its impact on the differential analysis of RNA-seq data. PMID:22988256
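As one concrete example of the scaling normalizations typically included in such comparisons, the sketch below implements a median-of-ratios size-factor estimator (the DESeq-style definition, reconstructed from its published form rather than from this paper).

```python
import math
from statistics import median

def median_of_ratios_factors(counts):
    """Median-of-ratios size factors: each sample's factor is the median,
    over genes, of that sample's count divided by the gene's geometric
    mean across samples. `counts` is a genes x samples matrix; genes with
    any zero count are skipped when forming the reference."""
    n_samples = len(counts[0])
    ref = []
    for row in counts:
        if all(c > 0 for c in row):
            ref.append(math.exp(sum(math.log(c) for c in row) / n_samples))
        else:
            ref.append(None)
    factors = []
    for j in range(n_samples):
        ratios = [counts[i][j] / ref[i] for i in range(len(counts))
                  if ref[i] is not None]
        factors.append(median(ratios))
    return factors

# Hypothetical toy data: sample 2 is sample 1 sequenced twice as deeply,
# so its size factor should come out twice as large
counts = [[10, 20], [100, 200], [5, 10]]
f = median_of_ratios_factors(counts)
```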

Dillies, Marie-Agnès; Rau, Andrea; Aubert, Julie; Hennequet-Antier, Christelle; Jeanmougin, Marine; Servant, Nicolas; Keime, Céline; Marot, Guillemette; Castel, David; Estelle, Jordi; Guernec, Gregory; Jagla, Bernd; Jouneau, Luc; Laloë, Denis; Le Gall, Caroline; Schaëffer, Brigitte; Le Crom, Stéphane; Guedj, Mickaël; Jaffrézic, Florence

2013-11-01

369

A Quantitative Analysis and Natural History of B. F. Skinner's Coauthoring Practices  

PubMed Central

This paper describes and analyzes B. F. Skinner's coauthoring practices. After identifying his 35 coauthored publications and 27 coauthors, we analyze his coauthored works by their form (e.g., journal articles) and kind (e.g., empirical); identify the journals in which he published and their type (e.g., data-type); describe his overall and local rates of publishing with his coauthors (e.g., noting breaks in the latter); and compare his coauthoring practices with his single-authoring practices (e.g., form, kind, journal type) and with those in the scientometric literature (e.g., majority of coauthored publications are empirical). We address these findings in the context of describing the natural history of Skinner's coauthoring practices. Finally, we describe some limitations in our methods and offer suggestions for future research.

McKerchar, Todd L; Morris, Edward K; Smith, Nathaniel G

2011-01-01

370

A Quantitative Analysis and Natural History of B. F. Skinner's Coauthoring Practices.  

PubMed

This paper describes and analyzes B. F. Skinner's coauthoring practices. After identifying his 35 coauthored publications and 27 coauthors, we analyze his coauthored works by their form (e.g., journal articles) and kind (e.g., empirical); identify the journals in which he published and their type (e.g., data-type); describe his overall and local rates of publishing with his coauthors (e.g., noting breaks in the latter); and compare his coauthoring practices with his single-authoring practices (e.g., form, kind, journal type) and with those in the scientometric literature (e.g., majority of coauthored publications are empirical). We address these findings in the context of describing the natural history of Skinner's coauthoring practices. Finally, we describe some limitations in our methods and offer suggestions for future research. PMID:22532732

McKerchar, Todd L; Morris, Edward K; Smith, Nathaniel G

2011-01-01

371

Perceptions and Attitudes of Medical Students towards Two Methods of Assessing Practical Anatomy Knowledge  

PubMed Central

Objectives: Traditionally, summative practical examination in anatomy takes the form of ‘spotters’ consisting of a stream of prosections, radiological images and dissections with pins indicating specific structures. Recently, we have started to administer similar examinations online using the quiz facility in Moodle™ (a free, open-source web application for producing modular internet-based courses) in addition to the traditional format. This paper reports on an investigation into students’ perceptions of each assessment environment. Methods: Over a 3-year period, practical assessment in anatomy was conducted either in traditional format or online via learning management software called Moodle™. All students exposed to the two examination formats at the College of Medicine & Health Sciences, Sultan Qaboos University, Oman, were divided into two categories: junior (Year 3) and senior (Year 4). An evaluation of their perception of both examination formats was conducted using a self-administered questionnaire consisting of restricted and free response items. Results: More than half of all students expressed a clear preference for the online environment and believed it was more exam-friendly. This preference was higher amongst senior students. Compared to females, male students preferred the online environment. Senior students were less likely to study on cadavers when the examination was conducted online. Specimen quality, ability to manage time, and seating arrangements were major advantages identified by students who preferred the online format. Conclusion: Computer-based practical examinations in anatomy appeared to be generally popular with our students. The students adopted a different approach to study when the exam was conducted online as compared to the traditional ‘steeplechase’ format.

Inuwa, Ibrahim M; Taranikanti, Varna; Al-Rawahy, Maimouna; Habbal, Omar

2011-01-01

372

Quality changes of anchovy (Stolephorus heterolobus) under refrigerated storage of different practical industrial methods in Thailand.  

PubMed

Quality changes of anchovy (Stolephorus heterolobus) muscle during 7 days of refrigerated storage with ice and without ice were studied using several indicators: changes in ATP degradation products, K-value, TVB-N, TMA-N, lactic acid, biogenic amines, and sensory and microbiological analysis. During 7 days of refrigerated storage with ice and without ice, K-value, TVB-N, TMA-N and D,L-lactic acid contents increased with longer storage time (p < 0.05). Major biogenic amines found in anchovy muscle during refrigerated storage were cadaverine, agmatine and tyramine, followed by putrescine and histamine. In sensory evaluation of skin and external odour, progressive decreases were observed as refrigeration time progressed. Storage of anchovy with ice resulted in better maintenance of sensory quality, better control of microbial activity, and a slowing down of biochemical degradation mechanisms. This result supports the use of refrigerated storage with ice as a practical preliminary chilling method for anchovy during industrial processing. PMID:24493885

Chotimarkorn, Chatchawan

2014-02-01

373

An NCME Instructional Module on Developing and Administering Practice Analysis Questionnaires  

ERIC Educational Resources Information Center

The purpose of a credentialing examination is to assure the public that individuals who work in an occupation or profession have met certain standards. To be consistent with this purpose, credentialing examinations must be job related, and this requirement is typically met by developing test plans based on an empirical job or practice analysis.…

Raymond, Mark R.

2005-01-01

374

Interventions for Adolescent Struggling Readers: A Meta-Analysis with Implications for Practice  

ERIC Educational Resources Information Center

This meta-analysis offers decision-makers research-based guidance for intervening with adolescent struggling readers. The authors outline major implications for practice: (1) Adolescence is not too late to intervene. Interventions do benefit older students; (2) Older students with reading difficulties benefit from interventions focused at both the…

Scammacca, Nancy; Roberts, Greg; Vaughn, Sharon; Edmonds, Meaghan; Wexler, Jade; Reutebuch, Colleen Klein; Torgesen, Joseph K.

2007-01-01

375

Identifying Evidence-Based Practices in Special Education through High Quality Meta-Analysis  

ERIC Educational Resources Information Center

The purpose of this study was to determine if meta-analysis can be used to enhance efforts to identify evidence-based practices (EBPs). In this study, the quality of included studies acted as the moderating variable. I used the quality indicators for experimental and quasi-experimental research developed by Gersten, Fuchs, Coyne, Greenwood, and…

Friedt, Brian

2012-01-01

376

Integration of Pharmacy Practice and Pharmaceutical Analysis: Quality Assessment of Laboratory Performance.  

ERIC Educational Resources Information Center

Laboratory portions of courses in pharmacy practice and pharmaceutical analysis at the Medical University of South Carolina are integrated and coordinated to provide feedback on student performance in compounding medications. Students analyze the products they prepare, with early exposure to compendia requirements and other references. Student…

McGill, Julian E.; Holly, Deborah R.

1996-01-01

377

Using Performance Analysis for Training in an Organization Implementing ISO-9000 Manufacturing Practices: A Case Study.  

ERIC Educational Resources Information Center

This case study examines the application of the Performance Analysis for Training (PAT) Model in an organization that was implementing ISO-9000 (International Standards Organization) processes for manufacturing practices. Discusses the interaction of organization characteristics, decision maker characteristics, and analyst characteristics to…

Kunneman, Dale E.; Sleezer, Catherine M.

2000-01-01

378

Research and Practice on Curriculum Integration on Power System Analysis and Electrified Railway Power Supply System  

Microsoft Academic Search

In view of the fact that undergraduates presently cannot effectively associate the power system analysis (PSA) and electrified railway power supply system (ERPSS) curricula in the classroom, and according to the curricular structure of the Electric Engineering and Automation Specialty (EEAS), an investigation and practice of curriculum integration based on PSA and ERPSS is proposed in this paper. On the basis of broad and deep investigation, the

Hongsheng Su; Haiying Dong; Feng Zhao

2009-01-01

379

AAMFT Master Series Tapes: An Analysis of the Inclusion of Feminist Principles into Family Therapy Practice.  

ERIC Educational Resources Information Center

Content analysis of 23 American Association for Marriage and Family Therapy Master Series tapes was used to determine how well feminist behaviors have been incorporated into ideal family therapy practice. Feminist behaviors were infrequent, being evident in fewer than 3% of time blocks in event sampling and 10 of 39 feminist behaviors of the…

Haddock, Shelley A.; MacPhee, David; Zimmerman, Toni Schindler

2001-01-01

380

AN EMPIRICAL ANALYSIS OF QUALITY MANAGEMENT PRACTICES IN JAPANESE MANUFACTURING COMPANIES  

Microsoft Academic Search

Quality management represents company-wide activities to improve the quality level of products and work through customer orientation, continuous quality improvement, employees' involvement, etc., to establish and sustain a competitive advantage. This paper presents the results of an empirical analysis of quality management practices and their impact on competitive performance in Japanese manufacturing companies. This result has been derived from the third

Phan Chi Anh; Yoshiki Matsui

381

A Secondary Analysis of the Impact of School Management Practices on School Performance  

ERIC Educational Resources Information Center

The purpose of this study was to conduct a secondary analysis of the impact of school management practices on school performance utilizing a survey design of School and Staffing (SASS) data collected by the National Center for Education Statistics (NCES) of the U.S. Department of Education, 1999-2000. The study identifies those school management…

Talbert, Dale A.

2009-01-01

382

Design of a practical model-observer-based image quality assessment method for CT imaging systems  

NASA Astrophysics Data System (ADS)

The channelized Hotelling observer (CHO) is a powerful method for quantitative image quality evaluations of CT systems and their image reconstruction algorithms. It has recently been used to validate the dose reduction capability of iterative image-reconstruction algorithms implemented on CT imaging systems. The use of the CHO for routine and frequent system evaluations is desirable both for quality assurance evaluations as well as further system optimizations. The use of channels substantially reduces the amount of data required to achieve accurate estimates of observer performance. However, the number of scans required is still large even with the use of channels. This work explores different data reduction schemes and designs a new approach that requires only a few CT scans of a phantom. For this work, the leave-one-out likelihood (LOOL) method developed by Hoffbeck and Landgrebe is studied as an efficient method of estimating the covariance matrices needed to compute CHO performance. Three different kinds of approaches are included in the study: a conventional CHO estimation technique with a large sample size, a conventional technique with fewer samples, and the new LOOL-based approach with fewer samples. The mean value and standard deviation of the area under the ROC curve (AUC) are estimated by a shuffle method. Both simulation and real data results indicate that an 80% data reduction can be achieved without loss of accuracy. This data reduction makes the proposed approach a practical tool for routine CT system assessment.
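For reference, the conventional (large-sample) CHO computation that the LOOL-based approach is meant to make cheaper can be sketched for a two-channel case: estimate class means and a pooled channel covariance, form the Hotelling template, and map detectability d' to AUC under Gaussian statistics. The channel outputs below are simulated illustrations; the LOOL estimator from the abstract would replace the plain sample covariance.

```python
import math
import random

def cho_detectability(signal_samples, noise_samples):
    """Two-channel channelized Hotelling observer: template w = S^-1 dm,
    where dm is the mean channel-output difference and S the pooled
    intra-class covariance; AUC = Phi(d'/sqrt(2)) under Gaussian stats."""
    def mean(v):
        n = len(v)
        return [sum(x[k] for x in v) / n for k in range(2)]
    def cov(v, m):
        n = len(v)
        c = [[0.0, 0.0], [0.0, 0.0]]
        for x in v:
            d = [x[0] - m[0], x[1] - m[1]]
            for a in range(2):
                for b in range(2):
                    c[a][b] += d[a] * d[b] / (n - 1)
        return c
    ms, mn = mean(signal_samples), mean(noise_samples)
    cs, cn = cov(signal_samples, ms), cov(noise_samples, mn)
    S = [[(cs[a][b] + cn[a][b]) / 2 for b in range(2)] for a in range(2)]
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    Sinv = [[S[1][1] / det, -S[0][1] / det],
            [-S[1][0] / det, S[0][0] / det]]
    dm = [ms[0] - mn[0], ms[1] - mn[1]]
    w = [Sinv[0][0] * dm[0] + Sinv[0][1] * dm[1],
         Sinv[1][0] * dm[0] + Sinv[1][1] * dm[1]]
    d2 = w[0] * dm[0] + w[1] * dm[1]          # d'^2 = dm^T S^-1 dm
    auc = 0.5 * (1.0 + math.erf(math.sqrt(d2) / 2.0))
    return math.sqrt(d2), auc

# Simulated channel outputs: unit-variance noise, mean shift (1.0, 0.5)
rng = random.Random(0)
noise = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(200)]
signal = [(rng.gauss(0, 1) + 1.0, rng.gauss(0, 1) + 0.5) for _ in range(200)]
dprime, auc = cho_detectability(signal, noise)
```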

Tseng, Hsin-Wu; Fan, Jiahua; Cao, Guangzhi; Kupinski, Matthew A.; Sainath, Paavana

2014-03-01

383

Nonsurgical management of hypertrophic scars: evidence-based therapies, standard practices, and emerging methods.  

PubMed

Hypertrophic scars, resulting from alterations in the normal processes of cutaneous wound healing, are characterized by proliferation of dermal tissue with excessive deposition of fibroblast-derived extracellular matrix proteins, especially collagen, over long periods, and by persistent inflammation and fibrosis. Hypertrophic scars are among the most common and frustrating problems after injury. As current aesthetic surgical techniques become more standardized and results more predictable, a fine scar may be the demarcating line between acceptable and unacceptable aesthetic results. However, hypertrophic scars remain notoriously difficult to eradicate because of the high recurrence rates and the incidence of side effects associated with available treatment methods. This review explores the various treatment methods for hypertrophic scarring described in the literature including evidence-based therapies, standard practices, and emerging methods, attempting to distinguish those with clearly proven efficiency from anecdotal reports about therapies of doubtful benefits while trying to differentiate between prophylactic measures and actual treatment methods. Unfortunately, the distinction between hypertrophic scar treatments and keloid treatments is not obvious in most reports, making it difficult to assess the efficacy of hypertrophic scar treatment. PMID:17576505

Atiyeh, Bishara S

2007-01-01

384

Practical Use of Rotordynamic Analysis to Correct a Vertical Long Shaft Pump's Whirl Problem  

SciTech Connect

The use of long shaft vertical pumps is common practice in the nuclear waste processing industry. Unfortunately, when such pumps employ plain cylindrical journal bearings, they tend to suffer from rotordynamic instability problems due to the inherent lightly-loaded condition that the vertical orientation places on the bearings. This paper describes a case study in which the authors utilized rotordynamic analysis and experimental vibration analysis to diagnose such a problem and designed replacement tilting-pad bearings to solve the problem.

Leishear, R.A.

2002-05-10

385

Practical advanced analysis for design of laterally unrestrained steel planar frames under in-plane loads  

Microsoft Academic Search

Currently, advanced analysis procedures that are suitable for practical design of steel planar frames subjected to in-plane loads cannot detect out-of-plane instability. As such, structural design using these procedures requires separate checks to ensure that out-of-plane instability does not govern the frame strength. It is thus the objective of this research to extend in-plane advanced analysis procedures to account for

Kamaiton Wongkaew

2000-01-01

386

A Practical Method for Transforming Free-Text Eligibility Criteria into Computable Criteria  

PubMed Central

Formalizing eligibility criteria in a computer-interpretable language would facilitate eligibility determination for study subjects and the identification of studies on similar patient populations. Because such formalization is extremely labor intensive, we transform the problem from one of fully capturing the semantics of criteria directly in a formal expression language to one of annotating free-text criteria in a format called ERGO Annotation. The annotation can be done manually, or it can be partially automated using natural-language processing techniques. We evaluated our approach in three ways. First, we assessed the extent to which ERGO Annotations capture the semantics of 1000 eligibility criteria randomly drawn from ClinicalTrials.gov. Second, we demonstrated the practicality of the annotation process in a feasibility study. Finally, we demonstrate the computability of ERGO Annotation by using it to (1) structure a library of eligibility criteria, (2) search for studies enrolling specified study populations, and (3) screen patients for potential eligibility for a study. We therefore demonstrate a new and practical method for incrementally capturing the semantics of free-text eligibility criteria into computable form.

Tu, Samson W.; Peleg, Mor; Carini, Simona; Bobak, Michael; Ross, Jessica; Rubin, Daniel; Sim, Ida

2011-01-01

387

Chapter 11. Community analysis-based methods  

SciTech Connect

Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.
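As a minimal illustration of the multivariate comparisons such methods rely on, the sketch below scores a water-sample community profile against host source libraries with Bray-Curtis dissimilarity; the profiles are hypothetical, not data from the chapter.

```python
def bray_curtis(p, q):
    """Bray-Curtis dissimilarity between two relative-abundance profiles,
    one of the standard multivariate tools for comparing community
    fingerprints (e.g., TRFLP peak profiles): 0 for identical profiles,
    1 for profiles sharing nothing."""
    num = sum(abs(a - b) for a, b in zip(p, q))
    den = sum(a + b for a, b in zip(p, q))
    return num / den if den else 0.0

def nearest_source(sample, sources):
    """Assign a water sample's profile to the most similar host library."""
    return min(sources, key=lambda name: bray_curtis(sample, sources[name]))

# Hypothetical host libraries and a contaminated-water profile
sources = {"human": [0.5, 0.3, 0.1, 0.1], "gull": [0.1, 0.1, 0.4, 0.4]}
hit = nearest_source([0.45, 0.35, 0.1, 0.1], sources)
```

In practice such assignments would be backed by ordination and clustering over many profiles, as the chapter's emphasis on appropriate multivariate statistics suggests.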

Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

2010-05-01

388

Reforming High School Science for Low-Performing Students Using Inquiry Methods and Communities of Practice  

NASA Astrophysics Data System (ADS)

Some schools fall short of the high demand to increase science scores on state exams because low-performing students enter high school unprepared for high school science. Low-performing students are not successful in high school for many reasons. However, using inquiry methods has improved students' understanding of science concepts. The purpose of this qualitative research study was to investigate the teachers' lived experiences with using inquiry methods to motivate low-performing high school science students in an inquiry-based program called Xtreem Science. Fifteen teachers were selected from the Xtreem Science program, a program designed to assist teachers in motivating struggling science students. The research questions involved understanding (a) teachers' experiences in using inquiry methods, (b) challenges teachers face in using inquiry methods, and (c) how teachers describe students' responses to inquiry methods. The data collection and analysis strategy included capturing and understanding the teachers' feelings, perceptions, and attitudes in their lived experience of teaching with inquiry methods and of motivating struggling students. Analysis of interview responses revealed that teachers had good experiences with inquiry, saying it shaped their teaching style and approach to topics, and that students' learning improved when inquiry methods were used. Inquiry gave low-performing students opportunities to catch up and learn information that moved them to the next level of science courses. Implications for positive social change include providing teachers and school district leaders with information to help improve the performance of low-performing science students.

Bolden, Marsha Gail

389

Second harmonic generation by micropowders: a revision of the Kurtz-Perry method and its practical application  

NASA Astrophysics Data System (ADS)

We theoretically study the second harmonic generation by powder crystal monolayers and by thick samples of crystalline powder with particle size in the range of microns. Contrary to usual treatments, the light scattering by the particles is explicitly introduced in the model. The cases of powder in air and in an index-matching liquid under the most common experimental geometries are considered. Special attention is paid to the possibility of determining the value of some nonlinear optical coefficients from the experiments. The limitations and shortcomings of the classical Kurtz and Perry method (Kurtz and Perry in J Appl Phys 39:3798, 1968) and the most common practical misuses of it are discussed. It is argued that many of the experimental works based on that method oversimplify the technique and contain important errors. In order to obtain reliable values of the nonlinear coefficients, an appropriate experimental configuration and analysis of the data are pointed out. The analysis is especially simple in the case of uniaxial phase-matchable materials for which simple analytical expressions are derived.

Aramburu, I.; Ortega, J.; Folcia, C. L.; Etxebarria, J.

2014-07-01

391

Comparative Analysis of Reoviridae Reverse Genetics Methods  

PubMed Central

Effective methods to engineer the segmented, double-stranded RNA genomes of Reoviridae viruses have only recently been developed. Mammalian orthoreoviruses (MRV) and bluetongue virus (BTV) can be recovered from entirely recombinant reagents, significantly improving the capacity to study the replication, pathogenesis, and transmission of these viruses. Conversely, rotaviruses (RVs), which are the major etiological agent of severe gastroenteritis in infants and children, have thus far only been modified using single-segment replacement methods. Reoviridae reverse genetics techniques universally rely on site-specific initiation of transcription by T7 RNA polymerase to generate the authentic 5′ end of recombinant RNA segments, but they vary in how the RNAs are introduced into cells: recombinant BTV is recovered by transfection of in vitro transcribed RNAs, whereas recombinant MRV and RV RNAs are transcribed intracellularly from transfected plasmid cDNAs. Additionally, several parameters have been identified in each system that are essential for recombinant virus recovery. Generating recombinant BTV requires the use of 5′ capped RNAs and is enhanced by multiple rounds of RNA transfection, suggesting that translation of viral proteins is likely the rate-limiting step. For RV, the efficiency of recovery is almost entirely dependent on the strength of the selection mechanism used to isolate the single-segment recombinant RV from the unmodified helper virus. The reverse genetics methods for BTV and RV will be presented and compared to the previously described MRV methods. Analysis and comparison of each method suggest several key lines of research that might lead to a reverse genetics system for RV, analogous to those used for MRV and BTV.

Trask, Shane D.; Boehme, Karl W.; Dermody, Terence S.; Patton, John T.

2012-01-01

392

Limitations in simulator time-based human reliability analysis methods  

SciTech Connect

Developments in human reliability analysis (HRA) methods have evolved slowly. Current methods are little changed from those of almost a decade ago, particularly in the use of time-reliability relationships. While these methods were suitable as an interim step, the time (and the need) has come to specify the next evolution of HRA methods. As with any performance-oriented data source, power plant simulator data have no direct connection to HRA models. First, errors reported in the data are normal deficiencies observed in human performance, whereas failures are events modeled in probabilistic risk assessments (PRAs): not all errors cause failures, and not all failures are caused by errors. Second, the times at which actions are taken provide no measure of the likelihood of failure to act correctly within an accident scenario. Inferences can be made about human reliability, but they must be made with great care; specific limitations are discussed. Simulator performance data are useful in providing qualitative evidence of the variety of error types and their potential influences on operating systems. More work is required to combine recent developments in the psychology of error with the qualitative data collected at simulators. Until such data become openly available, however, this advance will not be practical.

Wreathall, J.

1989-01-01

393

A hybrid method for analysis of radiation characteristics of short backfire antenna  

Microsoft Academic Search

In this article, we develop a hybrid method with clear physical interpretations of its solutions and a simple, practical procedure for the analysis of the radiation characteristics of a short backfire antenna (SBFA), a highly efficient radiator of simple and compact construction used for communication, tracking, and telemetry. We carry out the numerical calculation and obtain satisfying results

Qingsheng Zeng

1997-01-01

394

The Real-Time Case Method: Description and Analysis of the First Implementation  

ERIC Educational Resources Information Center

This article describes the first implementation of the "Real-time Case Method" (RTCM)--a new instructional practice that makes use of various technologies to create a new type of case study. Data obtained from five instructors at four business schools in the U.S. and Canada were analyzed using analytic induction. Analysis suggests RTCM was…

Kilbane, Clare; Theroux, James; Sulej, Julian; Bisson, Barry; Hay, David; Boyer, Dennis

2004-01-01

395

Analysis Method for Quantifying Vehicle Design Goals  

NASA Technical Reports Server (NTRS)

A document discusses a method for using Design Structure Matrices (DSMs), coupled with high-level tools representing important life-cycle parameters, to comprehensively conceptualize a flight/ground space transportation system design by dealing with such variables as performance, up-front costs, downstream operations costs, and reliability. The approach also weighs operational approaches by their effect on upstream design variables, so that linkages between operations and those variables can be established readily yet defensibly. To avoid the range of problems that defeated previous methods of dealing with the complexity of transportation design, and to cut down on the inefficient use of resources, the method identifies the areas promising enough to warrant a higher grade of analysis, along with the linkages at issue between operations and other factors. Ultimately, the approach is designed to save resources and time, and to allow the evolution of operable space transportation system technology, design, and conceptual system approach targets.

Fimognari, Peter; Eskridge, Richard; Martin, Adam; Lee, Michael

2007-01-01
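
As a rough illustration of the Design Structure Matrix idea (the parameter names and dependencies below are hypothetical, not taken from the document), a DSM is just a square dependency matrix whose above-diagonal entries flag feedback couplings that force design iteration:

```python
import numpy as np

# A tiny illustrative DSM: rows/cols are design parameters; a nonzero entry
# at (i, j) means parameter i depends on parameter j.
params = ["performance", "up-front cost", "ops cost", "reliability"]
dsm = np.array([
    [0, 1, 0, 1],   # performance depends on up-front cost and reliability
    [1, 0, 0, 0],   # up-front cost depends on performance
    [0, 0, 0, 1],   # ops cost depends on reliability
    [1, 0, 1, 0],   # reliability depends on performance and ops cost
])

# Entries above the diagonal mark feedback couplings: a downstream variable
# (e.g. an operations cost) feeding back into an upstream design decision.
feedback = [(params[i], params[j])
            for i in range(len(params))
            for j in range(i + 1, len(params))
            if dsm[i, j]]
```

Reordering rows and columns to minimize such feedback entries is the usual next step in DSM-based design planning.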

396

Analysis: a contribution to psychological method  

Microsoft Academic Search

Any analysis into observable elements seems to conflict with the facts. Meaningful analysis, aspective analysis, and relational analysis are valid. These three types seem to form a progressive series in any exhaustive treatment of a psychological object. Meaningful analysis is the presentation of the object as immediately experienced. In aspective analysis certain aspects of the object are selected and within

A. J. Harris

1929-01-01

397

[Comparison of optical and ultrasound biometry and assessment of using both methods in practice].  

PubMed

Purpose: The present study compares the accuracy of optical biometry (OB) and ultrasound biometry (UB) based on postoperative best corrected visual acuity (BCVA) results, and assesses the extent to which each measurement method is used in current practice. Methods: A total of 335 eyes were operated for cataract at Beskydské oční centrum (Beskydy Eye Centre; BOC), Frýdek-Místek hospital, between 7 February 2007 and 7 April 2010. All patients were examined with both the IOL-Master and the Ocu-Scan prior to surgery. All surgeries were performed using microcoaxial phacoemulsification through a 2.2 mm incision, implanting an AcrySof SP, SPN, or SPN IQ IOL. BCVA was examined three months after surgery. We first calculated medians of anterior-posterior axial length (AL) values measured with both methods, for the whole set and for subsets created according to eye length. The difference between the two methods was calculated in mm. Using the BCVA results, we calculated the exact dioptric power of the IOL that should have been implanted in the lens bag to ensure postoperative emmetropia. For each eye, we determined the deviation of the IOL power for emmetropia given by the optical biometer from this exact value; ultrasound biometry results were processed in the same way. The SRK-T formula was used for both biometries. We also counted the deviations above 1.0 D and 2.0 D for both methods. Results: The median axial eye length was 23.08 mm by optical biometry and 22.93 mm by ultrasound biometry, a difference of 0.15 mm (150 microns), which equals the difference between the average values of the paired measurements. The average deviation of the implanted IOL's dioptric power from the retrospectively established optimum was 0.40 D lower with optical biometry and 0.16 D lower with ultrasound biometry. Assessed against the curves of both methods in a polynomial graph, this result confirms that the two methods correspond closely, so selecting either method would not adversely affect determination of the implanted IOL's dioptric power. Comparing the frequency of deviations above 1.0 D and 2.0 D from the exact IOL power, however, we found a substantially higher percentage of deviations with UB - up to 25% of the total set above 1.0 D. Conclusion: Comparison of the accuracy and comfort of AL measurement with both methods justifies a clear preference for optical biometry in current practice. When measurement with the ultrasound probe is performed correctly, the results of the two methods correspond closely and the methods are mutually replaceable; ultrasound biometry is therefore adequate when optical biometry cannot be used. Key words: optical and ultrasound biometry, accurate dioptric power of the IOL, formulas, polynomial graph. PMID:24862369

Cech, R; Utíkal, T; Juhászová, J

2014-01-01
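
The study above used the SRK-T formula, which is considerably more involved; the original SRK regression (P = A − 2.5·AL − 0.9·K) is a simpler cousin that is enough to illustrate why a 0.15 mm axial-length difference between biometers matters: each millimetre of AL shifts the predicted IOL power by 2.5 D. The A-constant and keratometry value below are illustrative assumptions, not data from the study:

```python
def srk_iol_power(a_constant, axial_length_mm, mean_k_diopters):
    """Original SRK regression formula for emmetropia (simpler than SRK-T):
    P = A - 2.5*AL - 0.9*K."""
    return a_constant - 2.5 * axial_length_mm - 0.9 * mean_k_diopters

# Illustrative A-constant (118.4) and mean keratometry (43.5 D); axial
# lengths are the reported OB and UB medians (23.08 mm and 22.93 mm).
p_ob = srk_iol_power(118.4, 23.08, 43.5)
p_ub = srk_iol_power(118.4, 22.93, 43.5)
delta = p_ub - p_ob   # 2.5 * 0.15 = 0.375 D shift from the AL discrepancy
```

Under SRK-T the propagation is not exactly linear, but the order of magnitude is the same, which is why sub-0.1 mm biometry accuracy is clinically relevant.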

398

Looking beyond borders: integrating best practices in benefit-risk analysis into the field of food and nutrition.  

PubMed

An integrated benefit-risk analysis aims to give guidance in decision situations where benefits do not clearly prevail over risks, and explicit weighing of benefits and risks is thus indicated. The BEPRARIBEAN project aims to advance benefit-risk analysis in the area of food and nutrition by learning from other fields. This paper constitutes the final stage of the project, in which commonalities and differences in benefit-risk analysis are identified between the Food and Nutrition field and other fields, namely Medicines, Food Microbiology, Environmental Health, Economics and Marketing-Finance, and Consumer Perception. From this, ways forward are characterized for benefit-risk analysis in Food and Nutrition. Integrated benefit-risk analysis in Food and Nutrition may advance in the following ways: Increased engagement and communication between assessors, managers, and stakeholders; more pragmatic problem-oriented framing of assessment; accepting some risk; pre- and post-market analysis; explicit communication of the assessment purpose, input and output; more human (dose-response) data and more efficient use of human data; segmenting populations based on physiology; explicit consideration of value judgments in assessment; integration of multiple benefits and risks from multiple domains; explicit recognition of the impact of consumer beliefs, opinions, views, perceptions, and attitudes on behaviour; and segmenting populations based on behaviour; the opportunities proposed here do not provide ultimate solutions; rather, they define a collection of issues to be taken account of in developing methods, tools, practices and policies, as well as refining the regulatory context, for benefit-risk analysis in Food and Nutrition and other fields. Thus, these opportunities will now need to be explored further and incorporated into benefit-risk practice and policy. 
If accepted, incorporation of these opportunities will also involve a paradigm shift in Food and Nutrition benefit-risk analysis towards conceiving the analysis as a process of creating shared knowledge among all stakeholders. PMID:22142687

Tijhuis, M J; Pohjola, M V; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken-Schröder, G; Poto, M; Tuomisto, J T; Ueland, O; White, B C; Holm, F; Verhagen, H

2012-01-01

399

Analysis using surface wave methods to detect shallow manmade tunnels  

NASA Astrophysics Data System (ADS)

A multi-method seismic surface-wave approach was used to locate and estimate the dimensions of shallow, horizontally oriented cylindrical voids or manmade tunnels. The primary analytical methods employed were Attenuation Analysis of Rayleigh Waves (AARW), Surface Wave Common Offset (SWCO), and Spiking Filter (SF). Surface wave data were acquired at six study sites using a towed 24-channel land streamer and an elastic-band accelerated weight-drop seismic source. Each site was underlain by one tunnel, nominally 1 meter in diameter and depth. The acquired surface wave data were analyzed automatically, and the interpretations were then compared to the field measurements to ascertain their accuracy. The purpose of this research is to analyze the field response of Rayleigh waves to the presence of shallow tunnels. The SF technique used the variation of seismic signal response along a geophone array to determine void presence in the subsurface. The AARW technique was expanded for practical application, as suggested by Nasseri (2006), to indirectly estimate void location using a Normalized Energy Distance (NED) parameter for vertical tunnel dimension measurements and normalized Cumulative Logarithmic Decrement (CALD) values for horizontal tunnel dimension measurements. Confidence in tunnel detections is presented as a measure of NED signal strength, and false positives are reduced by AARW through analysis of sub-array data. Such estimation is a promising tool for engineers who require quantitative measurements of manmade tunnels in the shallow subsurface.

Putnam, Niklas Henry

400

A Qualitative Analysis of an Advanced Practice Nurse-Directed Transitional Care Model Intervention  

PubMed Central

Purpose: The purpose of this study was to describe barriers and facilitators to implementing a transitional care intervention for cognitively impaired older adults and their caregivers led by advanced practice nurses (APNs). Design and Methods: APNs implemented an evidence-based protocol to optimize transitions from hospital to home. An exploratory, qualitative directed content analysis examined 15 narrative case summaries written by APNs and fieldnotes from biweekly case conferences. Results: Three central themes emerged: patients and caregivers having the necessary information and knowledge, care coordination, and the caregiver experience. An additional category was also identified, APNs going above and beyond. Implications: APNs implemented individualized approaches and provided care that exceeds the type of care typically staffed and reimbursed in the American health care system by applying a Transitional Care Model, advanced clinical judgment, and doing whatever was necessary to prevent negative outcomes. Reimbursement reform as well as more formalized support systems and resources are necessary for APNs to consistently provide such care to patients and their caregivers during this vulnerable time of transition.

Bradway, Christine; Trotta, Rebecca; Bixby, M. Brian; McPartland, Ellen; Wollman, M. Catherine; Kapustka, Heidi; McCauley, Kathleen; Naylor, Mary D.

2012-01-01

401

A new method for torsional critical speed calculation of practical industrial rotors  

NASA Astrophysics Data System (ADS)

A new approach to calculating the torsional critical speeds of rotors is presented. The governing equations for these speeds and the method of solution differ from existing approaches such as Holzer's method, and the theory and numerical algorithm are straightforward, with no change in the field variables. The rotor studied has distributed mass and rigid disks, and consists of many shaft segments of different diameters. The exact solution for undamped torsional motion of a uniform shaft segment is applied to a practical rotor-bearing system to generate the simultaneous governing equations for the torsional critical speeds. Within the framework of the theory, the set of governing equations is completely analytical and explicit, and it involves no approximations such as discretization of the shaft mass or polynomial approximation. A computer program for the torsional critical speeds and the related mode shapes is developed by introducing a simple recurring numerical algorithm for a 3-by-4 submatrix in calculating the determinant generated by the simultaneous equations. The numerical algorithm essentially eliminates the need to construct a huge matrix. The effectiveness of the new method is demonstrated in analyses of three rotors.

Jun, Oh-Sung; Kim, Paul Y.

1993-07-01
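
The same exact uniform-segment solution underlies the classical torsional transfer-matrix formulation, which is a related (not identical) technique and serves here only as an illustrative sketch; the rotor dimensions and disk inertia below are hypothetical:

```python
import math

def segment_matrix(omega, length, G, rho, J):
    """Field transfer matrix of a uniform shaft segment for state [theta, T],
    from the exact solution theta(x) = A*cos(beta*x) + B*sin(beta*x)."""
    beta = omega * math.sqrt(rho / G)
    c, s = math.cos(beta * length), math.sin(beta * length)
    k = G * J * beta
    return [[c, s / k], [-k * s, c]]

def disk_matrix(omega, inertia):
    """Point transfer matrix of a rigid disk (polar mass moment of inertia)."""
    return [[1.0, 0.0], [-omega ** 2 * inertia, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def end_torque(omega, elements):
    """Torque at the far free end for unit twist, zero torque at the near end."""
    m = [[1.0, 0.0], [0.0, 1.0]]
    for element in elements:
        m = matmul(element(omega), m)
    return m[1][0]

# Hypothetical steel rotor: two 0.5 m shaft segments (50 mm diameter) with a
# rigid disk between them; free-free boundary conditions (T = 0 at both ends).
G, rho = 80e9, 7800.0
J = math.pi * 0.05 ** 4 / 32
elements = [
    lambda w: segment_matrix(w, 0.5, G, rho, J),
    lambda w: disk_matrix(w, 0.2),
    lambda w: segment_matrix(w, 0.5, G, rho, J),
]

# Each sign change of the far-end torque brackets a torsional natural
# frequency (rad/s); the sweep recovers the first two elastic modes.
freqs, w = [], 1.0
prev = end_torque(w, elements)
while len(freqs) < 2 and w < 1e5:
    w += 10.0
    cur = end_torque(w, elements)
    if prev * cur < 0.0:
        freqs.append(w)
    prev = cur
```

A production code would refine each bracketed root (e.g. by bisection) rather than reporting the sweep step, but the characteristic-equation structure is the same.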

402

Recent Advances of Information Entropy Estimation Method for Practical Hydrological Variables  

NASA Astrophysics Data System (ADS)

The concept of Shannon's information entropy has been widely used in hydrology and water resources. With the increasing application of Bayesian frameworks and information theory, hydrologists require a robust and accurate method for entropy estimation. Most hydrologists prefer the intuitive bin-counting method to compute entropy, while some more sophisticated methods have also been applied. In this research, we first present the special characteristics of practical hydrological variables, such as (1) a zero effect (e.g., no-rainfall days in a daily precipitation series); (2) a discreteness effect (e.g., the minimum unit a measurement instrument can resolve); (3) the choice of an optimal bin width; and (4) skewness of the data. We then design special techniques to deal with these characteristics and extend them to 1-D, 2-D, and high-dimensional entropy, Kullback-Leibler divergence, mutual information, and transfer entropy. Last but not least, we present a possible improvement of the Taylor diagram based on entropy and mutual information.

Gong, W.; Yang, D.; Gupta, H. V.; Nearing, G. S.

2013-12-01
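
A minimal sketch of the bin-counting (plug-in) entropy estimate, with the "zero effect" handled by giving zero values their own bin so the discrete spike at zero is not smeared into the continuous part (the rainfall values below are invented):

```python
import math
from collections import Counter

def binned_entropy(values, bin_width):
    """Plug-in Shannon entropy (nats) from a fixed-width histogram.
    Exact zeros (e.g. no-rain days) get a dedicated bin."""
    bins = Counter("zero" if v == 0 else int(v // bin_width) for v in values)
    n = len(values)
    return -sum((c / n) * math.log(c / n) for c in bins.values())

# Hypothetical daily rainfall (mm): many zeros plus a skewed wet tail.
rain = [0, 0, 0, 0, 0, 0, 1.2, 3.4, 0.8, 12.5, 2.2, 0, 0, 5.1, 0.3, 0]
h = binned_entropy(rain, bin_width=2.0)
```

The bin-width choice and the plug-in estimator's bias are exactly the practical issues the abstract highlights; more refined estimators adjust both.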

403

A Method of Streamflow Drought Analysis  

NASA Astrophysics Data System (ADS)

A method of completely describing and analyzing the stochastic process of streamflow droughts is recommended. All important components of streamflow droughts, such as deficit, duration, time of occurrence, number of streamflow droughts in a given time interval [0, t], the largest streamflow drought deficit, and the largest streamflow drought duration in a given time interval [0, t], are taken into consideration. A streamflow drought is related here to streamflow deficit. Following the theory of the supremum of a random number of random variables, a stochastic model is presented for interpretation and analysis of the largest streamflow drought deficit below a given reference discharge and the largest streamflow drought duration over a time interval [0, t] at a given location on a river. The method is based on the assumption that streamflow droughts are independent, identically distributed random variables and that their occurrence is subject to the Poisson probability law. This paper is a continuation of the previous works of E. Zelenhasić (1970, 1979, 1983) and P. Todorović (1970) on extremes in hydrology. The method is applied to the 58-year record of the Sava River at Sr. Mitrovica and the 52-year record of the Tisa River at Senta, Yugoslavia, and good agreement is found between the theoretical and empirical distribution functions for all analyzed drought components for both rivers. Only one complete example, the Sava River at Sr. Mitrovica, is given in the paper. The proposed method deals with hydrograph recessions of daily or instantaneous discharges in the region of low flows, not with the mean annual flows used by other investigators.

Zelenhasić, Emir; Salvai, Atila

1987-01-01
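
Under the stated assumptions (Poisson occurrences, iid deficits), the largest deficit in [0, t] has the closed-form distribution F(x) = exp{-Λt·[1 - H(x)]}, where H is the deficit distribution and Λt the expected drought count. A sketch checking this against simulation, using an exponential deficit distribution purely as a stand-in for real data:

```python
import math
import random

def largest_deficit_cdf(x, mean_count, deficit_cdf):
    """P(largest drought deficit over [0, t] <= x) when droughts occur as a
    Poisson process with expected count mean_count = Lambda*t and deficits
    are iid with CDF deficit_cdf; zero droughts means largest deficit = 0."""
    return math.exp(-mean_count * (1.0 - deficit_cdf(x)))

def poisson_sample(lam, rng):
    """Knuth's multiplication method for a Poisson variate."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(42)
mean_count, x = 3.0, 2.0
exp_cdf = lambda v: 1.0 - math.exp(-v)     # stand-in deficit distribution
analytic = largest_deficit_cdf(x, mean_count, exp_cdf)

trials = 20000
hits = 0
for _ in range(trials):
    n = poisson_sample(mean_count, rng)
    deficits = [rng.expovariate(1.0) for _ in range(n)]
    if all(d <= x for d in deficits):      # vacuously true when n == 0
        hits += 1
monte_carlo = hits / trials
```

Fitting H to observed deficits and Λ to observed drought counts, then comparing F against the empirical distribution, mirrors the river-record validation described in the abstract.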

404

Methods for spectral image analysis by exploiting spatial simplicity  

DOEpatents

Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.

Keenan, Michael R. (Albuquerque, NM)

2010-11-23
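
A toy sketch of alternating-least-squares curve resolution on a spatially "simple" synthetic image, where each pixel is dominated by a single phase (the patented method's spatial-simplicity constraints go beyond the plain non-negativity clipping used here):

```python
import numpy as np

# Factor a toy spectral image D (pixels x channels) as C @ S.T, with
# C (pixels x k) the abundances and S (channels x k) the pure spectra.
rng = np.random.default_rng(0)

k, pixels, channels = 2, 200, 50
S_true = np.abs(rng.normal(size=(channels, k)))      # pure-component spectra
C_true = np.zeros((pixels, k))
C_true[:100, 0] = rng.uniform(0.5, 1.0, 100)         # phase-1 region
C_true[100:, 1] = rng.uniform(0.5, 1.0, 100)         # phase-2 region
D = C_true @ S_true.T + 0.01 * rng.normal(size=(pixels, channels))

# Alternating least squares with non-negativity enforced by clipping.
C = np.abs(rng.normal(size=(pixels, k)))             # random start
for _ in range(50):
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
    C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)

residual = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
```

Because the two phases occupy disjoint pixel regions, the recovered factors are nearly unique; with heavily overlapping phases, extra constraints of the kind the patent describes are what keeps the factor model physically interpretable.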

405

An Analysis of Agricultural Mechanics Safety Practices in Agricultural Science Laboratories.  

ERIC Educational Resources Information Center

North Dakota secondary agricultural mechanics instructors were surveyed regarding instructional methods and materials, safety practices, and equipment used in the agricultural mechanics laboratory. Usable responses were received from 69 of 89 instructors via self-administered mailed questionnaires. Findings were consistent with results of similar…

Swan, Michael K.

406

Optimum compression to ventilation ratios in CPR under realistic, practical conditions: a physiological and mathematical analysis  

Microsoft Academic Search

Objective: To develop and evaluate a practical formula for the optimum ratio of compressions to ventilations in cardiopulmonary resuscitation (CPR). The optimum value of a variable is that for which a desired result is maximized. Here the desired result is assumed to be either oxygen delivery to peripheral tissues or a combination of oxygen delivery and waste product removal. Method:

Charles F. Babbs; Karl B. Kern

2002-01-01

407

Comparison of Frequency Doubling Perimetry With Humphrey Visual Field Analysis in a Glaucoma Practice  

Microsoft Academic Search

PURPOSE: To determine the sensitivity and specificity of frequency doubling perimetry with Humphrey visual field testing used as the gold standard. METHODS: Frequency doubling perimetry and Humphrey visual field testing (24-2) were performed on 29 consecutive patients in a glaucoma practice. Data for the right eye were used to calculate sensitivity, specificity, and receiver operating characteristic curves.

Burnstein, Yochanan; Ellish, Nancy J.; Magbalon, Michael; Higginbotham, Eve J.

408

In-Service Teacher Training in Japan and Turkey: A Comparative Analysis of Institutions and Practices  

ERIC Educational Resources Information Center

The purpose of this study is to compare policies and practices relating to teacher in-service training in Japan and Turkey. On the basis of the findings of the study, suggestions are made about in-service training activities in Turkey. The research was carried using qualitative research methods. In-service training activities in the two education…

Bayrakci, Mustafa

2009-01-01

409

A Qualitative Analysis of an Advanced Practice Nurse-Directed Transitional Care Model Intervention  

ERIC Educational Resources Information Center

Purpose: The purpose of this study was to describe barriers and facilitators to implementing a transitional care intervention for cognitively impaired older adults and their caregivers led by advanced practice nurses (APNs). Design and Methods: APNs implemented an evidence-based protocol to optimize transitions from hospital to home. An…

Bradway, Christine; Trotta, Rebecca; Bixby, M. Brian; McPartland, Ellen; Wollman, M. Catherine; Kapustka, Heidi; McCauley, Kathleen; Naylor, Mary D.

2012-01-01

410

Interpreting the Meaning of Grades: A Descriptive Analysis of Middle School Teachers' Assessment and Grading Practices  

ERIC Educational Resources Information Center

This descriptive, non-experimental, quantitative study was designed to answer the broad question, "What do grades mean?" Core academic subject middle school teachers from one large, suburban school district in Virginia were administered an electronic survey that asked them to report on aspects of their grading practices and assessment methods for…

Grimes, Tameshia Vaden

2010-01-01

411

Method for analysis of pain images  

US Patent & Trademark Office Database

A method uses body images and computer hardware and software to collect and analyze clinical data in patients experiencing pain. Pain location information is obtained by the drawing of an outline of the pain on a paper copy or electronic display of the body image. Composite images are generated representing aggregate data for specified patient groups. The coordinates of common anatomic landmarks on differently designed body images are mapped to each other, permitting integrated analysis of pain data, e.g., pain shape, centroid, meta centroid, from multiple body image designs and display of all pain data on a single body image design. Differences and similarities between groups of patients are displayed visually and numerically, and are used to assign the probability of a given patient belonging to a particular diagnostic group or category of disease severity.

2008-05-20
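
A minimal sketch of the centroid computation on a rasterized body image (the mask below is hypothetical; the patented method additionally maps anatomic landmarks between differently designed body images before aggregating):

```python
# Centroid of a drawn pain region: 1 marks pixels inside the patient's
# pain outline on a rasterized body image.
def pain_centroid(mask):
    """Return the (row, col) centroid of the nonzero pixels."""
    pts = [(r, c) for r, row in enumerate(mask) for c, v in enumerate(row) if v]
    n = len(pts)
    return (sum(r for r, _ in pts) / n, sum(c for _, c in pts) / n)

mask = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
centroid = pain_centroid(mask)   # center of the filled square
```

Averaging such centroids across a patient group yields the "meta centroid" the abstract mentions, which can then be compared between diagnostic groups.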

412

A study of the initial guess in Homotopy analysis method  

NASA Astrophysics Data System (ADS)

In this paper, we study the effect of the initial guess in the homotopy analysis method. To illustrate the point, we apply the homotopy analysis method to the (2+1)-dimensional Boussinesq equation and the (3+1)-dimensional KP equation. Comparisons are made between the exact solutions and the approximations obtained by the homotopy analysis method, and the analytic solutions agree well with the exact solutions. The results show that the homotopy analysis method is a promising analytic method for solving nonlinear problems.

Zou, Li; Zhen, Wang; Zhi, Zong; Shoufu, Tian

2012-09-01

413

Interpolation methods for shaped reflector analysis  

NASA Technical Reports Server (NTRS)

The diffraction analysis of reflector surfaces which are described only at a discrete set of locations usually leads to the requirement of an interpolation to determine the surface characteristics over a continuum of locations. Two methods of interpolation, the global and the local methods, are presented. The global interpolation representation is a closed-form or series expression valid over the entire surface. The coefficients of a series expression are found by an integration of all of the raw data. Since the number of coefficients used to describe the surface is much smaller than the number of raw data points, the integration effectively provides a smoothing of the raw data. The local interpolation provides a closed-form expression for only a small area of the reflector surface. The subreflector is divided into sectors, each of which has constant discretized data. Each area segment is then locally described by a two-dimensional quadratic surface, whose second-derivative data give the desired smoothed values.

Galindo-Israel, Victor; Imbriale, William A.; Rahmat-Samii, Yahya; Veruttipong, Thavath

1988-01-01
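
The local method can be sketched as a least-squares fit of a two-dimensional quadratic to the discrete samples in one sector (the surface samples below are synthetic, generated from a known paraboloid):

```python
import numpy as np

# Fit z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2 to discrete surface samples
# in one sector; the fitted coefficients give a smoothed closed-form patch
# (and its second derivatives) valid over that sector.
def fit_local_quadratic(x, y, z):
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coef

def eval_quadratic(coef, x, y):
    a, b, c, d, e, f = coef
    return a + b * x + c * y + d * x**2 + e * x * y + f * y**2

# Synthetic sector data from a known paraboloid z = 1 + 0.5*x^2 + 0.25*y^2.
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 30)
y = rng.uniform(-1, 1, 30)
z = 1 + 0.5 * x**2 + 0.25 * y**2
coef = fit_local_quadratic(x, y, z)
z_mid = eval_quadratic(coef, 0.2, 0.2)   # smoothed surface at a new point
```

With noise-free quadratic data the fit is exact; with noisy measured data the overdetermined least-squares solve provides the local smoothing the abstract describes.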

414

Reducing inpatient suicide risk: using human factors analysis to improve observation practices.  

PubMed

In 1995, the Joint Commission began requiring that hospitals report reviewable sentinel events as a condition of maintaining accreditation. Since then, inpatient suicide has been the second most common sentinel event reported to the Joint Commission. The Joint Commission emphasizes the need for around-the-clock observation for inpatients assessed as at high risk for suicide. However, there is sparse literature on the observation of psychiatric patients and no systematic studies or recommendations for best practices. Medical errors can best be reduced by focusing on systems improvements rather than individual provider mistakes. The author describes how failure modes and effects analysis (FMEA) was used proactively by an inpatient psychiatric treatment team to improve psychiatric observation practices by identifying and correcting potential observation process failures. Collection and implementation of observation risk reduction strategies across health care systems is needed to identify best practices and to reduce inpatient suicides. PMID:19297628

Janofsky, Jeffrey S

2009-01-01
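
FMEA commonly ranks failure modes by a risk priority number, the product of severity, occurrence, and detectability ratings; the observation-process failure modes and ratings below are invented for illustration, not taken from the article:

```python
# Risk priority number (RPN) ranking: each failure mode gets 1-10 ratings
# for severity, occurrence, and detectability; the product ranks which
# potential observation-process failures to correct first.
failure_modes = [
    ("observer distracted during 1:1 watch", 9, 6, 7),
    ("handoff gap between observation shifts", 8, 5, 6),
    ("patient access to ligature point", 10, 3, 4),
]

ranked = sorted(
    ((name, sev * occ * det) for name, sev, occ, det in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
```

Teams then direct risk-reduction strategies at the top-ranked modes and re-score after the process change, which is the proactive loop the abstract contrasts with after-the-fact error review.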

415

Analysis of diffraction characteristics of photopolymers by using the FDTD method  

Microsoft Academic Search

In holographic memories, photopolymer is a promising recording medium. To use photopolymers as practical recording media for holographic memories, it is necessary to clarify the design conditions for recording/reproduction characteristics. Coupled-wave analysis (CWA) and rigorous coupled-wave analysis (RCWA) are widespread methods for analyzing the diffraction characteristics of volume holographic gratings. However, holographic grating is more

K. Shimada; S. Yoshida; N. Yoshida; M. Yamamoto

2008-01-01

416

Evidence into practice, experimentation and quasi experimentation: are the methods up to the task?  

PubMed Central

OBJECTIVE: Methodological review of evaluations of interventions intended to help health professionals provide more effective and efficient health care, motivated by the current experience of NHS Research and Development in England. Emphasis upon the forms of research appropriate to different stages in the development and evaluation of interventions, the use of experimental and quasi experimental designs, the methods used in systematic reviews and meta analyses. METHOD: A proposed development process is derived from that used in the development of drugs. The strengths and weaknesses of different experimental and quasi experimental designs are derived from published methodological literature and first principles. Examples are drawn from the literature. RESULTS: Like pharmaceuticals, implementation interventions need to go through several stages of development before they are evaluated in designed experiments. Where there are practical reasons that make random allocation impossible in quantitative evaluations, quasi experimental methods may provide useful information, although these studies are open to bias. It is rare for a single study to provide a complete answer to important questions, and systematic reviews of all available studies should be undertaken. Meta analytic techniques go some way towards countering the low power of many existing studies, reduce the risk of bias, and avoid the subjective approaches that may be found in narrative reviews. CONCLUSIONS: The initiative taken by NHS Research and Development in examining methods to promote the uptake of research findings is welcome, but will only prove helpful if careful attention is paid to the different stages of the development process, and different research approaches are used appropriately at different stages.  
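The meta-analytic pooling the review discusses can be sketched with the standard fixed-effect inverse-variance method: weight each study's effect by the reciprocal of its variance and combine. The effect sizes and standard errors below are made-up numbers, not results from the review.

```python
import math

# Hypothetical effect estimates and standard errors from four small studies.
effects = [0.30, 0.45, 0.15, 0.60]
ses = [0.20, 0.25, 0.15, 0.30]

# Fixed-effect inverse-variance pooling: weight = 1 / SE^2.
weights = [1.0 / se**2 for se in ses]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
ci95 = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
```

The pooled standard error is smaller than any single study's, which is how meta-analysis counters the low power of individual studies noted in the results.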

Freemantle, N.; Wood, J.; Crawford, F.

1998-01-01

417

Fatigue Analysis of Crack-Like Defect Experimental Verification of Practical Rules to Predict Initiation.  

National Technical Information Service (NTIS)

This paper presents an experimental verification of analysis methods aiming at predicting initiation of cracking by fatigue in crack-like defects existing on start up of pressure vessel components. A few calculation methods available in the literature and...

B. Autrusson; D. Moulin; D. Acker; B. Barrachin

1988-01-01

418

Homotopy analysis method for quadratic Riccati differential equation  

Microsoft Academic Search

In this paper, the quadratic Riccati differential equation is solved by means of an analytic technique, namely the homotopy analysis method (HAM). Comparisons are made among Adomian's decomposition method (ADM), the homotopy perturbation method (HPM), the homotopy analysis method, and the exact solution. The results reveal that the proposed method is very effective and simple.
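HAM itself involves constructing a homotopy-deformed series solution; as a simple accuracy baseline, one can check a generic quadratic Riccati equation against its closed-form solution numerically. The equation y' = 1 - y², y(0) = 0 (exact solution y = tanh t) is chosen for illustration and is not necessarily the specific equation treated in the paper.

```python
import math

def rk4(f, y0, t_end, n):
    """Classical fourth-order Runge-Kutta integration of y' = f(t, y)."""
    h = t_end / n
    t, y = 0.0, y0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return y

# Quadratic Riccati equation y' = 1 - y^2 with exact solution tanh(t).
y_num = rk4(lambda t, y: 1 - y * y, 0.0, 1.0, 100)
err = abs(y_num - math.tanh(1.0))
```

Any analytic approximation such as HAM, ADM, or HPM can be validated against a reference of this kind on test problems with known solutions.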

Yue Tan; Saeid Abbasbandy

2008-01-01

419

Technology transfer through a network of standard methods and recommended practices - The case of petrochemicals  

NASA Astrophysics Data System (ADS)

Technology transfer may take place in parallel with cooperative action between companies participating in the same organizational scheme or using one another as subcontractor (outsourcing). In this case, cooperation should be realized by means of Standard Methods and Recommended Practices (SRPs) to achieve (i) quality of intermediate/final products according to specifications and (ii) industrial process control as required to guarantee such quality with minimum deviation (corresponding to maximum reliability) from preset mean values of representative quality parameters. This work deals with the design of the network of SRPs needed in each case for successful cooperation, implying also the corresponding technology transfer, effectuated through a methodological framework developed in the form of an algorithmic procedure with 20 activity stages and 8 decision nodes. The functionality of this methodology is proved by presenting the path leading from (and relating) a standard test method for toluene, as petrochemical feedstock in the toluene diisocyanate production, to the (6 generations distance upstream) performance evaluation of industrial process control systems (i.e., from ASTM D5606 to BS EN 61003-1:2004 in the SRPs network).

Batzias, Dimitris F.; Karvounis, Sotirios

2012-12-01

420

Flutter and Divergence Analysis using the Generalized Aeroelastic Analysis Method  

NASA Technical Reports Server (NTRS)

The Generalized Aeroelastic Analysis Method (GAAM) is applied to the analysis of three well-studied checkcases: restrained and unrestrained airfoil models, and a wing model. An eigenvalue iteration procedure is used for converging upon roots of the complex stability matrix. For the airfoil models, exact root loci are given which clearly illustrate the nature of the flutter and divergence instabilities. The singularities involved are enumerated, including an additional pole at the origin for the unrestrained airfoil case and the emergence of an additional pole on the positive real axis at the divergence speed for the restrained airfoil case. Inconsistencies and differences among published aeroelastic root loci and the new, exact results are discussed and resolved. The generalization of a Doublet Lattice Method computer code is described and the code is applied to the calculation of root loci for the wing model for incompressible and for subsonic flow conditions. The error introduced in the reduction of the singular integral equation underlying the unsteady lifting surface theory to a linear algebraic equation is discussed. Acknowledging this inherent error, the solutions of the algebraic equation by GAAM are termed 'exact.' The singularities of the problem are discussed and exponential series approximations used in the evaluation of the kernel function shown to introduce a dense collection of poles and zeroes on the negative real axis. Again, inconsistencies and differences among published aeroelastic root loci and the new 'exact' results are discussed and resolved. In all cases, aeroelastic flutter and divergence speeds and frequencies are in good agreement with published results. The GAAM solution procedure allows complete control over Mach number, velocity, density, and complex frequency. Thus all points on the computed root loci can be matched-point, consistent solutions without recourse to complex mode tracking logic or dataset interpolation, as in the k and p-k solution methods.
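The eigenvalue-iteration idea can be sketched generically: converge on a root p of det A(p) = 0 for a complex stability matrix by Newton iteration with a numerical derivative. The 2x2 matrix below is invented for illustration and bears no relation to the actual GAAM stability matrices.

```python
# Determinant of a hypothetical 2x2 "stability matrix"
#   A(p) = [[p^2 + 1, 0.2], [0.2, p^2 + 4]]
# as an analytic function of the complex frequency-like parameter p.
def det_a(p):
    return (p * p + 1) * (p * p + 4) - 0.04

def newton(f, p, tol=1e-12, h=1e-7):
    """Newton iteration on a complex analytic scalar function,
    using a central-difference estimate of the derivative."""
    for _ in range(100):
        dp = f(p) * 2 * h / (f(p + h) - f(p - h))
        p -= dp
        if abs(dp) < tol:
            break
    return p

# Iterate from an initial guess near the expected root on the
# imaginary axis (an undamped oscillatory mode).
root = newton(det_a, 1.1j)
```

Tracing such roots as velocity or density varies produces the root loci the paper describes, with flutter indicated where a root crosses into the unstable half-plane.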

Edwards, John W.; Wieseman, Carol D.

2003-01-01

421

How can information extraction ease formalizing treatment processes in clinical practice guidelines?: A method and its evaluation  

Microsoft Academic Search

Objective. Formalizing clinical practice guidelines for subsequent computer-supported processing is a challenging, but burdensome and time-consuming task. Existing methods and tools to support this task demand detailed medical knowledge, knowledge about the formal representations, and manual modeling. Furthermore, formalized guideline documents mostly fall far short in terms of readability and understandability for the human domain modeler. Methods

Katharina Kaiser; Cem Akkaya; Silvia Miksch

2007-01-01

422

Standardization of reagents and methods used in cytological and histological practice with emphasis on dyes, stains and chromogenic reagents  

Microsoft Academic Search

Summary  The need for the standardization of reagents and methods used in the histology laboratory is demonstrated. After definitions of dyes, stains, and chromogenic reagents, existing standards and standards organizations are discussed. This is followed by practical instructions on how to standardize dyes and stains through the preparation of reference materials and the development of chromatographic methods. An overview is presented

H. O. Lyon; A. P. De Leenheer; R. W. Horobin; W. E. Lambert; E. K. W. Schulte; B. Van Liedekerke; D. H. Wittekind

1994-01-01

423

Multivariate analysis of management and biosecurity practices in smallholder pig farms in Madagascar  

PubMed Central

A cross-sectional study was carried out in 2005 and 2006 in three geographical areas of Madagascar to investigate and differentiate swine farm management and biosecurity practices in smallholder farming communities. Questionnaire data from a total of 709 pig farms were analysed using multiple factor analysis (MFA) and hierarchical cluster analysis (HCA). Variables describing management and biosecurity practices were organised into five groups: structure of the farm, animal-contacts, person- and vehicle-contacts, feeding, and sanitary aspects. In general, few biosecurity measures were implemented in the pig farms included in the study. Regional differences in management and biosecurity practices emerged from the MFA and were mainly due to, in order of decreasing importance: structure of the farm, sanitary aspects, feeding and animal-contacts and, to a lesser extent, person- and vehicle-contacts. HCA resulted in the differentiation of four distinct types of farms in each of two study areas, Arivonimamo and Marovoay, while no grouping could be identified amongst farms in Ambatondrazaka area. The characterisation of the different types of smallholder pig farms will allow adapting recommendations on husbandry practices and control measures in pig farms of these regions of Madagascar. The development of tailored recommendations is essential for Malagasy smallholders who have limited resources and need to make evidence-based management changes to reduce the risk of contagious diseases in their herds.
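The hierarchical cluster analysis (HCA) step can be sketched with a naive agglomerative procedure on a toy farm-by-practice matrix. The matrix, the linkage choice, and the distance are all illustrative assumptions; the study used MFA factor coordinates as input, not raw binary data.

```python
import numpy as np

# Toy binary matrix: rows = farms, columns = biosecurity practices
# (1 = practice in place). Purely invented, not survey data.
X = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [1, 0, 0, 0],
])

def hca(X, k):
    """Agglomerative clustering with single linkage on Hamming
    distance, merging until k clusters remain."""
    clusters = [[i] for i in range(len(X))]
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(np.sum(X[i] != X[j])
                        for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] += clusters.pop(b)
    return clusters

groups = hca(X, 2)  # farm indices grouped into two cluster lists
```

Each resulting cluster corresponds to a candidate farm type for which tailored husbandry recommendations could be developed.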

Costard, S.; Porphyre, V.; Messad, S.; Rakotondrahanta, S.; Vidon, H.; Roger, F.; Pfeiffer, D.U.

2009-01-01

424

Task Knowledges Commonality Analysis Method (TKCAM) User's Manual.  

National Technical Information Service (NTIS)

This document is a step by step guide to Military Occupational Specialty (MOS) commonality analysis using the Task Knowledges Commonality Analysis Method (TKCAM). TKCAM is an analytical method that can be used to determine the commonality between MOSs in ...

A. Akman

1998-01-01

425

New method of analysis of crystallizer temperature profile using optical fiber DTS  

NASA Astrophysics Data System (ADS)

Continuous casting is a modern and advanced technology of steel production, whose product is a blank serving as an intermediate product for further processing. One of the most important parts of this whole process is the crystallizer. Most of the methods published to date for analyzing the temperature profile of a crystallizer in operation have been experimentally verified; these methods include the use of thermocouples or Bragg gratings. A new, sophisticated method for analyzing the crystallizer temperature profile is the use of optical fiber DTS based on stimulated Raman scattering. This paper presents the first experimental measurements and the method's verification, which are necessary for deploying this method into industrial practice.

Koudelka, Petr; Pápeš, Martin; Líner, Andrej; Látal, Jan; Šiška, Petr; Vašinek, Vladimír

2012-01-01

426

Assessment of Tissue Estrogen and Progesterone Receptor Levels: A Survey of Current Practice, Techniques, and Quantitation Methods.  

PubMed

The assessment of steroid hormone receptors in resected breast carcinoma tissue is currently the standard of practice. The traditional method for assessment of receptor status is the ligand binding assay. More recently, immunohistochemistry (IHC) has become a popular method for such testing. Despite the widespread use of IHC and the availability of many antibodies, standardization of quantitative IHC for assessment of estrogen and progesterone receptors has not been achieved. While the College of American Pathologists (CAP) offers a Quality Assurance (QA) program for IHC quantitation of estrogen receptor (ER) and progesterone receptor (PgR), no universal standard is currently recognized in assessment of ER and PgR by IHC. We surveyed 300 laboratories within the United States for their current practices regarding the assessment of ER and PgR status in breast cancer tissue specimens. Eighty usable responses were received. Forty-nine (61%) laboratories performed the assay in-house, while the remainder sent the material out for assay. All responding laboratories performing their steroid receptor analysis in-house used the IHC technique. Forty-three (80%) laboratories answering the question on material accepted for analysis performed the assay only on paraffin-embedded material, three (6%) used either paraffin block or frozen material, and two (4%) used only frozen material. Eighty-eight percent of laboratories performing steroid receptor analysis in-house used a manual quantitation technique. Four (8%) used computer-assisted image analysis, and a single laboratory used laser scanning cytometry. Eight different antibodies were used among the 44 laboratories documenting the antibody supplier, and for any given commercially prepared antibody a wide variety of dilutions were used, with the exception of the standard solution used with the Ventana antibody. Of the laboratories using manual estimation techniques, 61% simply estimated the percentage of positive cells, 29% evaluated both the intensity of staining and percentage of nuclei staining, 6% used formal H-score analysis, 2% evaluated only intensity of nuclear staining, and 2% mainly counted the percentage of nuclei staining for ER but used a formal H score in the assessment of PgR. Cutoff points for the separation of positive and negative results varied widely, with some laboratories assessing any demonstrable positivity as a positive result, while others required as many as 19% of the nuclei to stain before a specimen was declared positive. Standardization techniques differed considerably among laboratories. Eighty-six percent used the CAP program for QA. While all laboratories utilized some form of intralaboratory control for assessment of ER and PgR, the nature of that control varied from laboratory to laboratory. Our survey indicates that a majority of laboratories perform their steroid hormone receptor analysis in-house using IHC. There is considerable variability in the antibodies utilized, the dilutions applied, and the quantitation method and level of expression used to dichotomize specimens into positive and negative groups. Finally, no universal control for interlaboratory standardization appears to exist. PMID:11348363
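The formal H-score mentioned above weights the percentage of nuclei at each staining intensity (0 to 3) by that intensity, giving a score from 0 to 300. The percentages in the example call are invented for illustration.

```python
def h_score(pct_by_intensity):
    """H-score for nuclear staining.

    pct_by_intensity: mapping {intensity 0-3: percent of nuclei},
    with the percentages summing to 100. Returns a value in 0-300.
    """
    return sum(i * pct for i, pct in pct_by_intensity.items())

# Hypothetical case: 40% unstained, 30% weak, 20% moderate, 10% strong.
score = h_score({0: 40, 1: 30, 2: 20, 3: 10})  # -> 100
```

A laboratory would then dichotomize specimens by comparing this score against its chosen cutoff, which, as the survey shows, varies widely between sites.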

Layfield, Lester J.; Gupta, Dilip; Mooney, Eoghan E.

2000-05-01

427

Motion detection and correction in Tl-201 SPECT imaging: A simple, practical method  

SciTech Connect

Since Tl-201 SPECT imaging requires that pts remain in an awkward position for a prolonged time, pt motion (M) is a potentially serious source of artifactual defects on tomographic (tomo) reconstructions. Thus the authors developed a simple method for detection and correction of M from SPECT images. A Co-57 point source is placed on the lower anterior chest, an area remaining in the camera's field of view throughout imaging, and is imaged concurrently with Tl. In the absence of M, this point source inscribes a straight line on planar summation of the 32 projections over 180°. Movement is detected by deviation from this line. The number of pixels (P) of M is used to shift images so that the resultant images of the point source are linear. The method was tested in 48 consecutive patients undergoing Tl tomo. Uncorrected and corrected images were reconstructed, and long- and short-axis tomo cuts were quantitatively analyzed using circumferential profiles (CPs) of maximum counts with comparison to lower limits of normal. Extent (E) of abnormality was expressed as the % of CP points falling below normal. M was detected in 8/48 pts (17%) and was 2 P in 3 and 1 P in 5 pts. E was less following correction in 7/8 pts, with a mean decrease of 71% with 2 PM and 44% with 1 PM. Visual change in tomo after correction was dramatic with 2 PM and subtle with 1 PM. The E and location of tomo abnormality after correction more closely resembled subsequent planar imaging than did the uncorrected reconstructions. The authors conclude: 1) pt M is a common problem in SPECT Tl images; 2) when > 1 pixel, M results in major tomographic artifacts; and 3) the method described provides a simple practical approach for detection and correction of M.
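The correction step can be sketched as follows: without motion, the Co-57 marker stays on the same detector row in every projection, so each projection is shifted back by the marker's deviation from its initial row. The marker positions and projection data below are simulated, not clinical measurements.

```python
import numpy as np

n_proj = 32

# Marker row per projection: patient moves 2 pixels halfway through.
marker_row = np.full(n_proj, 20.0)
marker_row[16:] += 2.0

# Per-projection shift = rounded deviation from the initial marker row.
shifts = np.rint(marker_row - marker_row[0]).astype(int)

# Simulated 1-D projections with a single feature that moves with the
# patient; applying the opposite shift realigns it across projections.
projections = np.zeros((n_proj, 64))
projections[np.arange(n_proj), 30 + shifts] = 1.0
corrected = np.array([np.roll(p, -s) for p, s in zip(projections, shifts)])
```

After correction the feature sits in the same column in every projection, so the reconstruction no longer smears it into an artifactual defect.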

Friedman, J.; Garcia, E.; Berman, D.; Bietendorf, J.; Prigent, F.; VanTrain, K.; Waxman, A.; Maddahi, J.

1984-01-01

428

A practical method for simultaneous multiple intracerebral implantations for microdialysis in rats.  

PubMed

Many experimental designs require the chronic implantation of different elements destined to act as channels that facilitate the information conveyance between the brain and some external devices or vice versa. Electrodes for electrophysiological or electrochemical recording or brain stimulation, and guide shafts for drug administration or chemical monitoring of the extracellular space are the most common examples of channels serving those purposes. The stereotaxic implantation of one or more of those experimental tools in the same antero-posterior plane is relatively easy, but surgery is nonetheless more complicated when two or more elements have to be placed using totally different coordinates. In those cases the current strategy consists in the successive implantation of the elements, waiting for the hardening of the dental acrylic destined to fix one of them in place before dealing with the next. This procedure takes time, is considerably more laborious than surgery for single elements and is particularly difficult when the elements have to be implanted in close proximity. The present report describes a method that simplifies surgery for multiple intracerebral implantation and allows the simultaneous and exact placement of as many electrodes or guide shafts as is practical in any experimental design. The method requires the previous construction of a jig or template designed to temporarily hold the elements to be implanted, allowing them to assume and keep the same positional relationship that they should have when definitively in place within the skull. The design may vary according to the type of elements to be implanted and the coordinates required for each particular experiment, but here it is illustrated describing the assembly of a particular jig for the simultaneous implantation of guide shafts for subsequent microdialysis in the prefrontal cortex (PFC), nucleus accumbens (NAC) and striatum (STR).
Some rules can be derived from this particular case to make the method a more general one and suitable