These are representative sample records related to your search topic. For comprehensive and current results, perform a real-time search.

A practical method of harmonic analysis for power converter  

Microsoft Academic Search

The converter is an important harmonic source in power systems. Evaluating the harmonic conditions on the AC and DC sides is essential for understanding a converter's harmonic characteristics, and it is also very helpful for designing harmonic filters and suppressing harmonic propagation. A practical method of harmonic evaluation for a power converter is ...

Jianguo Jiang; Weimin Xie; Juan Zhou



A Practical Escape and Effect Analysis for Building Lightweight Method Summaries  

E-print Network

We present a unification-based, context-sensitive escape and effect analysis that infers ..., indicating the heap depth beyond which objects escape; and b, a branching factor indicating the maximum ...

Rugina, Radu


Applying Practical Formal Methods to the Specification and Analysis of Security Properties  

E-print Network

... system [7], the shutdown system for the Darlington nuclear power plant [21], and the flight program ... This paper briefly describes our experience in applying the tools in the development of two secure systems ...


A Topography Analysis Incorporated Optimization Method for the Selection and Placement of Best Management Practices  

PubMed Central

Best Management Practices (BMPs) are among the most effective methods to control nonpoint source (NPS) pollution at a watershed scale. In this paper, a topography analysis incorporated optimization method (TAIOM) is proposed, which integrates topography analysis with cost-effective optimization. The surface status, slope and the type of land use were evaluated as inputs for the optimization engine. A genetic algorithm program was coded to obtain the final optimization. The TAIOM was validated in conjunction with the Soil and Water Assessment Tool (SWAT) in the Yulin watershed in Southwestern China. The results showed that the TAIOM was more cost-effective than traditional optimization methods. The distribution of selected BMPs throughout landscapes comprising relatively flat plains and gentle slopes suggests the need for a more operationally effective scheme, such as the TAIOM, to determine the practicability of BMPs before widespread adoption. The TAIOM developed in this study can easily be extended to other watersheds to help decision makers control NPS pollution. PMID:23349917

Shen, Zhenyao; Chen, Lei; Xu, Liang



An analysis of the teaching methods and sources of information used in adopting improved practices in rice production in Texas  

E-print Network


Kibria, A. K. M. Anwarul



A practical method for incorporating Real Options analysis into US federal benefit-cost analysis procedures  

E-print Network

This research identifies how Real Options (RO) thinking might acceptably and effectively complement the current mandates for Benefit-Cost Analysis (BCA) defined by the Office of Management and Budget (OMB) in Circular A-94. ...

Rivey, Darren



Methods and practices used in incident analysis in the Finnish nuclear power industry.  


According to the Finnish Nuclear Energy Act, it is the licensee's responsibility to ensure the safe use of nuclear energy. The Radiation and Nuclear Safety Authority (STUK) is the regulatory body responsible for state supervision of the safe use of nuclear power in Finland. One essential prerequisite for the safe and reliable operation of nuclear power plants is that lessons are learned from operational experience. It is the utility's prime responsibility to assess operational events and implement appropriate corrective actions. STUK controls licensees' operational experience feedback arrangements and their implementation as part of its inspection activities. In addition, in Finland the regulatory body performs its own assessment of operational experience. Review and investigation of operational events is part of the regulatory oversight of operational safety. STUK reviews operational events at essentially three levels. The first step is a general review of all operational event, transient and reactor scram reports, which the licensees submit to STUK for information. The second level concerns the clarification of events at the site and the entry of event-specific data into STUK's event register database; this is done for events that meet the criteria requiring the operator to submit a special report to STUK for approval. The safety significance of operational events is determined using probabilistic safety assessment (PSA) techniques. The risk significance of events and the number of safety-significant events are followed by STUK indicators. The final step in operational event assessment performed by STUK is to assign STUK's own investigation team for events deemed to have special importance, especially when the licensee's organisation has not operated as planned. STUK launches its own detailed investigation about once a year on average. 
An analysis and evaluation of the event investigation methods applied at STUK and at the two Finnish nuclear power plant operators, Teollisuuden Voima Oy (TVO) and Fortum Power and Heat Oy (Fortum), was carried out by the Technical Research Centre of Finland (VTT) at the request of STUK at the end of the 1990s. The study aimed at providing a broad overview and suggestions for improvement of the whole organisational framework supporting event investigation practices at the regulatory body and at the utilities. The main objective of the research was to evaluate the adequacy and reliability of the event investigation methods and practices in the Finnish nuclear power industry and, based on the results, to develop them further. The results and suggestions of the research are reviewed in the paper, and the corrective actions implemented in event investigation and operating experience procedures both at STUK and at the utilities are discussed as well. STUK has developed its own procedure for the risk-informed analysis of nuclear power plant events. The PSA-based event analysis method is used to assess the safety significance and importance measures associated with the unavailability of components and systems subject to Technical Specifications. Insights from recently performed PSA-based analyses are also briefly discussed in the paper. PMID:15231350

Suksi, Seija



Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory--Determination of Trihalomethane Formation Potential, Method Validation, and Quality-Control Practices  

USGS Publications Warehouse

An analytical method for the determination of the trihalomethane formation potential of water samples has been developed. The trihalomethane formation potential is measured by dosing samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine, and then analyzing the resulting trihalomethanes by purge and trap/gas chromatography equipped with an electron capture detector. Detailed explanations of the method and quality-control practices are provided. Method validation experiments showed that the trihalomethane formation potential varies as a function of time between sample collection and analysis, residual-free chlorine concentration, method of sample dilution, and the concentration of bromide in the sample.

Crepeau, Kathryn L.; Fram, Miranda S.; Bush, Noel



Empowering Discourse: Discourse Analysis as Method and Practice in the Sociology Classroom  

ERIC Educational Resources Information Center

Collaborative learning and critical pedagogy are widely recognized as "empowering" pedagogies for higher education. Yet, the practical implementation of both has a mixed record. The question, then, is: How could collaborative and critical pedagogies be empowered themselves? This paper makes a primarily theoretical case for discourse…

Hjelm, Titus



Evaluating the clinical appropriateness of nurses' prescribing practice: method development and findings from an expert panel analysis  

PubMed Central

Background The number of nurses independently prescribing medicines in England is rising steadily. There had been no attempt systematically to evaluate the clinical appropriateness of nurses' prescribing decisions. Aims (i) To establish a method of assessing the clinical appropriateness of nurses' prescribing decisions; (ii) to evaluate the prescribing decisions of a sample of nurses, using this method. Method A modified version of the Medication Appropriateness Index (MAI) was developed, piloted and subsequently used by seven medical prescribing experts to rate transcripts of 12 nurse prescriber consultations selected from a larger database of 118 audio-recorded consultations collected as part of a national evaluation. Experts were also able to give written qualitative comments on each of the MAI dimensions applied to each of the consultations. Analysis Experts' ratings were analysed using descriptive statistics. Qualitative comments were subjected to a process of content analysis to identify themes within and across both MAI items and consultations. Results Experts' application of the modified MAI to transcripts of nurse prescriber consultations demonstrated validity and feasibility as a method of assessing the clinical appropriateness of nurses' prescribing decisions. In the majority of assessments made by the expert panel, nurses' prescribing decisions were rated as clinically appropriate on all nine items in the MAI. Conclusion A valid and feasible method of assessing the clinical appropriateness of nurses' prescribing practice has been developed using a modified MAI and transcripts of audio-recorded consultations sent to a panel of prescribing experts. Prescribing nurses in this study were generally considered to be making clinically appropriate prescribing decisions. 
This approach to measuring prescribing appropriateness could be used as part of quality assurance in routine practice, as a method of identifying continuing professional development needs, or in future research as the expansion of non-medical prescribing continues. PMID:18055884

Latter, Sue; Maben, Jill; Myall, Michelle; Young, Amanda



APA's Learning Objectives for Research Methods and Statistics in Practice: A Multimethod Analysis  

ERIC Educational Resources Information Center

Research methods and statistics courses constitute a core undergraduate psychology requirement. We analyzed course syllabi and faculty self-reported coverage of both research methods and statistics course learning objectives to assess the concordance with APA's learning objectives (American Psychological Association, 2007). We obtained a sample of…

Tomcho, Thomas J.; Rice, Diana; Foels, Rob; Folmsbee, Leah; Vladescu, Jason; Lissman, Rachel; Matulewicz, Ryan; Bopp, Kara



Practical Thermal Evaluation Methods For HAC Fire Analysis In Type B Radioactive Material (RAM) Packages  

SciTech Connect

Title 10 of the United States Code of Federal Regulations Part 71 for the Nuclear Regulatory Commission (10 CFR 71.73) requires that Type B radioactive material (RAM) packages satisfy certain Hypothetical Accident Conditions (HAC) thermal design requirements to ensure package safety during accidental fire conditions. Compliance with the thermal design requirements can be met by prototype tests, by analyses only, or by a combination of tests and analyses. Normally, it is impractical to meet all the HAC requirements using tests alone, and purely analytical methods are too complex due to the multi-physics, non-linear nature of the fire event. Therefore, a combination of tests and thermal analysis methods using commercial heat transfer software is used to meet the necessary design requirements. The authors, along with their colleagues at Savannah River National Laboratory in Aiken, SC, USA, have successfully used this 'tests and analyses' approach in the design and certification of several United States DOE/NNSA certified packages, e.g. 9975, 9977, 9978, 9979, H1700, and the Bulk Tritium Shipping Package (BTSP). This paper describes these methods in the hope that RAM Type B package designers and analysts can use them for their applications.

Abramczyk, Glenn; Hensel, Stephen J; Gupta, Narendra K.



A Critical Analysis of SocINDEX and Sociological Abstracts Using an Evaluation Method for the Practicing Bibliographer  

ERIC Educational Resources Information Center

This study provides a database evaluation method for the practicing bibliographer that is more than a brief review yet less than a controlled experiment. The author establishes evaluation criteria in the context of the bibliographic instruction provided to meet the research requirements of undergraduate sociology majors at Queens College, City…

Mellone, James T.



Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory-- Determination of Dissolved Organic Carbon in Water by High Temperature Catalytic Oxidation, Method Validation, and Quality-Control Practices  

USGS Publications Warehouse

An analytical method has been developed for the determination of dissolved organic carbon concentration in water samples. This report presents the results of the tests used to validate the method and describes the quality-control practices used for dissolved organic carbon analysis. Prior to analysis, water samples are filtered to remove suspended particulate matter. A Shimadzu TOC-5000A Total Organic Carbon Analyzer in the nonpurgeable organic carbon mode is used to analyze the samples by high temperature catalytic oxidation. The analysis usually is completed within 48 hours of sample collection. The laboratory reporting level is 0.22 milligrams per liter.

Bird, Susan M.; Fram, Miranda S.; Crepeau, Kathryn L.



Laboratory cost analysis: a practical approach.  


This article presents a practical method for performing a cost analysis for the smaller laboratory for which computerized methods may be unavailable or unappealing. An overview of cost accounting as it fits into planning functions is presented, and three common methods for performing such analyses and appropriate applications are described. The concept of breakeven analysis and its uses are presented. Finally, a worksheet approach to cost analysis is presented, including examples that demonstrate proper use. The worksheets, although not universally applicable without modifications, use a stepwise process to achieve a simplistic but useful cost analysis. Readers are encouraged to adapt these worksheets to their own operations. PMID:10183412

Carpenter, R B
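The break-even concept mentioned in the abstract above can be sketched numerically. This is a generic illustration, not the article's worksheet; the function name and all figures are hypothetical.

```python
def breakeven_volume(fixed_costs, price_per_test, variable_cost_per_test):
    """Test volume at which revenue exactly covers total cost."""
    contribution = price_per_test - variable_cost_per_test  # margin per test
    if contribution <= 0:
        raise ValueError("price must exceed variable cost per test")
    return fixed_costs / contribution

# Hypothetical lab: $50,000 fixed costs, $20 charge and $8 variable cost per test
volume = breakeven_volume(50_000, 20.0, 8.0)  # about 4,167 tests
```

Volumes above this point contribute to profit; below it, fixed costs are not recovered.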



Analysis of release kinetics of ocular therapeutics from drug releasing contact lenses: Best methods and practices to advance the field.  


Several methods have been proposed to achieve an extended and controlled release of ocular therapeutics via contact lenses; however, the experimental conditions used to study the drug release vary greatly and significantly influence the release kinetics. In this paper, we examine variations in the release conditions and their effect on the release of both hydrophilic and hydrophobic drugs (ketotifen fumarate, diclofenac sodium, timolol maleate and dexamethasone) from conventional hydrogel and silicone hydrogel lenses. Drug release was studied under different conditions, varying volume, mixing rates, and temperature. Volume had the biggest effect on the release profile, which ironically is the least consistent variable throughout the literature. When a small volume (2-30 mL) was used with no forced mixing and solvent exchange every 24 h, equilibrium was reached much earlier than the solvent exchange, significantly damping the drug release rate and artificially extending the release duration, leading to false conclusions. Using a large volume (200-400 mL) with a 30 rpm mixing rate and no solvent exchange, the release rate and total mass released were significantly increased. In general, the release performed in small volumes with no forced mixing exhibited cumulative mass release amounts 3-12 times less than the cumulative release amounts in large volumes with mixing. Increases in mixing rate and temperature resulted in relatively small increases of 1.4 and 1.2 times, respectively, in fractional mass released. These results strongly demonstrate the necessity of proper and thorough analysis of release data to ensure that equilibrium is not affecting release kinetics. This is paramount for the comparison of various controlled drug release methods for therapeutic contact lenses, validation of the potential of lenses as an efficient and effective means of drug delivery, and increasing the likelihood of only the most promising methods reaching in vivo studies. PMID:24894544

Tieppo, Arianna; Boggs, Aarika C; Pourjavad, Payam; Byrne, Mark E



Retrospective Data Analysis and Proposal of a Practical Acceptance Criterion for Inter-laboratory Cross-validation of Bioanalytical Methods Using Liquid Chromatography/Tandem Mass Spectrometry.  


The purpose of this study was to conduct a retrospective data analysis of inter-laboratory cross-validation studies in order to set a reasonable and practical acceptance criterion based on a number of cross-validation results. From the results of cross-validation studies of 16 compounds and their metabolites, analytical bias and variation were evaluated. The accuracy of the cross-validation samples was compared with that of quality control (QC) samples, with statistical comparison of the analytical variation. An acceptance criterion was derived with a confidence interval approach. As a result, while a larger bias was observed for the cross-validation samples, the bias was not fully explained by analytical variation or by bias attributable to the analytical methods. The direction of the deviation between the cross-validation samples and QC samples was random and not concentration-dependent, suggesting that inter-laboratory variability, such as preparation errors, could be a source of bias. The derived acceptance criterion corresponds to the one prescribed in the Guideline on Bioanalytical Method Validation from the Ministry of Health, Labour and Welfare in Japan and is a little wider than the one from the European Medicines Agency. In conclusion, a thorough retrospective data analysis revealed potential causes of the larger analytical bias in inter-laboratory cross-validation studies. The derived acceptance criterion would be practical and reasonable for inter-laboratory cross-validation studies. PMID:25124547

Yoneyama, Tomoki; Kudo, Takashi; Jinno, Fumihiro; Schmidt, Eric R; Kondo, Takahiro



Practical limitations of epidemiologic methods.  

PubMed Central

Epidemiologic methods can be categorized into demographic studies of mortality and morbidity and observational studies that are either retrospective or prospective. Some of the limitations of demographic studies are illustrated by a review of one specific mortality study showing possible relationship of nuclear fallout to leukemia. Problems of accuracy of diagnosis or causes of death on death certificates, estimates of population, migration from areas of study, and the issue of "ecological fallacy" are discussed. Retrospective studies have such problems as recall of previous environmental exposure, selection bias and survivor bias. In environmental epidemiology, prospective studies have been used. The problems associated with these studies are illustrated by reviewing some of the details of the study of effects of microwave radiation on embassy employees in Moscow. The study population had to be reconstructed, individuals had to be located and information on exposure status had to be obtained by questionnaire. The relatively small size of the exposed group permitted the detection of only fairly large relative risks. Despite these limitations, epidemiologic studies have been remarkably productive in elucidating etiological factors. They are necessary since "the proper study of man is man." PMID:6653534

Lilienfeld, A M
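The relative-risk measure central to the cohort studies discussed above can be illustrated with a minimal sketch; the cohort sizes and case counts here are hypothetical, not taken from the Moscow embassy study.

```python
def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Ratio of disease risk in the exposed cohort to risk in the unexposed cohort."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# Hypothetical cohort: 6 cases among 300 exposed vs. 4 among 600 unexposed
rr = relative_risk(6, 300, 4, 600)  # → 3.0
```

With a small exposed group, the confidence interval around such a ratio is wide, which is why only fairly large relative risks are detectable, as the abstract notes.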




E-print Network

This paper is about model selection for clinical trials data. We present a modest case study to illustrate ... We consider survival times (e.g., time to recurrence of depression) from a controlled clinical trial, but the methods we present are applicable to many model selection problems.


IT Security Analysis Best Practices and Formal Approaches  

Microsoft Academic Search

This tutorial provides an overview of the best industrial practices in IT security analysis, followed by a sketch of recent research results in this area, especially results providing formal foundations and more powerful tools for security analysis. The conclusion suggests directions for further work to fill the gaps between formal methods and industrial practices.

Daniel Le Métayer; Inria Rhone-Alpes



A Practical Method for Watermarking Java Programs  

Microsoft Academic Search

Java programs distributed through the Internet are now suffering from program theft, because Java programs can easily be decomposed into reusable class files and even decompiled into source code by program users. In this paper we propose a practical method that discourages program theft by embedding Java programs with a digital watermark. Embedding a program developer's copyright notation as ...

Akito Monden; Hajimu Iida; Ken-ichi Matsumoto; Koji Torii; Katsuro Inoue



A Method for Optimizing Waste Management and Disposal Practices Using a Group-Based Uncertainty Model for the Analysis of Characterization Data - 13191  

SciTech Connect

It is a universal requirement for the characterization of radioactive waste that the consignor shall calculate and report a Total Measurement Uncertainty (TMU) value associated with each of the measured quantities, such as nuclide activity. For non-destructive assay systems, the TMU analysis is typically performed on an individual-container basis. However, in many cases the waste consignor treats, transports, stores and disposes of containers in groups, for example by over-packing smaller containers into a larger container or emplacing containers into groups for final disposal. The current standard practice for container-group data analysis is usually to treat each container as independent and uncorrelated and to use a simple summation/averaging method (or in some cases summation of TMU in quadrature) to define the overall characteristics and associated uncertainty of the container group. In reality, many groups of containers are assayed on the same system, so there will be a large degree of co-dependence in the individual uncertainty elements. Many uncertainty terms may be significantly reduced when addressing issues such as source position and variability in matrix contents over large populations. The systematic terms encompass both inherently 'two-directional' random effects (e.g. variation of source position) and other terms that are 'one-directional', i.e. designed to account for potential sources of bias. An analysis has been performed with population groups from a variety of non-destructive assay platforms in order to define a quantitative mechanism for waste consignors to determine the overall TMU for batches of containers that have been assayed on the same system. (authors)

Simpson, A.; Clapham, M.; Lucero, R.; West, J. [Pajarito Scientific Corporation, 2976 Rodeo Park Drive East, Santa Fe, NM 87505 (United States)]
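The group-level uncertainty issue raised in the abstract above can be sketched in a few lines. This is a generic illustration of combining correlated and uncorrelated uncertainty components, not the authors' actual model; the function name and all numbers are hypothetical.

```python
import math

def group_tmu(activities, u_random, u_systematic):
    """Total activity and combined uncertainty for a container group.

    Uncorrelated (random) per-container components add in quadrature;
    systematic components shared by containers assayed on the same
    system are fully correlated, so they add linearly before being
    combined with the random part.
    """
    total = sum(activities)
    u_rand = math.sqrt(sum(u ** 2 for u in u_random))  # independent terms
    u_sys = sum(u_systematic)                          # fully correlated terms
    return total, math.hypot(u_rand, u_sys)

# Three containers assayed on the same system (illustrative numbers)
total, tmu = group_tmu([10.0, 12.0, 8.0], [1.0, 1.2, 0.8], [0.5, 0.6, 0.4])
```

A naive quadrature sum over every term would understate the group TMU whenever shared systematic terms dominate, which is the co-dependence the abstract describes.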



Evaluation of agricultural best-management practices in the Conestoga River headwaters, Pennsylvania; methods of data collection and analysis and description of study areas  

USGS Publications Warehouse

The U.S. Geological Survey is conducting a water quality study as part of the nationally implemented Rural Clean Water Program in the headwaters of the Conestoga River, Pennsylvania. The study, which began in 1982, was designed to determine the effect of agricultural best management practices on surface--and groundwater quality. The study was concentrated in four areas within the intensively farmed, carbonate rock terrane located predominately in Lancaster County, Pennsylvania. These areas were divided into three monitoring components: (1) a Regional study area (188 sq mi): (2) a Small Watershed study area (5.82 sq mi); and (3) two field site study areas, Field-Site 1 (22.1 acres) and Field 2 (47.5 acres). The type of water quality data and the methods of data collection and analysis are presented. The monitoring strategy and description of the study areas are discussed. The locations and descriptions for all data collection locations at the four study areas are provided. (USGS)

Chichester, Douglas C.



The LMDI approach to decomposition analysis: a practical guide  

Microsoft Academic Search

In a recent study, Ang (Energy Policy 32 (2004)) compared various index decomposition analysis methods and concluded that the logarithmic mean Divisia index method is the preferred one. Since the literature on the method tends to be either too technical or too specific for most potential users, this paper provides a practical guide that includes the general formulation process and summary tables ...

B. W. Ang
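For orientation, the core formula behind the method discussed above is the standard additive LMDI-I decomposition in its textbook form (not quoted from Ang's guide):

```latex
% Additive LMDI-I decomposition of an aggregate
%   V = \sum_i x_{1,i} x_{2,i} \cdots x_{n,i}
% between a base period 0 and a final period T.
% Contribution of factor x_k to \Delta V = V^T - V^0:
\Delta V_{x_k} = \sum_i L\!\left(V_i^T, V_i^0\right)
                 \ln\frac{x_{k,i}^T}{x_{k,i}^0},
\qquad
L(a,b) = \begin{cases}
  \dfrac{a-b}{\ln a - \ln b}, & a \neq b,\\[4pt]
  a, & a = b.
\end{cases}
```

Because L is the logarithmic mean, the factor contributions sum exactly to the total change with no unexplained residual, which is the property behind the method's preferred status.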



The Sherlock Holmes method in clinical practice.  


This article lists the integral elements of the Sherlock Holmes method, which is based on the intelligent collection of information through detailed observation, careful listening and thorough examination. The information thus obtained is analyzed to develop the main and alternative hypotheses, which are shaped during the deductive process until the key leading to the solution is revealed. The Holmes investigative method applied to clinical practice highlights the advisability of having physicians reason through and seek out the causes of the disease with the data obtained from acute observation, a detailed review of the medical history and careful physical examination. PMID:24457141

Sopeña, B



Visionlearning: Research Methods: The Practice of Science  

NSDL National Science Digital Library

This instructional module introduces four types of research methods: experimentation, description, comparison, and modeling. It was developed to help learners understand that the classic definition of the "scientific method" does not capture the dynamic nature of science investigation. As learners explore each methodology, they develop an understanding of why scientists use multiple methods to gather data and develop hypotheses. It is appropriate for introductory physics courses and for teachers seeking content support in research practices. Editor's Note: Secondary students often cling to the notion that scientific research follows a stock, standard "scientific method". They may be unaware of the differences between experimental research, correlative studies, observation, and computer-based modeling research. In this resource, they can glimpse each methodology in the context of a real study done by respected scientists. This resource is part of Visionlearning, an award-winning set of classroom-tested modules for science education.

Carpi, Anthony; Egger, Anne


Practical Considerations for Using Exploratory Factor Analysis in Educational Research  

ERIC Educational Resources Information Center

The uses and methodology of factor analysis are widely debated and discussed, especially the issues of rotational use, methods of confirmatory factor analysis, and adequate sample size. The variety of perspectives and often conflicting opinions can lead to confusion among researchers about best practices for using factor analysis. The focus of the…

Beavers, Amy S.; Lounsbury, John W.; Richards, Jennifer K.; Huck, Schuyler W.; Skolits, Gary J.; Esquivel, Shelley L.



Practical method for balancing airplane moments  

NASA Technical Reports Server (NTRS)

The present contribution is the sequel to a paper written by Messrs. R. Fuchs, L. Hopf, and H. Hamburger, and proposes to show that the methods therein contained can be practically utilized in computations. Furthermore, the calculations leading up to the diagram of moments for three airplanes, whose performance in war service gave reason for complaint, are analyzed. Finally, it is shown what conclusions can be drawn from the diagram of moments with regard to the defects in these planes and what steps may be taken to remedy them.

Hamburger, H



An Online Forum As a Qualitative Research Method: Practical Issues  

PubMed Central

Background Despite positive aspects of online forums as a qualitative research method, very little is known about the practical issues involved in using online forums for data collection, especially for a qualitative research project. Objectives The purpose of this paper is to describe the practical issues that the researchers encountered in implementing an online forum as a qualitative component of a larger study on cancer pain experience. Method Throughout the study process, the research staff recorded issues ranging from minor technical problems to serious ethical dilemmas as they arose and wrote memos about them. The memos and written records of discussions were reviewed and analyzed using the content analysis suggested by Weber. Results Two practical issues related to credibility were identified: a high response and retention rate and automatic transcripts. An issue related to dependability was the participants' easy forgetfulness. The issues related to confirmability were difficulties in theoretical saturation and unstandardized computer and Internet jargon. A security issue related to hacking attempts was noted as well. Discussion The analysis of these issues suggests several implications for future researchers who want to use online forums as a qualitative data collection method. PMID:16849979

Im, Eun-Ok; Chee, Wonshik



Formal Fault Tree Analysis - Practical Experiences  

Microsoft Academic Search

Safety is an important requirement for many modern systems. To ensure the safety of complex critical systems, well-known safety analysis methods have been formalized. This holds in particular for automation systems and transportation systems. In this paper we present the formalization of one of the most widespread safety analysis methods: fault tree analysis (FTA). Formal FTA allows one to reason rigorously ...

Frank Ortmeier; Gerhard Schellhorn



A collection of research reporting, theoretical analysis, and practical applications in science education: Examining qualitative research methods, action research, educator-researcher partnerships, and constructivist learning theory  

NASA Astrophysics Data System (ADS)

Educator-researcher partnerships are increasingly being used to improve the teaching of science. Chapter 1 provides a summary of the literature concerning partnerships, and examines the justification of qualitative methods in studying these relationships. It also justifies the use of Participatory Action Research (PAR). Empirically-based studies of educator-researcher partnership relationships are rare despite investments in their implementation by the National Science Foundation (NSF) and others. Chapter 2 describes a qualitative research project in which participants in an NSF GK-12 fellowship program were studied using informal observations, focus groups, personal interviews, and journals to identify and characterize the cultural factors that influenced the relationships between the educators and researchers. These factors were organized into ten critical axes encompassing a range of attitudes, behaviors, or values defined by two stereotypical extremes. These axes were: (1) Task Dictates Context vs. Context Dictates Task; (2) Introspection vs. Extroversion; (3) Internal vs. External Source of Success; (4) Prior Planning vs. Implementation Flexibility; (5) Flexible vs. Rigid Time Sense; (6) Focused Time vs. Multi-tasking; (7) Specific Details vs. General Ideas; (8) Critical Feedback vs. Encouragement; (9) Short Procedural vs. Long Content Repetition; and (10) Methods vs. Outcomes are Well Defined. Another ten important stereotypical characteristics, which did not fit the structure of an axis, were identified and characterized. The educator stereotypes were: (1) Rapport/Empathy; (2) Like Kids; (3) People Management; (4) Communication Skills; and (5) Entertaining. The researcher stereotypes were: (1) Community Collaboration; (2) Focus Intensity; (3) Persistent; (4) Pattern Seekers; and (5) Curiosity/Skeptical. Chapter 3 summarizes the research presented in chapter 2 into a practical guide for participants and administrators of educator-researcher partnerships. 
Understanding how to identify and evaluate constructivist lessons is the first step in promoting and improving constructivism in teaching. Chapter 4 summarizes a theoretically-generated series of practical criteria that define constructivism: (1) Eliciting Prior Knowledge, (2) Creating Cognitive Dissonance, (3) Application of New Knowledge with Feedback, and (4) Reflection on Learning, or Metacognition. These criteria can be used by any practitioner to evaluate the level of constructivism used in a given lesson or activity.

Hartle, R. Todd


Usability Inspection Methods after 15 Years of Research and Practice  

E-print Network

What has happened to these methods? Did they further evolve? Is there evidence in the research literature of their use in practice? ABSTRACT: Usability inspection methods, such as heuristic evaluation

Novick, David G.



Microsoft Academic Search

Stereological principles provide efficient and reliable tools for the determination of quantitative parameters of tissue structure on sections. Some principles which allow the estimation of volumetric ratios, surface areas, surface-to-volume ratios, thicknesses of tissue or cell sheets, and the number of structures are reviewed and presented in general form; means for their practical application in electron microscopy are outlined.




Science Teaching Methods: A Rationale for Practices  

ERIC Educational Resources Information Center

This article is a version of the talk given by Jonathan Osborne as the Association for Science Education (ASE) invited lecturer at the National Science Teachers' Association Annual Convention in San Francisco, USA, in April 2011. The article provides an explanatory justification for teaching about the practices of science in school science that…

Osborne, Jonathan



Method of analysis at the U.S. Geological Survey California Water Science Center, Sacramento Laboratory - determination of haloacetic acid formation potential, method validation, and quality-control practices  

USGS Publications Warehouse

An analytical method for the determination of haloacetic acid formation potential of water samples has been developed by the U.S. Geological Survey California Water Science Center Sacramento Laboratory. The haloacetic acid formation potential is measured by dosing water samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine. The haloacetic acids formed are bromochloroacetic acid, bromodichloroacetic acid, dibromochloroacetic acid, dibromoacetic acid, dichloroacetic acid, monobromoacetic acid, monochloroacetic acid, tribromoacetic acid, and trichloroacetic acid. They are extracted, methylated, and then analyzed using a gas chromatograph equipped with an electron capture detector. Method validation experiments were performed to determine the method accuracy, precision, and detection limit for each of the compounds. Method detection limits for these nine haloacetic acids ranged from 0.11 to 0.45 microgram per liter. Quality-control practices include the use of blanks, quality-control samples, calibration verification standards, surrogate recovery, internal standard, matrix spikes, and duplicates.
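The abstract reports method detection limits (0.11 to 0.45 microgram per liter) but not the formula behind them. A common approach for this kind of validation, and an assumption here rather than something stated in the report, is the EPA-style MDL = t x s computed from replicate low-level spikes. A minimal sketch in Python with hypothetical replicate data:

```python
import statistics

# Hypothetical replicate spike results (micrograms per liter) for one
# haloacetic acid; these values are illustrative, not from the report.
replicates = [0.21, 0.18, 0.25, 0.20, 0.23, 0.19, 0.22]

# EPA-style method detection limit: MDL = t * s, where t is the one-tailed
# 99% Student's t for n-1 degrees of freedom (3.143 for n = 7 replicates)
# and s is the sample standard deviation of the replicates.
t_99 = 3.143
mdl = t_99 * statistics.stdev(replicates)
```

The same calculation would be repeated per compound, which is one plausible way a per-analyte range of detection limits like the one reported arises.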

Zazzi, Barbara C.; Crepeau, Kathryn L.; Fram, Miranda S.; Bergamaschi, Brian A.



3D Scanning Technology as a Standard Archaeological Tool for Pottery Analysis: Practice and Theory  

E-print Network

This method is presented as a practical and reliable tool in archaeological research. Keywords: 3D pottery analysis. As a practical tool to accompany and serve archaeological projects, the technology did not reach beyond its embryonic stage.


Airphoto analysis of erosion control practices  

NASA Technical Reports Server (NTRS)

The Universal Soil Loss Equation (USLE) is a widely accepted tool for erosion prediction and conservation planning. In this study, airphoto analysis of color and color infrared 70 mm photography at a scale of 1:60,000 was used to determine the erosion control practice factor in the USLE. Information about contour tillage, contour strip cropping, and grass waterways was obtained from aerial photography for Pheasant Branch Creek watershed in Dane County, Wisconsin.
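The USLE itself is a simple product of factors, A = R * K * LS * C * P, where P is the erosion control practice factor that the study above estimates from airphoto interpretation. A minimal sketch with hypothetical factor values (the values are illustrative, not from the Pheasant Branch Creek study):

```python
def usle_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation: predicted average annual soil loss.

    R: rainfall erosivity, K: soil erodibility, LS: slope length/steepness,
    C: cover management, P: erosion control practice factor.
    """
    return R * K * LS * C * P

# Hypothetical field: P = 1.0 means no conservation practice; a contour
# strip cropping practice identified on airphotos might warrant P = 0.5.
base = usle_soil_loss(R=150, K=0.3, LS=1.2, C=0.25, P=1.0)
contoured = usle_soil_loss(R=150, K=0.3, LS=1.2, C=0.25, P=0.5)
```

Because the equation is multiplicative, halving P halves the predicted loss, which is why mapping the practice factor from photography feeds directly into conservation planning.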

Morgan, K. M.; Morris-Jones, D. R.; Lee, G. B.; Kiefer, R. W.



A Practical Guide to Wavelet Analysis  

Microsoft Academic Search

A practical step-by-step guide to wavelet analysis is given, with examples taken from time series of the El Niño-Southern Oscillation (ENSO). The guide includes a comparison to the windowed Fourier transform, the choice of an appropriate wavelet basis function, edge effects due to finite-length time series, and the relationship between wavelet scale and Fourier frequency. New statistical significance tests for
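The core computation such a guide walks through, convolving the series with scaled wavelets in the frequency domain, can be sketched as follows. This is a minimal Python version using a Morlet mother wavelet; the normalization follows the common Torrence and Compo conventions, but treat the details as assumptions rather than the paper's exact code:

```python
import numpy as np

def morlet_cwt(x, dt, scales, w0=6.0):
    """Continuous wavelet transform of x via frequency-domain convolution
    with a Morlet wavelet (nondimensional frequency w0)."""
    n = len(x)
    xf = np.fft.fft(x)
    omega = 2 * np.pi * np.fft.fftfreq(n, dt)
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # Fourier transform of the normalized Morlet wavelet at scale s;
        # the (omega > 0) factor keeps only positive frequencies.
        psi_hat = (np.pi ** -0.25) * np.sqrt(2 * np.pi * s / dt) \
                  * np.exp(-0.5 * (s * omega - w0) ** 2) * (omega > 0)
        out[i] = np.fft.ifft(xf * psi_hat)
    return out

# Example: wavelet power of a 1 Hz sine concentrates near the matching scale
# (for w0 = 6 the Fourier period is about 1.03 times the scale).
t = np.arange(0, 10, 0.01)
x = np.sin(2 * np.pi * 1.0 * t)
scales = np.linspace(0.1, 2.0, 40)
W = morlet_cwt(x, dt=0.01, scales=scales)
peak_scale = scales[np.argmax((np.abs(W) ** 2).mean(axis=1))]
```

The edge effects the guide discusses show up here as reduced power near the ends of the series; a "cone of influence" mask would normally be applied before interpreting those regions.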

Christopher Torrence; Gilbert P. Compo



Qualitative data analysis: conceptual and practical considerations.  


Qualitative inquiry requires that collected data is organised in a meaningful way, and this is referred to as data analysis. Through analytic processes, researchers turn what can be voluminous data into understandable and insightful analysis. This paper sets out the different approaches that qualitative researchers can use to make sense of their data including thematic analysis, narrative analysis, discourse analysis and semiotic analysis and discusses the ways that qualitative researchers can analyse their data. I first discuss salient issues in performing qualitative data analysis, and then proceed to provide some suggestions on different methods of data analysis in qualitative research. Finally, I provide some discussion on the use of computer-assisted data analysis. PMID:19642962

Liamputtong, Pranee



Scenistic Methods for Training: Applications and Practice  

ERIC Educational Resources Information Center

Purpose: This paper aims to complement an earlier article (2010) in "Journal of European Industrial Training" in which the description and theory bases of scenistic methods were presented. This paper also offers a description of scenistic methods and information on theory bases. However, the main thrust of this paper is to describe, give suggested…

Lyons, Paul R.



Courage and nursing practice: a theoretical analysis.  


This article aims to deepen the understanding of courage through a theoretical analysis of classical philosophers' work and a review of published and unpublished empirical research on courage in nursing. The authors sought answers to questions regarding how courage is understood from a philosophical viewpoint and how it is expressed in nursing actions. Four aspects were identified as relevant to a deeper understanding of courage in nursing practice: courage as an ontological concept, a moral virtue, a property of an ethical act, and a creative capacity. The literature review shed light on the complexity of the concept of courage and revealed some lack of clarity in its use. Consequently, if courage is to be used consciously to influence nurses' ethical actions it seems important to recognize its specific features. The results suggest it is imperative to foster courage among nurses and student nurses to prepare them for ethical, creative action and further the development of professional nursing practices. PMID:20801958

Lindh, Inga-Britt; Barbosa da Silva, António; Berg, Agneta; Severinsson, Elisabeth



Aircraft accidents : method of analysis  

NASA Technical Reports Server (NTRS)

This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)



[Theater of Life: theory, method and practice].  


Theater of Life is an educational model that integrates theater theories and techniques as a strategy for health education. Theater as an educational technique has become a useful tool in the health context. In this work the author discusses the role of social change as an important element in health education and suggests the use of theatrical techniques for its promotion. The author also describes the different approaches to popular theater and popular education incorporated in this model: Theatre of the Oppressed by A. Boal, Popular Education by P. Freire, Poor Theater by J. Grotowski, and Education for Peace by C. Beristain and P. Cascón, and explains the basic principles of each. In the methodology section the author explains the steps for implementing the strategy: solidarity and connection games, the storytelling technique and script development, presentation, and forum. In the practice section the author shares the process of model development and the significant events that contributed to its elaboration. PMID:10761208

Santiago, L E








Methods of Spectral Analysis.  

National Technical Information Service (NTIS)

CONTENTS (Translations of Chapters 20 and 21 of 'Analysis of Luminescence', Section 5, Moscow Univ. Pub. House, 1962). Chapter 20, CHEMICAL ANALYSIS OF LUMINESCENCE: Excitation and recording of radiance during the qualitative and quantitative analysis of ...

V. L. Levshin



Practical reconstruction method for bioluminescence tomography  

NASA Astrophysics Data System (ADS)

Bioluminescence tomography (BLT) is used to localize and quantify bioluminescent sources in a small living animal. By advancing bioluminescent imaging to a tomographic framework, it helps to diagnose diseases, monitor therapies and facilitate drug development. In this paper, we establish a direct linear relationship between measured surface photon density and an unknown bioluminescence source distribution by using a finite-element method based on the diffusion approximation to the photon propagation in biological tissue. We develop a novel reconstruction algorithm to recover the source distribution. This algorithm incorporates a priori knowledge to define the permissible source region in order to enhance numerical stability and efficiency. Simulations with a numerical mouse chest phantom demonstrate the feasibility of the proposed BLT algorithm and reveal its performance in terms of source location, density, and robustness against noise. Lastly, BLT experiments are performed to identify the location and power of two light sources in a physical mouse chest phantom.
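The role of the permissible source region can be illustrated with a toy linear version of the inverse problem. In this Python sketch the system matrix is random (in the paper it comes from a finite-element model of photon diffusion), and all dimensions and node indices are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for the BLT forward model: surface measurements b are a
# linear map A of the unknown internal source distribution w.
n_meas, n_nodes = 30, 20
A = rng.random((n_meas, n_nodes))
w_true = np.zeros(n_nodes)
w_true[[3, 4]] = [2.0, 1.0]          # two point-like sources
b = A @ w_true                       # noise-free surface photon density

# A priori knowledge defines a "permissible source region": solving only
# for those nodes shrinks the unknowns and stabilizes the reconstruction.
permissible = np.arange(8)           # assume sources lie within nodes 0..7
sol = np.linalg.lstsq(A[:, permissible], b, rcond=None)[0]
w_hat = np.zeros(n_nodes)
w_hat[permissible] = sol
```

With noise-free data and the true sources inside the permissible region, the restricted least-squares solve recovers the source exactly; the paper's contribution is making this kind of restriction work for the realistic, noisy, diffusion-based case.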

Cong, Wenxiang; Wang, Ge; Kumar, Durairaj; Liu, Yi; Jiang, Ming; Wang, Lihong V.; Hoffman, Eric A.; McLennan, Geoffrey; McCray, Paul B.; Zabner, Joseph; Cong, Alexander



Practical aspects of spatially high accurate methods  

NASA Technical Reports Server (NTRS)

The computational qualities of high order spatially accurate methods for the finite volume solution of the Euler equations are presented. Two dimensional essentially non-oscillatory (ENO), k-exact, and 'dimension by dimension' ENO reconstruction operators are discussed and compared in terms of reconstruction and solution accuracy, computational cost and oscillatory behavior in supersonic flows with shocks. Inherent steady state convergence difficulties are demonstrated for adaptive stencil algorithms. An exact solution to the heat equation is used to determine reconstruction error, and the computational intensity is reflected in operation counts. Standard MUSCL differencing is included for comparison. Numerical experiments presented include the Ringleb flow for numerical accuracy and a shock reflection problem. A vortex-shock interaction demonstrates the ability of the ENO scheme to excel in simulating unsteady high-frequency flow physics.

Godfrey, Andrew G.; Mitchell, Curtis R.; Walters, Robert W.



Optimizing Distributed Practice: Theoretical Analysis and Practical Implications  

ERIC Educational Resources Information Center

More than a century of research shows that increasing the gap between study episodes using the same material can enhance retention, yet little is known about how this so-called distributed practice effect unfolds over nontrivial periods. In two three-session laboratory studies, we examined the effects of gap on retention of foreign vocabulary,…

Cepeda, Nicholas J.; Coburn, Noriko; Rohrer, Doug; Wixted, John T.; Mozer, Michael C.; Pashler, Harold



Practical Issues in Component Aging Analysis  

SciTech Connect

This paper examines practical issues in the statistical analysis of component aging data. These issues center on the stochastic process chosen to model component failures. The two stochastic processes examined are repair same as new, leading to a renewal process, and repair same as old, leading to a nonhomogeneous Poisson process. Under the first assumption, times between failures can be treated as statistically independent observations from a stationary process. The common distribution of the times between failures is called the renewal distribution. Under the second process, the times between failures will not be independently and identically distributed, and one cannot simply fit a renewal distribution to the cumulative failure times or the times between failures. The paper illustrates how the assumption made regarding the repair process is crucial to the analysis. Besides the choice of stochastic process, other issues that are discussed include qualitative graphical analysis and simple nonparametric hypothesis tests to help judge which process appears more appropriate. Numerical examples are presented to illustrate the issues discussed in the paper.
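The distinction between the two repair assumptions is easy to see in a small simulation. The Python sketch below (hypothetical parameters, not the paper's numerical examples) generates failure times from a renewal process and from a power-law nonhomogeneous Poisson process; under the latter, times between failures trend downward, so fitting a single renewal distribution to them would be misleading:

```python
import random

random.seed(1)

def renewal_process(n, mtbf=100.0):
    """Repair same-as-new: cumulative failure times with i.i.d. exponential
    gaps drawn from the renewal distribution."""
    times, t = [], 0.0
    for _ in range(n):
        t += random.expovariate(1.0 / mtbf)
        times.append(t)
    return times

def power_law_nhpp(n, a=1e-4, b=2.0):
    """Repair same-as-old: power-law NHPP with Lambda(t) = a * t**b,
    simulated by inverse time transformation of a unit-rate Poisson process."""
    times, s = [], 0.0
    for _ in range(n):
        s += random.expovariate(1.0)          # unit-rate Poisson arrivals
        times.append((s / a) ** (1.0 / b))    # T = Lambda^{-1}(S)
    return times

def mean_gap(times, lo, hi):
    gaps = [t2 - t1 for t1, t2 in zip(times, times[1:])]
    return sum(gaps[lo:hi]) / (hi - lo)

nhpp = power_law_nhpp(200)
early, late = mean_gap(nhpp, 0, 50), mean_gap(nhpp, 149, 199)
# For b > 1 the failure intensity grows with time, so later gaps are
# systematically shorter than earlier ones -- the gaps are not i.i.d.
```

A qualitative check like comparing early and late mean gaps is a simple version of the graphical diagnostics the paper recommends for judging which process is appropriate.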

Dana L. Kelly; Andrei Rodionov; Jens Uwe-Klugel



Methods of gas analysis  

SciTech Connect

Methods for sampling, calibrating, and analyzing for helium, impurities in helium, and natural gases are described. These methods were developed by the US Bureau of Mines to assist in the processing of natural gas for helium recovery. 35 refs.

Emerson, D.E. (Bureau of Mines, Amarillo, TX (United States))



Multi-criteria decision analysis: Limitations, pitfalls, and practical difficulties  

SciTech Connect

The 2002 Winter Olympics women's figure skating competition is used as a case study to illustrate some of the limitations, pitfalls, and practical difficulties of Multi-Criteria Decision Analysis (MCDA). The paper compares several widely used models for synthesizing the multiple attributes into a single aggregate value. The various MCDA models can provide conflicting rankings of the alternatives for a common set of information even under states of certainty. Analysts involved in MCDA need to deal with the following challenging tasks: (1) selecting an appropriate analysis method, and (2) properly interpreting the results. An additional trap is the availability of software tools that implement specific MCDA models that can beguile the user with quantitative scores. These conclusions are independent of the decision domain and they should help foster better MCDA practices in many fields including systems engineering trade studies.
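The central claim, that widely used aggregation models can rank the same alternatives differently from identical information, takes only a few lines to demonstrate. The Python sketch below uses hypothetical scores and weights, not the figure-skating data, and compares the additive (weighted sum) and multiplicative (weighted product) models:

```python
# Two alternatives scored on two criteria, already normalized to [0, 1].
scores = {
    "A": [0.95, 0.40],   # strong on criterion 1, weak on criterion 2
    "B": [0.60, 0.70],   # balanced
}
weights = [0.5, 0.5]

def weighted_sum(vals):
    """Additive MCDA model: sum of weight * score."""
    return sum(w * v for w, v in zip(weights, vals))

def weighted_product(vals):
    """Multiplicative MCDA model: product of score ** weight."""
    out = 1.0
    for w, v in zip(weights, vals):
        out *= v ** w
    return out

rank_sum = max(scores, key=lambda k: weighted_sum(scores[k]))
rank_prod = max(scores, key=lambda k: weighted_product(scores[k]))
```

The additive model rewards A's high score on one criterion, while the multiplicative model penalizes its weakness, so the two models disagree on the winner, which is exactly the interpretation trap the paper warns about when software presents either score as definitive.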

Kujawski, Edouard



The practice of formal methods in safety-critical systems  

Microsoft Academic Search

By describing several industrial-scale applications of formal methods, this paper intends to demonstrate that formal methods for software development and safety analysis are increasingly adopted in the safety-critical systems sector. The benefits and limitations of using formal methods are described, and the problems of developing software for safety-critical systems are analysed. Keywords: formal methods, functional requirements analysis, safety analysis, safety critical systems.

Shaoying Liu; Bruno Dutertre



Formal Methods for Verification of Clinical Practice Guidelines  

E-print Network

In Section 4, we briefly discuss formal methods in relation to protocol development and compliance checking, as described above. Finally, in Section 5, we discuss the role of formal methods in relation to medical practice guidelines.

Groot, Perry


A Sampling Method Focusing on Practicality

E-print Network

Daniel Gracia Pérez, CEA. ...with such a large number of regions, (3) a budget-based method for jointly considering warm-up and sampling costs. Abstract: In the past few years, several research works have demonstrated that sampling can drastically

Paris-Sud XI, Université de


Practical Modelling and Control System Design Methods for CAE Systems  

Microsoft Academic Search

In this paper, practical modelling and control system design methods for CAE (Computer Aided Engineering) systems are presented. The Partial Model Matching method (PMM) is effective in designing conventional PID (Proportional, Integral, Derivative) control systems used widely in industrial processes, and some extended types of PID control systems such as decoupling PID control systems and digital PID control systems. In

Yutaka Iino; Takashi Shigemasa



Use of Agile Methods and Practices in the Philippines  

Microsoft Academic Search

Agile methods are increasingly gaining attention in many developed countries; however, there is a dearth of empirical studies showing their successful use in developing nations. This needs to be addressed because most software offshore outsourcing destinations are in the developing world. This paper describes experiences in the use of agile methods and practices by software development firms in the Philippines,

Raymund Sison; Theresa Yang



Optimizing distributed practice: theoretical analysis and practical implications.  


More than a century of research shows that increasing the gap between study episodes using the same material can enhance retention, yet little is known about how this so-called distributed practice effect unfolds over nontrivial periods. In two three-session laboratory studies, we examined the effects of gap on retention of foreign vocabulary, facts, and names of visual objects, with test delays up to 6 months. An optimal gap improved final recall by up to 150%. Both studies demonstrated nonmonotonic gap effects: Increases in gap caused test accuracy to initially sharply increase and then gradually decline. These results provide new constraints on theories of spacing and confirm the importance of cumulative reviews to promote retention over meaningful time periods. PMID:19439395

Cepeda, Nicholas J; Coburn, Noriko; Rohrer, Doug; Wixted, John T; Mozer, Michael C; Pashler, Harold



Applying community-oriented primary care methods in British general practice: a case study.  

PubMed Central

BACKGROUND: The '75 and over' assessments built into the 1990 contract for general practice have failed to enthuse primary care teams or make a significant impact on the health of older people. Alternative methods for improving the health of older people living at home are being sought. AIM: To test the feasibility of applying community-oriented primary care methodology to a relatively deprived sub-population of older people in a relatively deprived area. DESIGN OF STUDY: A combination of developmental and triangulation approaches to data analysis. SETTING: Four general practices in an inner London borough. METHOD: A community-oriented primary care approach was used to initiate innovative care for older people, supported financially by the health authority and practically by primary care academics. RESULTS: All four practices identified problems needing attention in the older population, developed different projects focused on particular needs among older people, and tested them in practice. Patient and public involvement were central to the design and implementation processes in only one practice. Innovations were sustained in only one practice, but some were adopted by a primary care group and others extended to a wider group of practices by the health authority. CONCLUSION: A modified community-oriented primary care approach can be used in British general practice, and changes can be promoted that are perceived as valuable by planning bodies. However, this methodology may have more impact at primary care trust level than at practice level. PMID:12171223

Iliffe, Steve; Lenihan, Penny; Wallace, Paul; Drennan, Vari; Blanchard, Martin; Harris, Andrew



Supplementary Methods Sequence Analysis  

E-print Network

calculated divergence estimates for all three datasets using the maximum likelihood method of Goldman; dosage lethality, dosage suppression, chemical lethality, and chemical rescue (Breitkreutz et al. 2003); statistics were calculated using the Pajek software package (Batagelj and Mrvar 1998). Three measures

Hahn, Matthew


Research in dental practice: a 'SWOT' analysis.  


Most dental treatment, in most countries, is carried out in general dental practice. There is therefore a potential wealth of research material, although clinical evaluations have generally been carried out on hospital-based patients. Many types of research, such as clinical evaluations and assessments of new materials, may be appropriate to dental practice. Principal problems are that dental practices are established to treat patients efficiently and to provide an income for the staff of the practice. Time spent on research therefore cannot be used for patient treatment, so there are cost implications. Critics of practice-based research have commented on the lack of calibration of operative diagnoses and other variables; however, this variability is the stuff of dental practice, the real-world situation. Many of the difficulties in carrying out research in dental practice may be overcome. For the enlightened, it may be possible to turn observations based on the volume of treatment carried out in practice into robust, clinically related and relevant research projects based in the real world of dental practice. PMID:11928346

Burke, F J T; Crisp, R J; McCord, J F



Practical challenges in the method of controlled Lagrangians  

NASA Astrophysics Data System (ADS)

The method of controlled Lagrangians is an energy shaping control technique for underactuated Lagrangian systems. Energy shaping control design methods are appealing as they retain the underlying nonlinear dynamics and can provide stability results that hold over a larger domain than can be obtained using linear design and analysis. The objective of this dissertation is to identify the control challenges in applying the method of controlled Lagrangians to practical engineering problems and to suggest ways to enhance the closed-loop performance of the controller. This dissertation describes a procedure for incorporating artificial gyroscopic forces in the method of controlled Lagrangians. Allowing these energy-conserving forces in the closed-loop system provides greater freedom in tuning closed-loop system performance and expands the class of eligible systems. In energy shaping control methods, physical dissipation terms that are neglected in the control design may enter the system in a way that can compromise stability. This is well illustrated through the "ball on a beam" example. The effect of physical dissipation on the closed-loop dynamics is studied in detail and conditions for stability in the presence of natural damping are discussed. The control technique is applied to the classic "inverted pendulum on a cart" system. A nonlinear controller is developed which asymptotically stabilizes the inverted equilibrium at a specific cart position for the conservative dynamic model. The region of attraction contains all states for which the pendulum is elevated above the horizontal plane. Conditions for asymptotic stability in the presence of linear damping are developed. The nonlinear controller is validated through experiments. Experimental cart damping is best modeled using static and Coulomb friction. Experiments show that static and Coulomb friction degrade the closed-loop performance and induce limit cycles.
A Lyapunov-based switching controller is proposed and successfully implemented to suppress the limit cycle oscillations. The Lyapunov-based controller switches between the energy shaping nonlinear controller, for states away from the equilibrium, and a well-tuned linear controller, for states close to the equilibrium. The method of controlled Lagrangians is applied to vehicle systems with internal moving point mass actuators. Applications of moving mass actuators include certain spacecraft, atmospheric re-entry vehicles, and underwater vehicles. Control design using moving mass actuators is challenging; the system is often underactuated and multibody dynamic models are higher dimensional. We consider two examples to illustrate the application of controlled Lagrangian formulation. The first example is a spinning disk, a simplified, planar version of a spacecraft spin stabilization problem. The second example is a planar, streamlined underwater vehicle.

Chevva, Konda Reddy


That's another story: narrative methods and ethical practice  

PubMed Central

This paper examines the use of case studies in ethics education. While not dismissing their value for specific purposes, the paper shows the limits of their use. While agreeing that case studies are narratives, although rather thin stories, the paper argues that the claim that case studies could represent reality is difficult to sustain. Instead, the paper suggests a way of using stories in ethics teaching that could be more real for students, while also giving them a way of thinking about their own professional practices. The paper shows how the method can be used to develop a more critical and reflective practice for students in the health care professions. Some immediate problems with the method are discussed. Key Words: Case study • narrative • reflexivity • identity • ethical practice PMID:11417029

Carson, A.



Nested Newton's method for ICA and post factor analysis  

Microsoft Academic Search

Two distinct topics are dealt with. First, a new method for independent component analysis (ICA) has been constructed that exploits the invariance of criteria under component-wise scaling, which is intrinsic to ICA. This practical and simple ICA method is called the nested Newton's method. When the number of the channel of observation is less than a certain level, factor analysis

Toshinao Akuzawa



Gait analysis methods in rehabilitation  

Microsoft Academic Search

INTRODUCTION: Brand's four reasons for clinical tests and his analysis of the characteristics of valid biomechanical tests for use in orthopaedics are taken as a basis for determining what methodologies are required for gait analysis in a clinical rehabilitation context. MEASUREMENT METHODS IN CLINICAL GAIT ANALYSIS: The state of the art of optical systems capable of measuring the positions of

Richard Baker; Hugh Williamson; Gait CCRE



Practical Teaching Methods K-6: Sparking the Flame of Learning.  

ERIC Educational Resources Information Center

This book provides state-of-the-art teaching practices and methods, discussing the elements of good teaching in the content areas and including examples from real classrooms and library media centers. Chapters offer reflection exercises, assessment tips specific to each curriculum, and resource lists. Nine chapters examine: (1) "The Premise"…

Wilkinson, Pamela Fannin.; McNutt, Margaret A.; Friedman, Esther S.



E-print Network

METHODS OF PLANKTON INVESTIGATION IN THEIR RELATION TO PRACTICAL PROBLEMS. By JACOB REIGHARD. The total mass of plankton is, in most bodies of water, so great that, in comparison with it, it is customary to neglect the fixed plants along the shore and the animals that they harbor. That the plankton


Traditional Methods for Mineral Analysis  

NASA Astrophysics Data System (ADS)

This chapter describes traditional methods for analysis of minerals involving titrimetric and colorimetric procedures, and the use of ion selective electrodes. Other traditional methods of mineral analysis include gravimetric titration (i.e., insoluble forms of minerals are precipitated, rinsed, dried, and weighed) and redox reactions (i.e., mineral is part of an oxidation-reduction reaction, and product is quantitated). However, these latter two methods will not be covered because they currently are used little in the food industry. The traditional methods that will be described have maintained widespread usage in the food industry despite the development of more modern instrumentation such as atomic absorption spectroscopy and inductively coupled plasma-atomic emission spectroscopy (Chap. 24). Traditional methods generally require chemicals and equipment that are routinely available in an analytical laboratory and are within the experience of most laboratory technicians. Additionally, traditional methods often form the basis for rapid analysis kits (e.g., Quantab® for salt determination) that are increasingly in demand. Procedures for analysis of minerals of major nutritional or food processing concern are used for illustrative purposes. For additional examples of traditional methods refer to references (1-6). Slight modifications of these traditional methods are often needed for specific foodstuffs to minimize interferences or to be in the range of analytical performance. For analytical requirements for specific foods see the Official Methods of Analysis of AOAC International (5) and related official methods (6).

Ward, Robert E.; Carpenter, Charles E.


Intelligent Best Practices Analysis Shahab D. Mohaghegh, Ph.D.  

E-print Network

A new methodology responds to the realities of the new economy that tie the success of oil and gas companies to their performance. The new methodology is named "Intelligent Best Practices Analysis". It incorporates a hybrid form

Mohaghegh, Shahab


[A method for the implementation and promotion of access to comprehensive and complementary primary healthcare practices].  


The rendering of integrated and complementary practices in the Brazilian Unified Health System is fostered to increase the comprehensiveness of care and access to same, though it is a challenge to incorporate them into the services. Our objective is to provide a simple method of implementation of such practices in Primary Healthcare, derived from analysis of experiences in municipalities, using partial results of a master's thesis that employed research-action methodology. The method involves four stages: 1 - definition of a nucleus responsible for implementation and consolidation thereof; 2 - situational analysis, with definition of the existing competent professionals; 3 - regulation, organization of access and legitimation; and 4 - implementation cycle: local plans, mentoring and ongoing education in health. The phases are described, justified and briefly discussed. The method encourages the development of rational and sustainable actions, sponsors participatory management, the creation of comprehensiveness and the broadening of care provided in Primary Healthcare by offering progressive and sustainable comprehensive and complementary practices. PMID:23175308

Santos, Melissa Costa; Tesser, Charles Dalcanale



A Practical Method of Monitoring the Results of Health Care  

PubMed Central

To meet our goal of improving health care through more productive use of the data we are collecting about its delivery, we need to define our concepts of health and quality. The WHO definition of health allows the design of useful functional outcome criteria which give us measurable standards for the outcome of health care. By recording, retrieving, and reviewing pertinent information from the structure and the process of health care for a valid comparison with its outcome, the most effective and efficient health care is identified. A practical system is presented which identifies the better methods of management and produces the motivation for change that results in improved care. The successful use of this system in a private practice supports its universal adaptability for health care providers. The initial encouraging results suggest that future trials in other types of practices will be even more encouraging.

Daugharty, G. D.



Probabilistic methods for rotordynamics analysis  

NASA Technical Reports Server (NTRS)

This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
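As a hedged illustration of the probability-of-instability idea, and not the paper's fast probability integration or adaptive importance sampling methods, a plain Monte Carlo sketch for a single-DOF second-order equation can be written as follows. For m s² + c s + k with m > 0, the Routh-Hurwitz test reduces to c > 0 and k > 0; the parameter distributions below are made-up assumptions:

```python
import random

def unstable(m, c, k):
    # Single-DOF equation m*x'' + c*x' + k*x = 0 is unstable iff a root
    # of m*s^2 + c*s + k = 0 has a nonnegative real part; for m > 0 the
    # Routh-Hurwitz test reduces to c <= 0 or k <= 0.
    return c <= 0.0 or k <= 0.0

def prob_instability(n=100_000, seed=1):
    # Crude Monte Carlo estimate over random damping and stiffness
    # (hypothetical distributions; cross-coupling effects can make the
    # effective damping scatter into negative values).
    random.seed(seed)
    hits = 0
    for _ in range(n):
        c = random.gauss(0.5, 1.0)   # effective damping
        k = random.gauss(4.0, 0.5)   # stiffness
        if unstable(1.0, c, k):
            hits += 1
    return hits / n
```

The paper's point is that brute-force sampling like this is expensive, which motivates the fast probability integration and importance sampling methods it proposes.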

Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.



Best Practices for Convolutional Neural Networks Applied to Visual Document Analysis  

Microsoft Academic Search

Neural networks are a powerful technology for classification of visual inputs arising from documents. However, there is a confusing plethora of different neural network methods that are used in the literature and in industry. This paper describes a set of concrete best practices that document analysis researchers can use to get good results with neural networks. The most important practice

Patrice Y. Simard; David Steinkraus; John C. Platt



Practical Application of Second Law Efficiency Analysis  

E-print Network

… efforts in this century to popularize its practical use (see, for example, the classical engineering thermodynamics texts of Goodenough (3), Keenan (4), and Dodge (5)) have met with limited acceptance. Available energy, henceforth called exergy, is a property which measures an object's maximum capacity to cause change, a capacity which exists because the substance is not in complete, stable equilibrium. (Different authors have presented the concept, exergy, with a variety of names: available...
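The exergy property described above is commonly written, for a steady-flow stream and neglecting kinetic and potential terms, in the standard textbook form below; this form is an assumption from general thermodynamics texts, not quoted from the paper:

```latex
e = (h - h_0) - T_0\,(s - s_0)
```

Here $h$ and $s$ are the stream's specific enthalpy and entropy, and $h_0$, $s_0$, $T_0$ are their values at the dead (environment) state; $e$ vanishes when the substance reaches complete, stable equilibrium with the environment.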

Gaggioli, R. A.; Wepfer, W. J.



Practical Calculation Method of Circulating Current Loss for Large Turbine Generator Designs  

NASA Astrophysics Data System (ADS)

In this paper, the authors describe a practical calculation method of armature strand current distributions and circulating current losses for large turbine generators. A chain of analysis is made up of automatic mesh generation, a quasi three-dimensional linkage flux calculation and an armature strand network calculation. The calculated total armature coil losses for a 200 MVA class turbine generator show good agreement with detailed calculation results obtained by full three-dimensional magnetic field analysis.

Ide, Kazumasa; Takahashi, Kazuhiko; Hattori, Ken'Ichi; Motoi, Naganori; Furukawa, Katsuya; Watanaba, Takashi


Situational Analysis: Centerless Systems and Human Service Practices  

Microsoft Academic Search

Bronfenbrenner's ecological model is a conceptual framework that continues to contribute to human service practices. In the current article, the author describes the possibilities for practice made intelligible by drawing from this framework. She then explores White's “Web of Praxis” model as an important extension of this approach, and proceeds to offer Clarke's “Situational Analysis” as another fruitful tool for

Janet Newbury



The Role of Data Analysis in Inclusion Processes and Practices  

ERIC Educational Resources Information Center

This article outlines BPRS funded investigations into my own school's and other establishments' practices and processes within data analysis, needs identification and tracking of children's academic progress. It describes the evaluation of my school's then current practices and policies and the use of questionnaires and semi-structured interviews…

Bell, Lilian Fleur



A practical gait analysis system using gyroscopes  

Microsoft Academic Search

This study investigated the possibility of using uni-axial gyroscopes to develop a simple portable gait analysis system. Gyroscopes were attached on the skin surface of the shank and thigh segments and the angular velocity of each segment was recorded. Segment inclinations and knee angle were derived from segment angular velocities. The angular signals from a motion analysis
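The deriving-inclination-from-angular-velocity step can be sketched as numerical integration of the gyroscope signal; this is a minimal sketch with hypothetical function names, assuming trapezoidal integration, and it omits the drift handling a real system needs:

```python
def integrate_gyro(omega, dt, theta0=0.0):
    # Trapezoidal integration of sampled angular velocity (rad/s)
    # into segment inclination (rad) relative to the starting pose.
    theta = [theta0]
    for w0, w1 in zip(omega, omega[1:]):
        theta.append(theta[-1] + 0.5 * (w0 + w1) * dt)
    return theta

def knee_angle(thigh_theta, shank_theta):
    # Knee flexion approximated as the difference between the
    # thigh and shank segment inclinations at each sample.
    return [t - s for t, s in zip(thigh_theta, shank_theta)]
```

In practice gyroscope bias makes pure integration drift over time, so a high-pass filter or a per-stride reset would be needed on top of this sketch.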

Kaiyu Tong; Malcolm H Granat



Practical Analysis of Gadget Framework on Android OS  

E-print Network

Practical Analysis of Gadget Framework on Android OS. Student research project (Studienarbeit) within a diploma degree program. Contents include: USB Overview (Topology), Gadgetfs, and Embedded Android.


[Mother Kangaroo Method: an investigation about the domestic practice].  


This is a descriptive study, with a quantitative approach, aiming at acquiring knowledge regarding the domestic practice of the Mother Kangaroo Method. Data were collected from a survey of the parents of premature infants hospitalized in a University Hospital in São Luís, Maranhão State, from May to August, 2005. According to the findings, 100% of the families received training and guidance in the hospital, and in only 53.3% of the cases the mothers were guided. The benefits of the education work developed by the team were confirmed by domestic practice, with 93.3% of the mothers performing the kangaroo position correctly, 86.7% of the babies slightly dressed, 86.7% of the mothers breast feeding in a technically correct manner, and 86.7% without any other items being used. 46.7% of the mothers stay 5 to 8 hours/day with their babies in this position, and 66.7% identified house tasks as the principal obstacle to the practice. Regarding neonatal walk-in unit follow-up, 63.3% of the mothers identified the lack of financial resources to pay for transportation as the main difficulty. The data obtained show that support from the family network and the health team seems to be the best way to guarantee the extension of domestic care. PMID:20169256

de Araújo, Cristiane Luciana; Rios, Cláudia Teresa Frias; dos Santos, Marinese Hermínia; Gonçalves, Anna Paula Ferrario



Groundwater and soil remediation: Practical methods and strategies  

SciTech Connect

The author presents his latest compilation of time- and cost-saving techniques, methods, and strategies for soil and groundwater remediation. Discussions include: in situ remediation; monitoring; designing full-scale treatment systems; trichloroethylene treatment; aquifer restoration; negotiating with regulators; and bioremediation. It covers many subjects, from commentary to advanced technologies. The book focuses on cost savings throughout the text, comparing costs between technologies and presenting practical approaches for the entire remediation process. It aims at an accessible understanding that improves remediation design rather than a rigorous data presentation, and presents an exceptional summary of several studies on bioremediation.

Nyer, E.K. [ed.



Development of a practical costing method for hospitals.  


To realize effective cost control, a practical and accurate cost accounting system is indispensable in hospitals. Among traditional cost accounting systems, volume-based costing (VBC) is the most popular method. In this method, the indirect costs are allocated to each cost object (services or units of a hospital) using a single indicator named a cost driver (e.g., labor hours, revenues or the number of patients). However, this method often produces rough and inaccurate results. The activity-based costing (ABC) method introduced in the mid-1990s can provide more accurate results. With the ABC method, all events or transactions that cause costs are recognized as "activities", and a specific cost driver is prepared for each activity. Finally, the costs of activities are allocated to cost objects by the corresponding cost driver. However, it is much more complex and costly than other traditional cost accounting methods because the data collection for cost drivers is not always easy. In this study, we developed a simplified ABC (S-ABC) costing method to reduce the workload of ABC costing by reducing the number of cost drivers used in the ABC method. Using the S-ABC method, we estimated the cost of laboratory tests, and similarly accurate results were obtained as with the ABC method (the largest difference was 2.64%). Simultaneously, this new method reduces the seven cost drivers used in the ABC method to four. Moreover, we performed an evaluation using other sample data from the physiological laboratory department to certify the effectiveness of this new method. In conclusion, the S-ABC method provides two advantages in comparison to the VBC and ABC methods: (1) it can obtain accurate results, and (2) it is simpler to perform. Once we reduce the number of cost drivers by applying the proposed S-ABC method to the data for the ABC method, we can easily perform the cost accounting using few cost drivers after the second round of costing.
PMID:16498229
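The driver-based allocation common to ABC-style methods can be sketched as follows; the data structures, names, and numbers are hypothetical illustrations, not taken from the study:

```python
def allocate_abc(activity_costs, driver_totals, driver_usage):
    # activity_costs: {activity: total cost of that activity}
    # driver_totals:  {activity: total driver volume across all objects}
    # driver_usage:   {cost_object: {activity: driver volume consumed}}
    # Each cost object receives a share of every activity's cost in
    # proportion to its share of that activity's cost driver.
    allocated = {}
    for obj, usage in driver_usage.items():
        allocated[obj] = sum(
            activity_costs[a] * qty / driver_totals[a]
            for a, qty in usage.items()
        )
    return allocated
```

The S-ABC idea of merging cost drivers corresponds here to collapsing several activities onto one shared driver, shrinking both `driver_totals` and the per-object usage records that must be collected.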

Cao, Pengyu; Toyabe, Shin-Ichi; Akazawa, Kouhei



Comparative Lifecycle Energy Analysis: Theory and Practice.  

ERIC Educational Resources Information Center

Explores the position that more energy is conserved through recycling secondary materials than is generated from municipal solid waste incineration. Discusses one component of a lifecycle analysis--a comparison of energy requirements for manufacturing competing products. Includes methodological issues, energy cost estimates, and difficulties…

Morris, Jeffrey; Canzoneri, Diana



Practical Aspects of Krylov Subspace Iterative Methods in CFD  

NASA Technical Reports Server (NTRS)

Implementation issues associated with the application of Krylov subspace iterative methods, such as Newton-GMRES, are presented within the framework of practical computational fluid dynamic (CFD) applications. This paper categorizes, evaluates, and contrasts the major ingredients (function evaluations, matrix-vector products, and preconditioners) of Newton-GMRES Krylov subspace methods in terms of their effect on the local linear and global nonlinear convergence, memory requirements, and accuracy. The discussion focuses on Newton-GMRES in both a structured multi-zone incompressible Navier-Stokes solver and an unstructured mesh finite-volume Navier-Stokes solver. Approximate versus exact matrix-vector products, effective preconditioners, and other pertinent issues are addressed.
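The approximate matrix-vector product mentioned above, the ingredient that makes Newton-GMRES "matrix-free", can be sketched with a first-order finite difference of the residual; the residual function here is a made-up toy, not a Navier-Stokes residual:

```python
def F(u):
    # Toy nonlinear residual: F(u) = (u0^2 + u1 - 3, u0 + u1^2 - 5).
    return [u[0]**2 + u[1] - 3.0, u[0] + u[1]**2 - 5.0]

def jac_vec_exact(u, v):
    # Exact Jacobian-vector product for the toy F above.
    return [2.0 * u[0] * v[0] + v[1], v[0] + 2.0 * u[1] * v[1]]

def jac_vec_fd(F, u, v, eps=1e-7):
    # Matrix-free approximation J(u) v ~= (F(u + eps*v) - F(u)) / eps,
    # which lets GMRES run without ever forming the Jacobian.
    Fu = F(u)
    Fp = F([ui + eps * vi for ui, vi in zip(u, v)])
    return [(fp - fu) / eps for fp, fu in zip(Fp, Fu)]
```

Inside Newton-GMRES, each Arnoldi step of the inner linear solve calls a product like `jac_vec_fd`, which is why the paper weighs approximate versus exact products against accuracy and convergence.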

Pulliam, Thomas H.; Rogers, Stuart; Barth, Timothy



Theoretical and practical aspects of singularity and eigenmode expansion methods  

NASA Astrophysics Data System (ADS)

The singularity and eigenmode expansion methods which can identify a flying or stationary target from a transient field scattered by a target are analyzed. The basic starting points of the engineering problems solved by mathematical techniques involving computations of the Green's functions in diffraction and potential scattering theory are discussed along with the inverse problem of scattering star-like bodies which can be identified by singularity expansion methods. The unsolved mathematical problems which could aid scientists and engineers in practice include the solution of the Green's function for a convex smooth compact boundary and the information required on the geometry of an obstacle which can be obtained from the location of purely real poles.

Ramm, A. G.



Practical aspects of genome-wide association interaction analysis.  


Large-scale epistasis studies can give new clues to system-level genetic mechanisms and a better understanding of the underlying biology of human complex disease traits. Though many novel methods have been proposed to carry out such studies, so far only a few of them have demonstrated replicable results. Here, we propose a minimal protocol for genome-wide association interaction (GWAI) analysis to identify gene-gene interactions from large-scale genomic data. The different steps of the developed protocol are discussed and motivated, and encompass interaction screening in a hypothesis-free and hypothesis-driven manner. In particular, we examine a wide range of aspects related to epistasis discovery in the context of complex traits in humans, hereby giving practical recommendations for data quality control, variant selection or prioritization strategies and analytic tools, replication and meta-analysis, biological validation of statistical findings and other related aspects. The minimal protocol provides guidelines and attention points for anyone involved in GWAI analysis and aims to enhance the biological relevance of GWAI findings. At the same time, the protocol improves a better assessment of strengths and weaknesses of published GWAI methodologies. PMID:25164382

Gusareva, Elena S; Van Steen, Kristel



Chemical Analysis Modern Instrumentation Methods and Techniques  

E-print Network

Chemical Analysis: Modern Instrumentation Methods and Techniques, Second Edition. Francis Rouessac and Annick Rouessac; translated …

Short, Daniel


Method of photon spectral analysis  


A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) as well as high-energy γ rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.
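A linear least-squares spectral fit of this kind, expressing a measured spectrum as a weighted sum of reference component shapes, can be sketched for two components via the normal equations; the shapes and counts below are hypothetical, not the paper's calibration data:

```python
def lstsq_two_components(y, a, b):
    # Solve min ||y - (x1*a + x2*b)||^2 for component amplitudes
    # (x1, x2) by forming and inverting the 2x2 normal equations.
    aa = sum(ai * ai for ai in a)
    bb = sum(bi * bi for bi in b)
    ab = sum(ai * bi for ai, bi in zip(a, b))
    ay = sum(ai * yi for ai, yi in zip(a, y))
    by = sum(bi * yi for bi, yi in zip(b, y))
    det = aa * bb - ab * ab          # nonzero if shapes are independent
    return ((ay * bb - by * ab) / det, (by * aa - ay * ab) / det)
```

A production fitter would handle many components, counting-statistics weights, and background terms, but the algebra is the same normal-equations idea scaled up.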

Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.



Methods for genetic linkage analysis using trisomies  

SciTech Connect

Certain genetic disorders (e.g. congenital cataracts, duodenal atresia) are rare in the general population, but more common in people with Down's syndrome. We present a method for using individuals with trisomy 21 to map genes for such traits. Our methods are analogous to methods for mapping autosomal dominant traits using affected relative pairs by looking for markers with greater than expected identity-by-descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected reduction to homozygosity in the chromosomes inherited from the non-disjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the gene, a confidence interval for that distance, and methods for computing power and sample sizes. The methods are described in the context of a gene-dosage model for the etiology of the disorder, but can be extended to other models. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers, how to test candidate genes, and how to handle the effect of reduced recombination associated with maternal meiosis I non-disjunction.

Feingold, E.; Lamb, N.E.; Sherman, S.L. [Emory Univ., Atlanta, GA (United States)



MAD Skills: New Analysis Practices for Big Data  

Microsoft Academic Search

As massive data acquisition and storage becomes increasingly affordable, a wide variety of enterprises are employing statisticians to engage in sophisticated data analysis. In this paper we highlight the emerging practice of Magnetic, Agile, Deep (MAD) data analysis as a radical departure from traditional Enterprise Data Warehouses and Business Intelligence. We present our design philosophy, techniques and

Jeffrey Cohen; Brian Dolan; Mark Dunlap; Joseph M. Hellerstein; Caleb Welton



Comparison of four teaching methods on Evidence-based Practice skills of postgraduate nursing students.  


The aim of this study was to compare four teaching methods on the evidence-based practice knowledge and skills of postgraduate nursing students. Students enrolled in the Evidence-based Nursing (EBN) unit in Australia and Hong Kong in 2010 and 2011 received education via either the standard distance teaching method, computer laboratory teaching method, Evidence-based Practice-Digital Video Disc (EBP-DVD) teaching method or the didactic classroom teaching method. Evidence-based Practice (EBP) knowledge and skills were evaluated using student assignments that comprised validated instruments. One-way analysis of covariance was implemented to assess group differences on outcomes after controlling for the effects of age and grade point average (GPA). Data were obtained from 187 students. The crude mean score among students receiving the standard+DVD method of instruction was higher for developing a precise clinical question (8.1±0.8) and identifying the level of evidence (4.6±0.7) compared to those receiving other teaching methods. These differences were statistically significant after controlling for age and grade point average. Significant improvement in cognitive and technical EBP skills can be achieved for postgraduate nursing students by integrating a DVD as part of the EBP teaching resources. The EBP-DVD is an easy teaching method to improve student learning outcomes and ensure that external students receive equivalent and quality learning experiences. PMID:23107585

Fernandez, Ritin S; Tran, Duong Thuy; Ramjan, Lucie; Ho, Carey; Gill, Betty



Practical semen analysis: from A to Z.  


Accurate semen analysis is critical for decisions about patient care, as well as for studies addressing overall changes in semen quality, contraceptive efficacy and effects of toxicant exposure. The standardization of semen analysis is very difficult for many reasons, including the use of subjective techniques with no standards for comparison, poor technician training, problems with proficiency testing and a reluctance to change techniques. The World Health Organization (WHO) Semen handbook (2010) offers a vastly improved set of standardized procedures, all at a level of detail that will preclude most misinterpretations. However, there is a limit to what can be learned from words and pictures alone. A WHO-produced DVD that offers complete demonstrations of each technique along with quality assurance standards for motility, morphology and concentration assessments would enhance the effectiveness of the manual. However, neither the manual nor a DVD will help unless there is general acknowledgement of the critical need to standardize techniques and rigorously pursue quality control to ensure that laboratories actually perform techniques 'according to WHO' instead of merely reporting that they have done so. Unless improvements are made, patient results will continue to be compromised and comparison between studies and laboratories will have limited merit. PMID:20111076

Brazil, Charlene



Methods of running gait analysis.  


The continued increase in running popularity has led to a subsequent increase in the need to assess running gait more easily and affordably. Although traditional measurement devices such as motion capture systems, force plates, and electromyography are adequate methods of gait analysis, they suffer from several limitations, such as expense and lack of portability. Recent technological advances have made available more viable options such as accelerometers, electrogoniometers, gyroscopes, and in-shoe pressure sensors. These sensors are being used more commonly to acquire the same information as the more traditional systems, without the associated limitations. Combined with wireless technology and/or data loggers, they provide an affordable, lightweight alternative to gait analysis, allowing data collection over prolonged periods of time in almost any environment. This article will review the current technologies used in the analysis of running gait, with a focus upon the latest developments and equipment. PMID:19436169

Higginson, Brian K



SAR/QSAR methods in public health practice  

SciTech Connect

Methods of (Quantitative) Structure-Activity Relationship ((Q)SAR) modeling play an important and active role in ATSDR programs in support of the Agency mission to protect human populations from exposure to environmental contaminants. They are used for cross-chemical extrapolation to complement the traditional toxicological approach when chemical-specific information is unavailable. SAR and QSAR methods are used to investigate adverse health effects and exposure levels, bioavailability, and pharmacokinetic properties of hazardous chemical compounds. They are applied as a part of an integrated systematic approach in the development of Health Guidance Values (HGVs), such as ATSDR Minimal Risk Levels, which are used to protect populations exposed to toxic chemicals at hazardous waste sites. (Q)SAR analyses are incorporated into ATSDR documents (such as the toxicological profiles and chemical-specific health consultations) to support environmental health assessments, prioritization of environmental chemical hazards, and to improve study design, when filling the priority data needs (PDNs) as mandated by Congress, in instances when experimental information is insufficient. These cases are illustrated by several examples, which explain how ATSDR applies (Q)SAR methods in public health practice.

Demchuk, Eugene; Ruiz, Patricia; Chou, Selene; Fowler, Bruce A.



Flow methods in chiral analysis.  


The methods used for the separation and analytical determination of individual isomers are based on interactions with substances exhibiting optical activity. The currently used methods for the analysis of optically active compounds are primarily high-performance separation methods, such as gas and liquid chromatography using chiral stationary phases or chiral selectors in the mobile phase, and highly efficient electromigration techniques, such as capillary electrophoresis using chiral selectors. Chemical sensors and biosensors may also be designed for the analysis of optically active compounds. As enantiomers of the same compound are characterised by almost identical physico-chemical properties, their differentiation/separation in one-step unit operation in steady-state or dynamic flow systems requires the use of highly effective chiral selectors. Examples of such determinations are reviewed in this paper, based on 105 references. The greatest successes for isomer determination involve immunochemical interactions, enantioselectivity of the enzymatic biocatalytic processes, and interactions with ion-channel receptors or molecularly imprinted polymers. Conducting such processes under dynamic flow conditions may significantly enhance the differences in the kinetics of such processes, leading to greater differences in the signals recorded for enantiomers. Such determinations in flow conditions are effectively performed using surface-plasmon resonance and piezoelectric detections, as well as using common spectroscopic and electrochemical detections. PMID:24139575

Trojanowicz, Marek; Kaniewska, Marzena



Voltammetric analysis apparatus and method  


An apparatus and method is described for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced apart evenly from each other and the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

Almon, A.C.



Voltammetric analysis apparatus and method


An apparatus and method for electrochemical analysis of elements in solution. An auxiliary electrode 14, a reference electrode 18, and five working electrodes 20, 22, 26, 28, and 30 are positioned in a container 12 containing a sample solution 34. The working electrodes are spaced apart evenly from each other and auxiliary electrode 14 to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between auxiliary electrode 14 and each of the working electrodes 20, 22, 26, 28, and 30. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in sample solution 34 and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

Almon, Amy C. (410 Waverly Dr., Augusta, GA 30909)



Practical evaluation of Mung bean seed pasteurization method in Japan.  


The majority of the seed sprout-related outbreaks have been associated with Escherichia coli O157:H7 and Salmonella. Therefore, an effective method for inactivating these organisms on the seeds before sprouting is needed. The current pasteurization method for mung beans in Japan (hot water treatment at 85 degrees C for 10 s) was more effective for disinfecting inoculated E. coli O157:H7, Salmonella, and nonpathogenic E. coli on mung bean seeds than was the calcium hypochlorite treatment (20,000 ppm for 20 min) recommended by the U.S. Food and Drug Administration. Hot water treatment at 85 degrees C for 40 s followed by dipping in cold water for 30 s and soaking in chlorine water (2,000 ppm) for 2 h reduced the pathogens to undetectable levels, and no viable pathogens were found in a 25-g enrichment culture and during the sprouting process. Practical tests using a working pasteurization machine with nonpathogenic E. coli as a surrogate produced similar results. The harvest yield of the treated seed was within the acceptable range. These treatments could be a viable alternative to the presently recommended 20,000-ppm chlorine treatment for mung bean seeds. PMID:20377967

Bari, M L; Enomoto, K; Nei, D; Kawamoto, S



Key steps in the strategic analysis of a dental practice.  


As dentistry is becoming increasingly competitive, dentists must focus more on strategic analysis. This paper lays out seven initial steps that are the foundation of strategic analysis. It introduces and describes the use of service-customer matrices and location-proximity maps as tools in competitive positioning. The paper also contains a brief overview of the role of differentiation and cost-control in determining key success factors for dental practices. PMID:11066715

Armstrong, J L; Boardman, A E; Vining, A R



A Deliberate Practice Approach to Teaching Phylogenetic Analysis  

ERIC Educational Resources Information Center

One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we…

Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.



Practicing oncology in provincial Mexico: A narrative analysis  

Microsoft Academic Search

This paper examines the discourse of oncologists treating cancer in a provincial capital of southern Mexico. Based on an analysis of both formal interviews and observations of everyday clinical practice, it examines a set of narrative themes they used to maintain a sense of professionalism and possibility as they endeavored to apply a highly technologically dependent biomedical model in a

Linda M. Hunt



Reconstructing northern Chinese Neolithic subsistence practices by isotopic analysis  

E-print Network

… high carbon isotope ratios (δ13C = −7.7 ± 0.4‰) and low nitrogen isotope ratios (δ15N = 7.5 ± 0…) … Ekaterina A. Pechenkina, Queens College of the City University of New York, Kissena Boulevard, Flushing, NY 11367, USA; New York

Pechenkina, Ekaterina


Analysis of Practice Interviews of Medical Students with Elderly Persons.  

ERIC Educational Resources Information Center

The process of interaction between students and well elderly patients in videotaped practice interviews was examined using a modification of Bales Interaction Process Analysis. The results indicate that time should be spent on demonstrating to students alternatives for responding to elderly patients and on stressing processes for using empathy.…

Prendergast, Christine; And Others



An epidemiological method applied to practices to measure the representativeness of their prescribing characteristics  

Microsoft Academic Search

The standardised report of the Prescription Pricing Authority, which is concerned with the prescribing characteristics of practices, was used as an epidemiological tool to evaluate the prescribing representativeness of practices. Study practices were compared with average prescribing results from family practitioner committees, which are specific for the geographical district and month sampled. The method was applied in 40 practices, representing

D M Fleming



Computational methods for global/local analysis  

NASA Technical Reports Server (NTRS)

Computational methods for global/local analysis of structures, including both uncoupled and coupled methods, are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.



Stated preference analysis of travel choices: the state of practice  

Microsoft Academic Search

Stated preference (SP) methods are widely used in travel behaviour research and practice to identify behavioural responses to choice situations which are not revealed in the market, and where the attribute levels offered by existing choices are modified to such an extent that the reliability of revealed preference models as predictors of response is brought into question. This paper reviews

David A. Hensher



Hybrid methods for rotordynamic analysis  

NASA Technical Reports Server (NTRS)

Effective procedures are presented for the response analysis of the Space Shuttle Main Engine turbopumps under transient loading conditions. Of particular concern is the determination of the nonlinear response of the systems to rotor imbalance in the presence of bearing clearances. The proposed procedures take advantage of the nonlinearities involved being localized at only a few rotor/housing coupling joints. The methods include those based on integral formulations for the incremental solutions involving the transition matrices of the rotor and housing. Alternatively, a convolutional representation of the housing displacements at the coupling points is proposed which would allow performing the transient analysis on a reduced model of the housing. The integral approach is applied to small dynamical models to demonstrate the efficiency of the approach. For purposes of assessing the numerical integration results for the nonlinear rotor/housing systems, a numerical harmonic balance procedure is developed to enable determining all possible harmonic, subharmonic, and nonperiodic solutions of the systems. A brief account of the Fourier approach is presented as applied to a two-degree-of-freedom rotor-support system.
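The abstract's harmonic balance procedure targets rotor/housing systems with clearances, which is beyond a short example. As a stand-in, the sketch below applies a one-term harmonic balance to a Duffing-type oscillator, the simplest nonlinear benchmark; the equation, parameter names, and values are illustrative only, not the paper's formulation.

```python
def duffing_hb_amplitudes(omega0, zeta, eps, F, Omega, a_max=10.0, n=100000):
    """One-term harmonic balance for
        x'' + 2*zeta*omega0*x' + omega0**2*x + eps*x**3 = F*cos(Omega*t).
    Assuming x(t) ~ A*cos(Omega*t + phi) and balancing the fundamental
    cos/sin components gives the amplitude equation
        A**2 * [(omega0**2 - Omega**2 + 0.75*eps*A**2)**2
                + (2*zeta*omega0*Omega)**2] = F**2.
    Returns all positive roots A, found by grid scanning plus bisection,
    so coexisting solution branches of a hardening system show up too."""
    def g(a):
        return a*a*((omega0**2 - Omega**2 + 0.75*eps*a*a)**2
                    + (2*zeta*omega0*Omega)**2) - F*F

    roots = []
    prev_a, prev_g = 0.0, g(0.0)
    for i in range(1, n + 1):
        a = a_max * i / n
        ga = g(a)
        if prev_g * ga < 0:            # sign change brackets a root
            lo, hi = prev_a, a
            for _ in range(60):        # bisection refinement
                mid = 0.5 * (lo + hi)
                if g(lo) * g(mid) <= 0:
                    hi = mid
                else:
                    lo = mid
            roots.append(0.5 * (lo + hi))
        prev_a, prev_g = a, ga
    return roots
```

With eps = 0 the amplitude equation reduces to the familiar linear frequency-response magnitude, which makes the routine easy to sanity-check.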

Noah, Sherif T.



Meta-analysis: Historical Origins and Contemporary Practice.  

ERIC Educational Resources Information Center

The early and recent history of meta-analysis is outlined. After providing a definition of meta-analysis and listing its major characteristics, developments in statistics and research are described that influenced the formulation of modern meta-analytic methods. Major meta-analytic methods currently in use are described. Statistical and other…

Kulik, James A.; Kulik, Chen-Lin C.


A practical and sensitive method of quantitating lymphangiogenesis in vivo.  


To address the inadequacy of current assays, we developed a directed in vivo lymphangiogenesis assay (DIVLA) by modifying an established directed in vivo angiogenesis assay. Silicon tubes (angioreactors) were implanted in the dorsal flanks of nude mice. Tubes contained either growth factor-reduced basement membrane extract (BME) alone (negative control) or BME-containing vascular endothelial growth factor (VEGF)-D (positive control for lymphangiogenesis) or FGF-2/VEGF-A (positive control for angiogenesis) or a high VEGF-D-expressing breast cancer cell line MDA-MB-468LN (468LN), or VEGF-D-silenced 468LN. Lymphangiogenesis was detected superficially with Evans Blue dye tracing and measured in the cellular contents of angioreactors by multiple approaches: lymphatic vessel endothelial hyaluronan receptor-1 (Lyve1) protein (immunofluorescence) and mRNA (qPCR) expression and a visual scoring of lymphatic vs blood capillaries with dual Lyve1 (or Prox-1 or Podoplanin)/Cd31 immunostaining in cryosections. Lymphangiogenesis was absent with BME, high with VEGF-D or VEGF-D-producing 468LN cells and low with VEGF-D-silenced 468LN. Angiogenesis was absent with BME, high with FGF-2/VEGF-A, moderate with 468LN or VEGF-D and low with VEGF-D-silenced 468LN. The method was reproduced in a syngeneic murine C3L5 tumor model in C3H/HeJ mice with dual Lyve1/Cd31 immunostaining. Thus, DIVLA presents a practical and sensitive assay of lymphangiogenesis, validated with multiple approaches and markers. It is highly suited to identifying pro- and anti-lymphangiogenic agents, as well as shared or distinct mechanisms regulating lymphangiogenesis vs angiogenesis, and is widely applicable to research in vascular/tumor biology. PMID:23711825

Majumder, Mousumi; Xin, Xiping; Lala, Peeyush K



Medical audit cycle. A review of methods and research in clinical practice.  


The aim of medical audit is to improve the effectiveness and efficiency of medical care. Achieving this aim may involve a cycle of activities: (i) observing practice; (ii) setting a standard of practice; (iii) comparing the observed practice with the standard; (iv) implementing change; and (v) re-observing practice. This paper reviews those methods which have been used and tested in clinical practice at the various stages of the cycle. Some principles involved in the choice of what to audit are also described. As definite cost-effective methods of auditing have not yet emerged, further initiatives and experimentation are required in clinical practice. PMID:6750333

Fowkes, F G



Introduction to Network Analysis 1 Multiscale Methods  

E-print Network

In many complex systems, a big scale gap can exist between models and/or laws at different scales (biological, mathematical, etc.). Computational multiscale methods perform functional analysis at multiple resolutions.

Safro, Ilya


The practical implementation of integrated safety management for nuclear safety analysis and fire hazards analysis documentation  

SciTech Connect

In 1995 Mr. Joseph DiNunno of the Defense Nuclear Facilities Safety Board issued an approach to describe the concept of an integrated safety management program which incorporates hazard and safety analysis to address a multitude of hazards affecting the public, worker, property, and the environment. Since then the U.S. Department of Energy (DOE) has adopted a policy to systematically integrate safety into management and work practices at all levels so that missions can be completed while protecting the public, worker, and the environment. While the DOE and its contractors possessed a variety of processes for analyzing fire hazards at a facility, activity, and job, the outcome and assumptions of these processes have not always been consistent for similar types of hazards within the safety analysis and the fire hazard analysis. Although the safety analysis and the fire hazard analysis are driven by different DOE Orders and requirements, these analyses should not be entirely independent and their preparation should be integrated to ensure consistency of assumptions, consequences, design considerations, and other controls. Under the DOE policy to implement an integrated safety management system, identified hazards must be evaluated and agreed upon to ensure that the public, the workers, and the environment are protected from adverse consequences. The DOE program and contractor management need a uniform, up-to-date reference with which to plan, budget, and manage nuclear programs. It is crucial that DOE understand the hazards and risks necessary to authorize the work to be performed. If integrated safety management is not incorporated into the preparation of the safety analysis and the fire hazard analysis, inconsistencies between assumptions, consequences, design considerations, and controls may occur that affect safety. Furthermore, confusion created by inconsistencies may occur in the DOE process to grant authorization of the work.
In accordance with the integrated safety management system approach of having a uniform and consistent process, a method has been suggested by the U.S. Department of Energy at Richland and in the Project Hanford Procedures for cases when fire hazard analyses and safety analyses are required. This process provides a common-basis approach to the development of the fire hazard analysis and the safety analysis, and permits the preparers of both documents to jointly participate in the development of the hazard analysis process. This paper presents this method to implement the integrated safety management approach in the development of the fire hazard analysis and safety analysis, providing the consistency of assumptions, consequences, design considerations, and other controls necessary to protect workers, the public, and the environment.




Practical Aspects of the Equation-Error Method for Aircraft Parameter Estimation  

NASA Technical Reports Server (NTRS)

Various practical aspects of the equation-error approach to aircraft parameter estimation were examined. The analysis was based on simulated flight data from an F-16 nonlinear simulation, with realistic noise sequences added to the computed aircraft responses. This approach exposes issues related to the parameter estimation techniques and results, because the true parameter values are known for simulation data. The issues studied include differentiating noisy time series, maximum likelihood parameter estimation, biases in equation-error parameter estimates, accurate computation of estimated parameter error bounds, comparisons of equation-error parameter estimates with output-error parameter estimates, analyzing data from multiple maneuvers, data collinearity, and frequency-domain methods.
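The equation-error approach described above treats the state equation itself as a linear regression, with a (numerically differentiated) state rate as the dependent variable. The following minimal sketch illustrates the idea on an invented first-order model xdot = a·x + b·u; the model, signals, and parameter values are purely illustrative, not the F-16 setup of the paper.

```python
import math

def equation_error_fit(xdot, x, u):
    """Least-squares estimate of (a, b) in the model xdot = a*x + b*u.
    The state equation is treated as a linear regression with the
    measured/differentiated state rate as the dependent variable;
    the 2x2 normal equations are solved directly."""
    sxx = sum(xi*xi for xi in x)
    sxu = sum(xi*ui for xi, ui in zip(x, u))
    suu = sum(ui*ui for ui in u)
    sxd = sum(xi*di for xi, di in zip(x, xdot))
    sud = sum(ui*di for ui, di in zip(u, xdot))
    det = sxx*suu - sxu*sxu
    a = (sxd*suu - sud*sxu) / det
    b = (sxx*sud - sxu*sxd) / det
    return a, b

# simulate xdot = -0.7*x + 2.0*u with a two-tone input (Euler integration)
dt, n = 0.01, 2000
x, xs, us = 0.0, [], []
for k in range(n):
    uk = math.sin(0.05*k) + 0.5*math.cos(0.013*k)
    xd = -0.7*x + 2.0*uk
    xs.append(x); us.append(uk)
    x += xd*dt

# central-difference rates (interior points): the noise-sensitive step
xdots = [(xs[k+1] - xs[k-1])/(2*dt) for k in range(1, n-1)]
a_hat, b_hat = equation_error_fit(xdots, xs[1:n-1], us[1:n-1])
```

With noise added to xs before differentiation, the same script exhibits the bias in equation-error estimates that the paper analyzes.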

Morelli, Eugene A.



Coal Field Fire Fighting - Practiced methods, strategies and tactics  

NASA Astrophysics Data System (ADS)

Subsurface coal fires destroy millions of tons of coal each year, have an immense impact on the ecological surroundings and threaten further coal reservoirs. Due to the enormous dimensions a coal seam fire can develop, high operational expenses are incurred. As part of the Sino-German coal fire research initiative "Innovative technologies for exploration, extinction and monitoring of coal fires in Northern China" the research team of the University of Wuppertal (BUW) focuses on fire extinction strategies and tactics as well as aspects of environmental and health safety. Besides the choice and the correct application of different extinction techniques, further factors are essential for successful extinction: appropriate tactics, well trained and protected personnel, and the choice of the best fitting extinguishing agents. The chosen strategy for an extinction campaign is generally determined by urgency and importance. It may depend on national objectives and concepts of coal conservation; on environmental protection (e.g. commitments to greenhouse gas (GHG) reductions); on national funding and resources for fire fighting (e.g. personnel, infrastructure, vehicles, water pipelines); and on computer-aided models and simulations of coal fire development from self-ignition to extinction. In order to devise an optimal fire fighting strategy, "aims of protection" have to be defined in a first step. These may be: - directly affected coal seams; - neighboring seams and coalfields; - GHG emissions into the atmosphere; - returns on investments (costs of fire fighting compared to value of saved coal). In a further step, it is imperative to decide whether the budget shall define the results, or the results define the budget; i.e.
whether there are fixed objectives for the mission that will dictate the overall budget, or whether the limited resources available shall set the scope within which the best possible results shall be achieved. Effective and efficient fire fighting requires optimal tactics, which can be divided into four fundamental approaches to controlling fire hazards: - Defense (digging away the coal so that it cannot begin to burn, or forming a barrier so that the fire cannot reach the unburned coal); - Rescue the coal (mining of a seam that is not yet burning); - Attack (active and direct cooling of the burning seam); - Retreat (monitoring only, until self-extinction of a burning seam). The last is used when a fire exceeds the organizational and/or technical scope of a mission. In other words, "to control a coal fire" does not automatically and in all situations mean "to extinguish a coal fire". Best-practice tactics, or a combination of them, can be selected for control of a particular coal fire. For the extinguishing works, different extinguishing agents are available. They can be applied by different application techniques and with varying operating expenses. One application method may be the drilling of boreholes from the surface, or covering the surface with low-permeability soils. The mainly used extinguishing agents for coal field fires are as follows: water (with or without additives), slurry, foaming mud/slurry, inert gases, dry chemicals and materials, and cryogenic agents. Because of its tremendous dimension and its complexity, the worldwide challenge of coal fires is absolutely unique - it can only be solved with functional application methods, best fitting strategies and tactics, organisation and research, as well as the dedication of the involved fire fighters, who work under extreme individual risks on the burning coal fields.

Wündrich, T.; Korten, A. A.; Barth, U. H.



Neuroscience and the Feldenkrais Method: evidence in research and clinical practice  

E-print Network

Some say evidence-based practice stifles the creative therapies and learning modalities. The Feldenkrais Method draws on principles of exploratory practice rather than prescribed exercises and can work at different

Hickman, Mark


Learning by the Case Method: Practical Approaches for Community Leaders.  

ERIC Educational Resources Information Center

This supplement to Volunteer Training and Development: A Manual for Community Groups, provides practical guidance in the selection, writing, and adaptation of effective case materials for specific educational objectives, and develops suitable cases for use by analyzing concrete situations and by offering illustrations of various types. An…

Stenzel, Anne K.; Feeney, Helen M.


Parallel Processable Cryptographic Methods with Unbounded Practical Security.  

ERIC Educational Resources Information Center

Addressing the problem of protecting confidential information and data stored in computer databases from access by unauthorized parties, this paper details coding schemes which present such astronomical work factors to potential code breakers that security breaches are hopeless in any practical sense. Two procedures which can be used to encode for…

Rothstein, Jerome


Efficient methods and practical guidelines for simulating isotope effects.  


The shift in chemical equilibria due to isotope substitution is frequently exploited to obtain insight into a wide variety of chemical and physical processes. It is a purely quantum mechanical effect, which can be computed exactly using simulations based on the path integral formalism. Here we discuss how these techniques can be made dramatically more efficient, and how they ultimately outperform quasi-harmonic approximations to treat quantum liquids not only in terms of accuracy, but also in terms of computational cost. To achieve this goal we introduce path integral quantum mechanics estimators based on free energy perturbation, which enable the evaluation of isotope effects using only a single path integral molecular dynamics trajectory of the naturally abundant isotope. We use as an example the calculation of the free energy change associated with H/D and (16)O/(18)O substitutions in liquid water, and of the fractionation of those isotopes between the liquid and the vapor phase. In doing so, we demonstrate and discuss quantitatively the relative benefits of each approach, thereby providing a set of guidelines that should facilitate the choice of the most appropriate method in different, commonly encountered scenarios. The efficiency of the estimators we introduce and the analysis that we perform should in particular facilitate accurate ab initio calculation of isotope effects in condensed phase systems. PMID:23298033
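The free energy perturbation idea referenced above rests on the Zwanzig estimator ΔA = −kT ln⟨exp(−βΔU)⟩. The paper evaluates this on path-integral trajectories; the sketch below shows only the statistical machinery on a deliberately simple classical toy (perturbing the stiffness of a 1D harmonic well), where the exact answer is known.

```python
import math, random

def fep_delta_A(dU, beta):
    """Zwanzig free-energy-perturbation estimator:
        dA = -(1/beta) * ln < exp(-beta * dU) >_A,
    where dU = U_B - U_A is evaluated on configurations sampled from
    state A. A log-sum-exp shift keeps the average numerically stable."""
    n = len(dU)
    m = max(-beta*d for d in dU)
    s = sum(math.exp(-beta*d - m) for d in dU)
    return -(m + math.log(s / n)) / beta

# toy check: perturb a 1D harmonic well k_A = 1.0 -> k_B = 1.5 at beta = 1.
# Exact classical result: dA = (1/(2*beta)) * ln(k_B/k_A).
random.seed(0)
beta, kA, kB = 1.0, 1.0, 1.5
samples = [random.gauss(0.0, 1.0/math.sqrt(beta*kA)) for _ in range(200000)]
dU = [0.5*(kB - kA)*x*x for x in samples]
dA = fep_delta_A(dU, beta)
```

In the isotope-effect setting, dU would be the change in path-integral potential energy on substituting one mass for another along a single trajectory of the abundant isotope, as the paper describes.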

Ceriotti, Michele; Markland, Thomas E



Practice patterns in FNA technique: A survey analysis  

PubMed Central

AIM: To ascertain fine needle aspiration (FNA) techniques by endosonographers with varying levels of experience and environments. METHODS: A survey study was performed on United States-based endosonographers. The subjects completed an anonymous online electronic survey. The main outcome measurements were differences in needle choice, FNA technique, and clinical decision making among endosonographers and how this relates to years in practice, volume of EUS-FNA procedures, and practice environment. RESULTS: A total of 210 (30.8%) endosonographers completed the survey. Just over half (51.4%) identified themselves as academic/university-based practitioners. The vast majority of respondents identified themselves as high-volume endoscopic ultrasound (EUS) (> 150 EUS/year) performers (77.1%) and high-volume FNA (> 75 FNA/year) performers (73.3%). If final cytology is non-diagnostic, high-volume EUS physicians were more likely than low-volume physicians to repeat FNA with a core needle (60.5% vs 31.2%; P = 0.0004), and low-volume physicians were more likely to refer patients for either surgical or percutaneous biopsy (33.4% vs 4.9%, P < 0.0001). Academic physicians were more likely to repeat FNA with a core needle (66.7%) compared to community physicians (40.2%, P < 0.001). CONCLUSION: There is significant variation in EUS-FNA practices among United States endosonographers. Differences appear to be related to EUS volume and practice environment. PMID:25324922
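Group comparisons like the 60.5% vs 31.2% repeat-FNA figures above are conventionally tested with a pooled two-proportion z-test. The sketch below shows that test; the counts used in the usage note are purely illustrative, since the abstract does not give per-group denominators.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with a pooled variance estimate.
    Returns (z, p_value). x1/n1 and x2/n2 are successes/trials in the
    two groups being compared."""
    p1, p2 = x1/n1, x2/n2
    p = (x1 + x2) / (n1 + n2)                  # pooled proportion
    se = math.sqrt(p*(1 - p)*(1/n1 + 1/n2))
    z = (p1 - p2) / se
    pval = math.erfc(abs(z) / math.sqrt(2))    # 2 * (1 - Phi(|z|))
    return z, pval
```

For example, hypothetical counts of 60/100 vs 31/100 reproduce a highly significant difference of roughly the size reported in the abstract.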

DiMaio, Christopher J; Buscaglia, Jonathan M; Gross, Seth A; Aslanian, Harry R; Goodman, Adam J; Ho, Sammy; Kim, Michelle K; Pais, Shireen; Schnoll-Sussman, Felice; Sethi, Amrita; Siddiqui, Uzma D; Robbins, David H; Adler, Douglas G; Nagula, Satish



Honesty in critically reflective essays: an analysis of student practice.  


In health professional education, reflective practice is seen as a potential means for self-improvement from everyday clinical encounters. This study aims to examine the level of student honesty in critical reflection, and barriers and facilitators for students engaging in honest reflection. Third-year physiotherapy students, completing summative reflective essays on clinical encounters using the modified Gibbs cycle, were invited to participate in an anonymous online survey. Student knowledge and beliefs about reflective practice, and disclosure of the truthfulness of their reflections, were assessed using a mixed-methods approach. A total of 34 students, from a maximum possible of 48 (71%), participated in the study activities. A total of 68% stated that they were at least 80% truthful about their experiences. There was general student consensus that reflective practice was important for their growth as a clinician. Students questioned the belief that the reflection needed to be based on a factual experience. Reflective practice can be a valuable addition to the clinical education of health care professionals, although this value can be diminished through dishonest reflections if it is not carefully implemented. Student influences on honest reflection include: (1) the design of any assessment criteria, and (2) student knowledge and competency in applying critical reflection. PMID:22926807

Maloney, Stephen; Tai, Joanna Hong-Meng; Lo, Kristin; Molloy, Elizabeth; Ilic, Dragan



Polydispersity analysis of Taylor dispersion data: the cumulant method  

E-print Network

Taylor dispersion analysis is an increasingly popular characterization method that measures the diffusion coefficient, and hence the hydrodynamic radius, of (bio)polymers, nanoparticles or even small molecules. In this work, we describe an extension to current data analysis schemes that allows size polydispersity to be quantified for an arbitrary sample, thereby significantly enhancing the potentiality of Taylor dispersion analysis. The method is based on a cumulant development similar to that used for the analysis of dynamic light scattering data. Specific challenges posed by the cumulant analysis of Taylor dispersion data are discussed, and practical ways to address them are proposed. We successfully test this new method by analyzing both simulated and experimental data for solutions of moderately polydisperse polymers and polymer mixtures.
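The cumulant extension above builds on the basic moment analysis of a taylorgram. The sketch below shows only that first-moment/second-moment step, recovering D from the Taylor-Aris relation; it is a moment-style analysis in the spirit of the paper, not the authors' full cumulant estimator, and the capillary radius and signal are synthetic.

```python
import math

def taylorgram_moments(t, s):
    """Mean residence time and temporal variance of a detector signal s(t),
    computed as signal-weighted moments on a uniform time grid."""
    area = sum(s)
    t_mean = sum(ti*si for ti, si in zip(t, s)) / area
    var = sum((ti - t_mean)**2 * si for ti, si in zip(t, s)) / area
    return t_mean, var

def diffusion_coeff(t, s, Rc):
    """Taylor-Aris long-time relation: D = Rc^2 * t_mean / (24 * var),
    with Rc the capillary radius."""
    t_mean, var = taylorgram_moments(t, s)
    return Rc*Rc * t_mean / (24.0 * var)

# synthetic check: Gaussian taylorgram consistent with D = 1e-9 m^2/s
Rc, D_true, t0 = 50e-6, 1e-9, 100.0
sig2 = Rc*Rc*t0 / (24.0*D_true)
ts = [i*0.05 for i in range(4001)]                      # 0..200 s grid
ss = [math.exp(-(ti - t0)**2 / (2.0*sig2)) for ti in ts]
D_est = diffusion_coeff(ts, ss, Rc)
```

A polydisperse sample skews these moments; the paper's cumulant development quantifies exactly that departure from a single-Gaussian taylorgram.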

Luca Cipelletti; Jean-Philippe Biron; Michel Martin; Hervé Cottet



A Practical Method to Determine Influence Surfaces using Commercial Software  

Microsoft Academic Search

Structures are subject to dead load and live load; the latter refers to loads which would act on various possible locations. As such, it is very often necessary for practicing structural engineers to determine how stresses vary due to live loads acting on different locations of the structure. For simple structures and simple loadings, engineers could assign live loads to

J. Kong


A practical introduction to multivariate meta-analysis.  


Multivariate meta-analysis is becoming increasingly popular and official routines or self-programmed functions have been included in many statistical software. In this article, we review the statistical methods and the related software for multivariate meta-analysis. Emphasis is placed on Bayesian methods using Markov chain Monte Carlo, and codes in WinBUGS are provided. The various model-fitting options are illustrated in two examples and specific guidance is provided on how to run a multivariate meta-analysis using various software packages. PMID:22275379

Mavridis, Dimitris; Salanti, Georgia



Standard Practice for Analysis and Interpretation of Light-Water Reactor Surveillance Results, E706(IA)  

E-print Network

1.1 This practice covers the methodology, summarized in Annex A1, to be used in the analysis and interpretation of neutron exposure data obtained from LWR pressure vessel surveillance programs; and, based on the results of that analysis, establishes a formalism to be used to evaluate the present and future condition of the pressure vessel and its support structures (1-70). 1.2 This practice relies on, and ties together, the application of several supporting ASTM standard practices, guides, and methods (see Master Matrix E 706) (1, 5, 13, 48, 49). In order to make this practice at least partially self-contained, a moderate amount of discussion is provided in areas relating to ASTM and other documents. Support subject areas that are discussed include reactor physics calculations, dosimeter selection and analysis, and exposure units. Note 1—(Figure 1 is deleted in the latest update. The user is referred to Master Matrix E 706 for the latest figure of the standards interconnectivity). 1.3 This practice is restri...

American Society for Testing and Materials. Philadelphia



Instrumental Methods of Chemical Analysis  

NSDL National Science Digital Library

This site includes resources for the instrumental analysis class at St Olaf's College. The syllabus, a sample exam, problem sets, a class calendar, and an introduction to the use of role playing in the class are provided.

Walters, John P.



A Practical Guide to Interpretation of Large Collections of Incident Narratives Using the QUORUM Method  

NASA Technical Reports Server (NTRS)

Analysis of incident reports plays an important role in aviation safety. Typically, a narrative description, written by a participant, is a central part of an incident report. Because there are so many reports, and the narratives contain so much detail, it can be difficult to efficiently and effectively recognize patterns among them. Recognizing and addressing recurring problems, however, is vital to continuing safety in commercial aviation operations. A practical way to interpret large collections of incident narratives is to apply the QUORUM method of text analysis, modeling, and relevance ranking. In this paper, QUORUM text analysis and modeling are surveyed, and QUORUM relevance ranking is described in detail with many examples. The examples are based on several large collections of reports from the Aviation Safety Reporting System (ASRS) database, and a collection of news stories describing the disaster of TWA Flight 800, the Boeing 747 which exploded in mid-air and crashed near Long Island, New York, on July 17, 1996. Reader familiarity with this disaster should make the relevance-ranking examples more understandable. The ASRS examples illustrate the practical application of QUORUM relevance ranking.
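To make the idea of relevance ranking over a report collection concrete, the sketch below ranks narratives against a query by cosine similarity of term-frequency vectors. This is a generic bag-of-words stand-in for illustration only; QUORUM's actual contextual (co-occurrence-based) model is richer, and the example narratives are invented.

```python
import math
from collections import Counter

def term_vector(text):
    """Crude term-frequency vector: lowercase, strip trailing punctuation."""
    return Counter(w.lower().strip(".,") for w in text.split())

def cosine(a, b):
    dot = sum(a[t]*b[t] for t in a if t in b)
    na = math.sqrt(sum(v*v for v in a.values()))
    nb = math.sqrt(sum(v*v for v in b.values()))
    return dot/(na*nb) if na and nb else 0.0

def rank_narratives(query, narratives):
    """Return (score, narrative) pairs sorted by decreasing relevance
    to the query - a minimal illustration of relevance ranking, not
    the QUORUM model itself."""
    q = term_vector(query)
    scored = [(cosine(q, term_vector(n)), n) for n in narratives]
    return sorted(scored, key=lambda p: -p[0])
```

Run on a handful of toy narratives, a query like "runway taxi" puts runway/taxi incidents ahead of unrelated reports.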

McGreevy, Michael W.



Estimating free-living human energy expenditure: Practical aspects of the doubly labeled water method and its applications  

PubMed Central

The accuracy and noninvasive nature of the doubly labeled water (DLW) method makes it ideal for the study of human energy metabolism in free-living conditions. However, the DLW method is not always practical in many developing and Asian countries because of the high costs of isotopes and equipment for isotope analysis as well as the expertise required for analysis. This review provides information about the theoretical background and practical aspects of the DLW method, including optimal dose, basic protocols of two- and multiple-point approaches, experimental procedures, and isotopic analysis. We also introduce applications of DLW data, such as determining the equations of estimated energy requirement and validation studies of energy intake. PMID:24944767
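The two-point approach mentioned above reduces to estimating isotope elimination rates from two enrichment measurements. The sketch below shows that step plus a deliberately simplified CO2-production relation; it ignores the isotope-dilution-space and fractionation corrections used in real DLW analyses, and all numbers are illustrative.

```python
import math

def elimination_rate(e1, e2, t1, t2):
    """Two-point isotope elimination rate: k = ln(E1/E2) / (t2 - t1),
    with E the enrichment above baseline at times t1 and t2 (days)."""
    return math.log(e1/e2) / (t2 - t1)

def co2_production_simplified(N, kO, kH):
    """Greatly simplified DLW relation rCO2 ~ (N/2) * (kO - kH):
    deuterium leaves the body only as water, while 18O leaves as both
    water and CO2, so the rate difference times half the body-water
    pool N estimates CO2 output. Real analyses add dilution-space and
    fractionation corrections (e.g. Schoeller-type equations)."""
    return 0.5 * N * (kO - kH)
```

Energy expenditure is then obtained from rCO2 with an assumed food quotient, which is why validation against measured energy intake matters.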

Kazuko, Ishikawa-Takata; Kim, Eunkyung; Kim, Jeonghyun; Yoon, Jinsook



On Practical Results of the Differential Power Analysis  

NASA Astrophysics Data System (ADS)

This paper describes practical differential power analysis attacks. Successful and unsuccessful attack attempts are presented, together with a description of the attack methodology. It provides relevant information about oscilloscope settings, optimization possibilities and fundamental attack principles, which are important when realizing this type of attack. The attack was conducted on the PIC18F2420 microcontroller, using the AES cryptographic algorithm in the ECB mode with the 128-bit key length. We used two implementations of this algorithm - in the C programming language and in the assembler.

Breier, Jakub; Kleja, Marcel



Analysis of flight equipment purchasing practices of representative air carriers  

NASA Technical Reports Server (NTRS)

The process through which representative air carriers decide whether or not to purchase flight equipment was investigated as well as their practices and policies in retiring surplus aircraft. An analysis of the flight equipment investment decision process in ten airlines shows that for the airline industry as a whole, the flight equipment investment decision is in a state of transition from a wholly informal process in earliest years to a much more organized and structured process in the future. Individual air carriers are in different stages with respect to the formality and sophistication associated with the flight equipment investment decision.



Root Cause Analysis: Methods and Mindsets.  

ERIC Educational Resources Information Center

This instructional unit is intended for use in training operations personnel and others involved in scram analysis at nuclear power plants in the techniques of root cause analysis. Four lessons are included. The first lesson provides an overview of the goals and benefits of the root cause analysis method. Root cause analysis techniques are covered…

Kluch, Jacob H.


Investigating the Efficacy of Practical Skill Teaching: A Pilot-Study Comparing Three Educational Methods  

ERIC Educational Resources Information Center

Effective education of practical skills can alter clinician behaviour, positively influence patient outcomes, and reduce the risk of patient harm. This study compares the efficacy of two innovative practical skill teaching methods, against a traditional teaching method. Year three pre-clinical physiotherapy students consented to participate in a…

Maloney, Stephen; Storr, Michael; Paynter, Sophie; Morgan, Prue; Ilic, Dragan



Assessing Student Perception of Practice Evaluation Knowledge in Introductory Research Methods  

ERIC Educational Resources Information Center

The authors explored the use of the Practice Evaluation Knowledge Scale (PEKS) to assess student perception of acquisition and retention of practice evaluation knowledge from an undergraduate research methods class. The authors sampled 2 semesters of undergraduate social work students enrolled in an introductory research methods course.…

Baker, Lisa R.; Pollio, David E.; Hudson, Ashley



Schwarz Preconditioners for Krylov Methods: Theory and Practice  

SciTech Connect

Several numerical methods were produced and analyzed. The main thrust of the work relates to inexact Krylov subspace methods for the solution of linear systems of equations arising from the discretization of partial differential equations. These are iterative methods, i.e., an approximation is obtained at each step. Usually, a matrix-vector product is needed at each iteration. In the inexact methods, this product (or the application of a preconditioner) can be done inexactly. Schwarz methods, based on domain decompositions, are excellent preconditioners for these systems. We contributed towards their understanding from an algebraic point of view, developed new ones, and studied their performance in the inexact setting. We also worked on combinatorial problems to help define the algebraic partition of the domains, with the needed overlap, as well as PDE-constrained optimization using the above-mentioned inexact Krylov subspace methods.
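The combination described above can be illustrated at toy scale: a one-level additive Schwarz preconditioner (overlapping subdomain solves, corrections summed) inside preconditioned conjugate gradients for a 1D Poisson matrix. This is a minimal sketch of the standard construction, not the report's inexact-Krylov machinery; subdomain ranges and sizes are arbitrary.

```python
def thomas(sub, diag, sup, rhs):
    """Thomas algorithm: direct solve of a tridiagonal system."""
    m = len(rhs)
    cp, dp = [0.0]*m, [0.0]*m
    cp[0] = sup[0]/diag[0] if m > 1 else 0.0
    dp[0] = rhs[0]/diag[0]
    for i in range(1, m):
        den = diag[i] - sub[i]*cp[i-1]
        cp[i] = sup[i]/den if i < m - 1 else 0.0
        dp[i] = (rhs[i] - sub[i]*dp[i-1])/den
    x = [0.0]*m
    x[m-1] = dp[m-1]
    for i in range(m-2, -1, -1):
        x[i] = dp[i] - cp[i]*x[i+1]
    return x

def laplacian_apply(x):
    """y = A x for the 1D Poisson matrix A = tridiag(-1, 2, -1)."""
    n = len(x)
    return [2*x[i] - (x[i-1] if i else 0.0) - (x[i+1] if i < n-1 else 0.0)
            for i in range(n)]

def additive_schwarz(r, subdomains):
    """M^{-1} r: solve A restricted to each overlapping index range
    (Dirichlet cut at the block edges), extend by zero, sum corrections."""
    z = [0.0]*len(r)
    for lo, hi in subdomains:              # half-open index ranges
        m = hi - lo
        sub = thomas([-1.0]*m, [2.0]*m, [-1.0]*m, r[lo:hi])
        for i in range(m):
            z[lo+i] += sub[i]
    return z

def pcg(b, subdomains, tol=1e-10, maxit=200):
    """CG on A x = b, preconditioned by one-level additive Schwarz."""
    n = len(b)
    x, r = [0.0]*n, b[:]
    z = additive_schwarz(r, subdomains)
    p, rz = z[:], sum(ri*zi for ri, zi in zip(r, z))
    for it in range(maxit):
        Ap = laplacian_apply(p)
        alpha = rz / sum(pi*ai for pi, ai in zip(p, Ap))
        x = [xi + alpha*pi for xi, pi in zip(x, p)]
        r = [ri - alpha*ai for ri, ai in zip(r, Ap)]
        if max(abs(ri) for ri in r) < tol:
            return x, it + 1
        z = additive_schwarz(r, subdomains)
        rz_new = sum(ri*zi for ri, zi in zip(r, z))
        p = [zi + (rz_new/rz)*pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x, maxit
```

Making the subdomain solves in `additive_schwarz` approximate rather than exact is precisely the "inexact preconditioner application" setting studied in the report.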

Szyld, Daniel B.



Propel: Tools and Methods for Practical Source Code Model Checking  

NASA Technical Reports Server (NTRS)

The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset is called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side-effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.

Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem



Good practices in LIBS analysis: Review and advices  

NASA Astrophysics Data System (ADS)

This paper presents a review on the analytical results obtained by laser-induced breakdown spectroscopy (LIBS). In the first part, results on identification and classification of samples are presented including the risk of misclassification, and in the second part, results on concentration measurement based on calibration are accompanied with significant figures of merit including the concept of accuracy. Both univariate and multivariate approaches are discussed with special emphasis on the methodology, the way of presenting the results and the assessment of the methods. Finally, good practices are proposed for both classification and concentration measurement.
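The univariate side of the calibration discussion above reduces to fitting line intensity against analyte concentration and reporting figures of merit such as the 3-sigma limit of detection. The sketch below shows that workflow on invented calibration data; the concentrations, intensities, and blank noise level are purely hypothetical.

```python
def linfit(x, y):
    """Ordinary least squares for y = slope*x + intercept."""
    n = len(x)
    mx, my = sum(x)/n, sum(y)/n
    sxx = sum((xi - mx)**2 for xi in x)
    slope = sum((xi - mx)*(yi - my) for xi, yi in zip(x, y)) / sxx
    return slope, my - slope*mx

def limit_of_detection(slope, blank_sd):
    """3-sigma limit of detection: LOD = 3 * s_blank / slope, a common
    figure of merit for a univariate calibration curve."""
    return 3.0 * blank_sd / slope

# hypothetical calibration: line intensity (a.u.) vs concentration (wt%)
conc      = [0.0, 0.5, 1.0, 2.0, 4.0]
intensity = [120.0, 620.0, 1130.0, 2110.0, 4150.0]
slope, intercept = linfit(conc, intensity)
lod = limit_of_detection(slope, 15.0)          # assumed blank s.d.
predicted = [(i - intercept)/slope for i in intensity]  # inverse prediction
```

Comparing `predicted` against the nominal concentrations is the starting point for the accuracy assessment the review emphasizes; multivariate approaches generalize the same fit to many spectral channels.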

El Haddad, J.; Canioni, L.; Bousquet, B.



Articulating current service development practices: a qualitative analysis of eleven mental health projects  

PubMed Central

Background The utilisation of good design practices in the development of complex health services is essential to improving quality. Healthcare organisations, however, are often seriously out of step with modern design thinking and practice. As a starting point to encourage the uptake of good design practices, it is important to understand the context of their intended use. This study aims to do that by articulating current health service development practices. Methods Eleven service development projects carried out in a large mental health service were investigated through in-depth interviews with six operation managers. The critical decision method in conjunction with diagrammatic elicitation was used to capture descriptions of these projects. Stage-gate design models were then formed to visually articulate, classify and characterise different service development practices. Results Projects were grouped into three categories according to design process patterns: new service introduction and service integration; service improvement; service closure. Three common design stages (problem exploration, idea generation and solution evaluation) were then compared across the design process patterns. Consistent across projects were a top-down, policy-driven approach to exploration, underexploited idea generation and implementation-based evaluation. Conclusions This study provides insight into where and how good design practices can contribute to the improvement of current service development practices. Specifically, the following suggestions for future service development practices are made: genuine user needs analysis for exploration; divergent thinking and innovative culture for idea generation; and fail-safe evaluation prior to implementation. Better training for managers through partnership working with design experts and researchers could be beneficial. PMID:24438471



Practical Considerations and Alternative Research Methods for Evaluating HR Programs  

Microsoft Academic Search

Program evaluation methods capitalizing on quasi-experimental designs are introduced as useful alternatives for evaluating human resources (HR) programs such as selection systems, training courses, performance measurement systems, and 360 degree feedback. A case study is presented to illustrate the benefits of program evaluation methods for evaluating HR programs. In addition, several post-hoc considerations that often moderate results from HR evaluations

Dale S. Rose; Karen E. Fiore



A Practice-Oriented Course on the Principles of Computation, Programming, and System Design and Analysis  

Microsoft Academic Search

We propose a simple foundation for a practice-oriented undergraduate course that seamlessly links computation theory to principles and methods for high-level computer-based system development and analysis. Starting from the fundamental notion of virtual machine computations, which is phrased for both synchronous and asynchronous systems in terms of Abstract State Machines, the course covers in a uniform way the basics of

Egon Börger



Analysis of the drop weight method  

NASA Astrophysics Data System (ADS)

The drop weight method is an accurate yet simple technique for determining surface tension σ. It relies on dripping a liquid of density ρ at a low flow rate Q̃ from a capillary of radius R into air and measuring the combined volumes of the primary and satellite drops that are formed. The method's origin can be traced to Tate, who postulated that the volume V_ideal of the drop that falls from the capillary should be given by ρgV_ideal = 2πRσ, where g is the gravitational acceleration. Since Tate's law is only an approximation and the actual drop volume V_f < V_ideal, in practice the surface tension of the liquid-air interface is determined from the experimental master curve due to Harkins and Brown (HB). The master curve is a plot of the fraction of the ideal drop volume, Ψ ≡ V_f/V_ideal, as a function of the dimensionless tube radius, Φ ≡ R/V_f^(1/3). Thus, once the actual drop volume V_f, and hence Φ, is known, σ is readily calculated upon determining the value of Ψ from the master curve and noting that σ = ρgV_f/(2πRΨ). Although HB proposed their master curve more than 80 years ago, a sound theoretical foundation for the drop weight method has heretofore been lacking. This weakness is remedied here by determining the dynamics of formation of many drops and their satellites in sequence by solving numerically the recently popularized one-dimensional (1-d) slender-jet equations. Computed solutions of the 1-d equations are shown to be in excellent agreement with HB's master curve when Q̃ is low. Moreover, a new theory of the drop weight method is developed using the computations and dimensional analysis. The latter reveals that there must exist a functional relationship between the parameter Φ, where Φ^-3 is the dimensionless drop volume, and the gravitational Bond number G ≡ ρgR²/σ, the Ohnesorge number Oh ≡ μ/(ρRσ)^(1/2), where μ is the viscosity, and the Weber number We ≡ ρQ̃²/(π²R³σ). When We → 0, the computed results show that Φ depends solely on G.
In this limit, a new correlation is deduced which has the simple functional form G = 3.60Φ^2.81 and is more convenient to use than that of HB. The computed results are also used to show how the original drop weight method can be extended to situations where We is finite and resulting drop volumes are not independent of Oh.
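In the low-Weber-number limit, the correlation G = 3.60Φ^2.81 can be inverted directly for surface tension. A minimal sketch (the function name and the water-drop numbers are illustrative assumptions, not from the paper):

```python
def surface_tension_drop_weight(rho, R, V_f, g=9.81):
    """Estimate surface tension sigma (N/m) from a measured drop volume.

    Uses the low-We correlation G = 3.60 * Phi**2.81, where
    G = rho*g*R**2/sigma (gravitational Bond number) and
    Phi = R / V_f**(1/3) (dimensionless tube radius).

    rho : liquid density (kg/m^3)
    R   : capillary radius (m)
    V_f : measured drop volume, primary plus satellites (m^3)
    """
    Phi = R / V_f ** (1.0 / 3.0)
    G = 3.60 * Phi ** 2.81
    return rho * g * R ** 2 / G
```

A quick consistency check is to pick a surface tension, compute the drop volume the correlation implies, and verify the function recovers the original value.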

Yildirim, Ozgur E.; Xu, Qi; Basaran, Osman A.



[Magnetocardiography in clinical practice: algorithms and technologies for data analysis].  


This methodological work is the first in a series of papers dedicated to a promising modern method of non-invasive diagnostics in cardiology: magnetocardiography. A definition of the magnetocardiography method is given, and the levels of magnetocardiography data analysis as well as the electrophysiological models are described. The most informative biomarkers and technologies for qualitative and quantitative interpretation of current density distribution maps and curves of total current magnitude are presented. A step-by-step algorithm for MCG data analysis is proposed. PMID:22416359

Chaĭkovskiĭ, I; Boĭchak, M; Sosnitskiĭ, V; Miasnikov, G; Rykhlik, E; Sosnitskaia, T; Frolov, Iu; Budnik, V



Method of photon spectral analysis  

Microsoft Academic Search

A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples

R. J. Gehrke; M. H. Putnam; E. W. Killian; R. G. Helmer; R. L. Kynaston; S. G. Goodwin; L. O. Johnson



Moodtrack : practical methods for assembling emotion-driven music  

E-print Network

This thesis presents new methods designed for the deconstruction and reassembly of musical works based on a target emotional contour. Film soundtracks provide an ideal testing ground for organizing music around strict ...

Vercoe, G. Scott



Aircraft accidents : method of analysis  

NASA Technical Reports Server (NTRS)

The revised report includes the chart for the analysis of aircraft accidents, combining consideration of the immediate causes, underlying causes, and results of accidents, as prepared by the special committee, with a number of the definitions clarified. It also includes a brief statement of the organization and work of the special committee and of the Committee on Aircraft Accidents, and statistical tables giving a comparison of the types of accidents and causes of accidents in the military services on the one hand and in civil aviation on the other, together with explanations of some of the important differences noted in these tables.



Imaging Laser Analysis of Building Materials—Practical Examples  

NASA Astrophysics Data System (ADS)

Laser-induced breakdown spectroscopy (LIBS) is a supplement to and extension of standard chemical methods and SEM or micro-RFA applications for the evaluation of building materials. As a laboratory method, LIBS is used to gain color-coded images representing the composition and distribution of characteristic ions and/or the ingress characteristics of damaging substances. To create a depth profile of element concentration, a core has to be taken and split along the core axis. LIBS has been proven able to detect all important elements in concrete, e.g. chlorine, sodium or sulfur, which are responsible for certain degradation mechanisms, and also light elements like lithium or hydrogen. Practical examples are given and a mobile system for on-site measurements is presented.

Wilsch, G.; Schaurich, D.; Wiggenhauser, H.



Grounded action research: a method for understanding IT in practice  

Microsoft Academic Search

This paper shows how the theory development portion of action research can be made more rigorous. The process of theory formulation is an essential part of action research, yet this process is not well understood. A case study demonstrates how units of analysis and techniques from grounded theory can be integrated into the action research cycle in order to add

Richard Baskerville; Jan Pries-Heje



Theory, Method and Practice of Neuroscientific Findings in Science Education  

ERIC Educational Resources Information Center

This report provides an overview of neuroscience research that is applicable for science educators. It first offers a brief analysis of empirical studies in educational neuroscience literature, followed by six science concept learning constructs based on the whole brain theory: gaining an understanding of brain function; pattern recognition and…

Liu, Chia-Ju; Chiang, Wen-Wei



78 FR 69604 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...  

Federal Register 2010, 2011, 2012, 2013

...Manufacturing Practice and Hazard Analysis and Risk- Based Preventive...Manufacturing Practice and Hazard Analysis and Risk- Based Preventive...Center for Food Safety and Applied Nutrition (HFS-300), Food...Manufacturing Practice and Hazard Analysis and Risk-Based...



Accuracy analysis of Stewart platform based on interval analysis method  

NASA Astrophysics Data System (ADS)

A Stewart platform is introduced in the 500 m aperture spherical radio telescope (FAST) as an accuracy-adjustable mechanism for feed receivers. Accuracy analysis is the basis of accuracy design; however, a rapid and effective accuracy analysis method for parallel manipulators is still needed. In order to enhance solution efficiency, an interval analysis method (IA method) is introduced to solve the terminal error bound of the Stewart platform, with a detailed solution path. Taking a terminal pose of the Stewart platform in FAST as an example, the terminal error is solved by the Monte Carlo method (MC method) in 4 980 s, by the stochastic mathematical method (SM method) in 0.078 s, and by the IA method in 2.203 s. Compared with the MC method, the terminal error by the SM method leads to a 20% underestimate, while the IA method can envelop the real error bound of the Stewart platform. This indicates that the IA method outperforms the other two methods by providing quick calculations while enveloping the real error bound. According to the given structural errors of the dimension parameters of the Stewart platform, the IA method gives a maximum position error of 19.91 mm and a maximum orientation error of 0.534°, which suggests that the IA method can be used for the accuracy design of the Stewart platform in FAST. The IA method presented here is a rapid and effective accuracy analysis method for the Stewart platform.
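The enveloping property that distinguishes interval analysis from Monte Carlo sampling can be illustrated on a toy error expression (the Interval class and the expression below are illustrative sketches, not the paper's kinematic model of the Stewart platform):

```python
import random

class Interval:
    """Minimal interval arithmetic: [lo, hi] bounds propagated through +, -, *."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)
    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

def terminal_error(x, y):
    # Toy stand-in for a terminal-pose error expression; works on plain
    # floats or on Intervals, since both support +, -, *.
    return x * x + x * y - y

x = Interval(0.9, 1.1)    # e.g. a structural parameter with a +/-0.1 tolerance
y = Interval(-0.1, 0.1)
box = terminal_error(x, y)  # guaranteed enclosure of every attainable value
```

Interval evaluation may overestimate the true range when a variable appears more than once (the dependency problem), but it never underestimates; that is the enveloping behavior the abstract credits to the IA method, in contrast to the SM method's underestimate.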

Yao, Rui; Zhu, Wenbai; Huang, Peng



Practical method of diffusion-welding steel plate in air  

NASA Technical Reports Server (NTRS)

Method is ideal for critical service requirements where parent metal properties are equaled in notch toughness, stress rupture and other characteristics. Welding technique variations may be used on a variety of materials, such as carbon steels, alloy steels, stainless steels, ceramics, and reactive and refractory materials.

Holko, K. H.; Moore, T. J.



Educational Delivery Methods to Encourage Adoption of Sustainable Agricultural Practices.  

ERIC Educational Resources Information Center

Comparison of 143 farmers who attended sustainable agriculture conferences (76% response) with 143 controls (57% response) found no significant differences between the 2 groups, suggesting a need to change delivery methods for extension programming. Chemical dealers were the top source of information for both groups. (SK)

Gamon, Julia; And Others



Theoretical and practical aspects of singularity and eigenmode expansion methods  

Microsoft Academic Search

The singularity and eigenmode expansion methods which can identify a flying or stationary target from a transient field scattered by a target are analyzed. The basic starting points of the engineering problems solved by mathematical techniques involving computations of the Green's functions in diffraction and potential scattering theory are discussed along with the inverse problem of scattering star-like bodies which

A. G. Ramm



General practice fundholding: observations on prescribing patterns and costs using the defined daily dose method  

Microsoft Academic Search

OBJECTIVE--To compare prescribing patterns between a group of fundholding practices and a group of non-fundholding practices in north east Scotland using a method which provides more accurate statements about volumes prescribed than standard NHS statistics. DESIGN--The pharmacy practice division of the National Health Service in Scotland provided data for selected British National Formulary sections over two years. Each prescription issued

M Maxwell; D Heaney; J G Howie; S Noble



[Methods for detection of biofilm formation in routine microbiological practice].  


The increasing use of catheters, artificial implants and antimicrobials as well as high numbers of immunocompromised patients are major causes for concern over biofilm infections. These infections are characterized particularly by high resistance to antimicrobials and formation of persistent foci that may complicate therapy. Therefore, detection of biofilm formation is of high relevance to the clinician and his/her approach to the treatment. Reliable and sensitive methods for detection of this pathogenicity factor in clinically important organisms, suitable for use in routine microbiological laboratories, are needed for this purpose. Currently, a wide array of techniques are available for detection of this virulence factor, such as biofilm visualization by microscopy, culture detection, detection of particular components, detection of physical and chemical differences between biofilm-positive organisms and their planktonic forms and detection of genes responsible for biofilm formation. Since each of these methods has limitations, the best results can be achieved by combining different approaches. PMID:16528896

Růžička, F; Holá, V; Votava, M



A Practical Method for the Synthesis of 2-Alkynylpropenals  

PubMed Central

A general method for the preparation of 2-alkynyl acroleins is described beginning with vinyl iodide 5 and involving a combination of Sonogashira coupling and Dess-Martin oxidation. Critical to the success of this approach is the use of a special workup procedure for the oxidation step. The resultant enynals participate in a variety of addition reactions including aldol condensations and reactions with organolithium compounds. PMID:15760233

Thongsornkleeb, Charnsak; Danheiser, Rick L.



Triangle area water supply monitoring project, October 1988 through September 2001, North Carolina -- description of the water-quality network, sampling and analysis methods, and quality-assurance practices  

USGS Publications Warehouse

The Triangle Area Water Supply Monitoring Project was initiated in October 1988 to provide long-term water-quality data for six area water-supply reservoirs and their tributaries. In addition, the project provides data that can be used to determine the effectiveness of large-scale changes in water-resource management practices, document differences in water quality among water-supply types (large multiuse reservoir, small reservoir, run-of-river), and supply tributary-loading and in-lake data for water-quality modeling of Falls and Jordan Lakes. By September 2001, the project had progressed in four phases and included as many as 34 sites (in 1991). Most sites were sampled and analyzed by the U.S. Geological Survey. Some sites were already part of the North Carolina Division of Water Quality statewide ambient water-quality monitoring network and were sampled by the Division of Water Quality. The network has provided data on streamflow, physical properties, and concentrations of nutrients, major ions, metals, trace elements, chlorophyll, total organic carbon, suspended sediment, and selected synthetic organic compounds. Project quality-assurance activities include written procedures for sample collection, record management and archiving, collection of field quality-control samples (blank samples and replicate samples), and monitoring the quality of field supplies. In addition to project quality-assurance activities, the quality of laboratory analyses was assessed through laboratory quality-assurance practices and an independent laboratory quality-control assessment provided by the U.S. Geological Survey Branch of Quality Systems through the Blind Inorganic Sample Project and the Organic Blind Sample Project.

Oblinger, Carolyn J.



A Function Analysis Model Based on Granular Computing and Practical Example in Product Data Management  

NASA Astrophysics Data System (ADS)

An effective product data management (PDM) platform is a critical environment for enhancing the competitiveness of enterprises. In order to quickly establish a customer-specified PDM platform for manufacturing enterprises, the guidance of effective methods is needed. This paper proposes a function analysis model based on granular computing to guide the establishment of a PDM platform. The model describes the dynamic mapping design process through which the solution of a design problem is obtained. To illustrate the model, a practical example of PDM in an auto parts manufacturing company is provided.

Chi-lan, Cai; Yue-wei, Bai; Yan-chun, Xia; Xiao-gang, Wang; Kai, Liu


Perceived Barriers and Facilitators to School Social Work Practice: A Mixed-Methods Study  

ERIC Educational Resources Information Center

Understanding barriers to practice is a growing area within school social work research. Using a convenience sample of 284 school social workers, this study replicates the efforts of a mixed-method investigation designed to identify barriers and facilitators to school social work practice within different geographic locations. Time constraints and…

Teasley, Martell; Canifield, James P.; Archuleta, Adrian J.; Crutchfield, Jandel; Chavis, Annie McCullough



Comparison of detrending methods for fluctuation analysis in hydrology  

NASA Astrophysics Data System (ADS)

Trends within a hydrologic time series can significantly influence the scaling results of fluctuation analysis, such as rescaled range (RS) analysis and (multifractal) detrended fluctuation analysis (MF-DFA). Therefore, removal of trends is important in the study of scaling properties of the time series. In this study, three detrending methods, including the adaptive detrending algorithm (ADA), the Fourier-based method, and the average removing technique, were evaluated by analyzing numerically generated series and observed streamflow series with an obvious, relatively regular periodic trend. Results indicated that: (1) the Fourier-based detrending method and ADA were similar in detrending practice, and given proper parameters these two methods can produce similarly satisfactory results; (2) series detrended by the Fourier-based method and ADA lose the fluctuation information at larger time scales, and the location of crossover points is heavily impacted by the chosen parameters of these two methods; and (3) the average removing method has an advantage over the other two methods, i.e., the fluctuation information at larger time scales is kept well, an indication of relatively reliable performance in detrending. In addition, the average removing method performed reasonably well in detrending a time series with regular periods or trends. In this sense, the average removing method should be preferred in the study of scaling properties of hydrometeorological series with a relatively regular periodic trend using MF-DFA.

Zhang, Qiang; Zhou, Yu; Singh, Vijay P.; Chen, Yongqin David



Validation of analytical methods in compliance with good manufacturing practice: a practical approach  

PubMed Central

Background The quality and safety of cell therapy products must be maintained throughout their production and quality control cycle, ensuring their final use in the patient. We validated the Limulus Amebocyte Lysate (LAL) test and immunophenotype according to International Conference on Harmonization Q2 Guidelines and the EU Pharmacopoeia, considering accuracy, precision, repeatability, linearity and range. Methods For the endotoxin test we used a kinetic chromogenic LAL test. As this is a limit test for the control of impurities, in compliance with International Conference on Harmonization Q2 Guidelines and the EU Pharmacopoeia, we evaluated the specificity and detection limit. For the immunophenotype test, an identity test, we evaluated specificity through the Fluorescence Minus One method, and we repeated all experiments thrice to verify precision. The immunophenotype validation required a performance qualification of the flow cytometer using two types of standard beads, which have to be used daily to check that the cytometer is set up reproducibly. The results were then compared. Collected data were statistically analyzed by calculating the mean, standard deviation and coefficient of variation percentage (CV%). Results The LAL test is repeatable and specific. The spike recovery value of each sample was between 0.25 EU/ml and 1 EU/ml with a CV% < 10%. The correlation coefficient (≥ 0.980) and CV% (< 10%) of the standard curve tested in duplicate showed the test's linearity and a minimum detectable concentration value of 0.005 EU/ml. The immunophenotype method performed thrice on our cell therapy products is specific and repeatable, as shown by an inter-experiment CV% < 10%. Conclusions Our data demonstrate that the validated analytical procedures are suitable as quality controls for the batch release of cell therapy products. 
Our paper could offer an important contribution for the scientific community in the field of CTPs, above all to small Cell Factories such as ours, where it is not always possible to have CFR21 compliant software. PMID:23981284
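The acceptance statistics mentioned above (mean, standard deviation, CV%) are straightforward to compute with the standard library; a minimal sketch (the example values are illustrative, not data from the study):

```python
import statistics

def cv_percent(values):
    """Coefficient of variation as a percentage: 100 * sample SD / mean."""
    return 100.0 * statistics.stdev(values) / statistics.fmean(values)

# e.g. replicate endotoxin spike recoveries (EU/ml); acceptance here is CV% < 10
recoveries = [0.48, 0.50, 0.52, 0.49, 0.51]
```

For the recoveries above, `cv_percent` returns about 3.2%, well inside a 10% acceptance criterion.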



Practical Methods for Locating Abandoned Wells in Populated Areas  

SciTech Connect

An estimated 12 million wells have been drilled during the 150 years of oil and gas production in the United States. Many old oil and gas fields are now populated areas where the presence of improperly plugged wells may constitute a hazard to residents. Natural gas emissions from wells have forced people from their houses and businesses and have caused explosions that injured or killed people and destroyed property. To mitigate this hazard, wells must be located and properly plugged, a task made more difficult by the presence of houses, businesses, and associated utilities. This paper describes well finding methods conducted by the National Energy Technology Laboratory (NETL) that were effective at two small towns in Wyoming and in a suburb of Pittsburgh, Pennsylvania.

Veloski, G.A.; Hammack, R.W.; Lynn, R.J.



Transforming Practice the philosophies, methods and impacts of our engagements with landscape  

E-print Network

Transforming Practice the philosophies, methods and impacts of our engagements with landscape End theme. Participants in the workshops have represented a range of disciplines, including archaeology: The Symposium

Guo, Zaoyang


Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture  


Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
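A toy reading of the weighted evidence-association scheme described above (the function name and the scoring rule are illustrative assumptions, not the patented method):

```python
def hypothesis_score(evidence):
    """Combine weighted evidence into a score in [-1, 1].

    evidence: list of (weight, polarity) pairs, where weight > 0 reflects
    the strength of the evidence's association with the hypothesis and
    polarity is +1 (supports) or -1 (refutes).  +1 means unanimous
    support, -1 unanimous refutation, 0 a balanced or empty picture.
    """
    total_weight = sum(w for w, _ in evidence)
    if total_weight == 0:
        return 0.0
    return sum(w * p for w, p in evidence) / total_weight
```

For example, one strongly supporting indicator (weight 2) against one weakly refuting one (weight 1) yields a mildly positive score of 1/3.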

Sanfilippo, Antonio P. (Richland, WA); Cowell, Andrew J. (Kennewick, WA); Gregory, Michelle L. (Richland, WA); Baddeley, Robert L. (Richland, WA); Paulson, Patrick R. (Pasco, WA); Tratz, Stephen C. (Richland, WA); Hohimer, Ryan E. (West Richland, WA)



Maximizing Return From Sound Analysis and Design Practices  

SciTech Connect

With today's tightening budgets, computer applications must provide "true" long-term benefit to the company. Businesses are spending large portions of their budgets "re-engineering" old systems to take advantage of "new" technology. But what they are really getting is simply a new interface implementing the same incomplete or poorly defined requirements as before. "True" benefit can only be gained if sound analysis and design practices are used. WHAT data and processes are required of a system is not the same as HOW the system will be implemented within a company. It is the System Analyst's responsibility to understand the difference between these two concepts. The paper discusses some simple techniques to be used during the Analysis and Design phases of projects, as well as the information gathered and recorded in each phase and how it is transformed between these phases. The paper also covers production applications generated using Oracle Designer. Applying these techniques to "real world" problems, the applications will meet the needs of today's business and adapt easily to ever-changing business environments.

Bramlette, Judith Lynn



A global analysis method for astrolabe observations  

NASA Astrophysics Data System (ADS)

In a previous paper (Chollet & Najid 1992), we gave the general principles of a new global method to analyze astrolabe observations. The fundamental equation was obtained from the classical one, in which the corrections to the star positions at the observational epoch are replaced by developments that contain the corrections to the star positions for the epoch of the catalogue, the proper motions, and the corrections to the precession and nutation constants. This computation gives a new equation in which the coefficients contain only two variable parameters, the azimuth and the sidereal time. The method proposed here consists in regarding the whole programme of star observations as a single group. All the possible values of the azimuth and of the sidereal time are obtained; in this case the column vectors of the coefficients are nearly orthogonal, and the matrix of the normal equations is practically diagonal. The only problem that remains is due to the variations of the apparent position of the station. These effects are removed by using the Earth rotation parameters given by the Bureau International de l'Heure (BIH) and connected to the International Earth Rotation Service (IERS) system by the Central Bureau of the IERS. The new corrected unknowns are then related to the mean position of the astrolabe in the IERS system. The method to obtain absolute declinations follows the form of the preceding relations. The same error, multiplied by different but known constants, affects the declination of each star, but also the latitude and zenith distance determinations. From these results, it is possible to recover the well-known result (Krejnin 1968) concerning the determination of absolute declinations. The comparison between the direct measurement and the result obtained from stellar observations will also give the systematic error in declination and latitude. 
The last important result is that the corrections to the precession and nutation constants appear in the equations without any perturbation due to the catalogue errors. This fact was noted in the past (Guinot 1970) but not used. The method given here does not use sophisticated techniques to analyze the observational data obtained by astrolabes; our purpose was rather to combine and to correct the data. The method was elaborated to analyze the observations of the future automatic astrolabes. It has been tested on the series of observations obtained at Paris Observatory, in anticipation of analysis using the future Hipparcos catalogue.

Chollet, F.



A mixed methods approach to understand variation in lung cancer practice and the role of guidelines  

PubMed Central

Introduction Practice pattern data demonstrate regional variation and lower than expected rates of adherence to practice guideline (PG) recommendations for the treatment of stage II/IIIA resected and stage IIIA/IIIB unresected non-small cell lung cancer (NSCLC) patients in Ontario, Canada. This study sought to understand how clinical decisions are made for the treatment of these patients and the role of PGs. Methods Surveys and key informant interviews were undertaken with clinicians and administrators. Results Participants reported favorable ratings for PGs and the evidentiary bases underpinning them. The majority of participants agreed more patients should have received treatment and that regional variation is problematic. Participants estimated that up to 30% of patients are not good candidates for treatment and up to 20% of patients refuse treatment. The most common barrier to implementing PGs was the lack of organizational support by clinical administrative leadership. There was concern that the trial results underpinning the PG recommendations were not generalizable to the typical patients seen in clinic. The qualitative analysis yielded five themes related to physicians’ decision making: the unique patient, the unique physician, the family, the clinical team, and the clinical evidence. A dynamic interplay between these factors exists. Conclusion Our study demonstrates the challenges inherent in (i) the complexity of clinical decision making; (ii) how quality of care problems are perceived and operationalized; and (iii) the clinical appropriateness and utility of PG recommendations. We argue that systematic and rigorous methodologies to help decision makers mitigate or negotiate these challenges are warranted. PMID:24655753



Low hardness organisms: Culture methods, sensitivities, and practical applications  

SciTech Connect

EPA regulations require biomonitoring of permitted effluent and stormwater runoff. Several permit locations in Virginia were studied that have supply water and/or stormwater runoff ranging in hardness from 5--30 mg/L. Ceriodaphnia dubia (dubia) and Pimephales promelas (fathead minnow) were tested in reconstituted water with hardnesses from 5--30 mg/L. Results indicated osmotic stresses in the acute tests with the fathead minnow as well as in the chronic tests for both the dubia and the fathead minnow. Culture methods were developed for both organism types in soft (30 mg/L) reconstituted freshwater. Reproduction and development for each organism type meet or exceed EPA testing requirements for moderately hard organisms. Sensitivities were measured over an 18-month interval using cadmium chloride as a reference toxicant. Additionally, sensitivities were charted against those of organisms cultured in moderately hard water. The comparison showed that the sensitivities of both the dubia and the fathead minnow cultured in 30 mg/L water increased, but remained within two standard deviations of the sensitivities of organisms cultured in moderately hard water. Latitude for use of organisms cultured in 30 mg/L water was documented for waters ranging in hardness from 10--100 mg/L with no acclimation period required. The stability of the organism sensitivity was also validated. The application was most helpful in stormwater runoff and in effluents where the hardness was 30 mg/L or less.

DaCruz, A.; DaCruz, N.; Bird, M.



Methods for Analysis of Outdoor Performance Data (Presentation)  

SciTech Connect

The ability to accurately predict power delivery over the course of time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted to power and how this conversion evolves over time. Accurate knowledge of power decline over time, also known as the degradation rate, is essential and important to all stakeholders--utility companies, integrators, investors, and scientists alike. Different methods to determine degradation rates from discrete versus continuous data are presented, and some general best-practice methods are outlined. In addition, historical degradation rates and some preliminary analysis with respect to climate are given.
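The abstract does not give its estimation procedures; as a minimal illustration of the simplest common approach, the sketch below (assumed names, synthetic noise-free data) estimates a degradation rate as the slope of an ordinary least-squares fit to a performance-ratio time series:

```python
import numpy as np

def degradation_rate(months, performance_ratio):
    """Annual degradation rate (%/year) from the slope of an ordinary
    least-squares line fit to a performance-ratio time series."""
    slope, intercept = np.polyfit(months, performance_ratio, 1)
    # slope is the PR change per month; express it as a percentage of
    # the fitted initial value, per year
    return 100.0 * slope * 12.0 / intercept

# Synthetic, noise-free series: PR starts at 0.90 and loses 0.5% per year
months = np.arange(120)                    # ten years of monthly data
pr = 0.90 * (1.0 - 0.005 * months / 12.0)
rate = degradation_rate(months, pr)        # recovers -0.5 %/year
```

Real field data would additionally need filtering and seasonality handling, which is where the discrete-versus-continuous distinction in the abstract matters; the sketch only shows the slope-based definition.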

Jordan, D.



Measuring solar reflectance - Part II: Review of practical methods  

SciTech Connect

A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R{sub g,0} can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R{sub g,0} to within 0.01 for surface slopes up to 5:12 [23°], and to within 0.02 for surface slopes up to 12:12 [45°]. An alternative pyranometer method minimizes shadow errors and can be used to measure R{sub g,0} of a surface as small as 1 m in diameter. The accuracy with which it can measure R{sub g,0} is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R{sub g,0}{sup *}, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R{sub g,0}{sup *} matches R{sub g,0} to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R{sub g,0}{sup *} by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R{sub g,0}{sup *} to within about 0.01. (author)
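The spectrophotometer quantity R{sub g,0}{sup *} described above is an irradiance-weighted spectral average. A minimal numerical sketch follows; the triangular irradiance shape is only a placeholder, not real AM1GH tables:

```python
import numpy as np

def solar_reflectance(spectral_reflectance, irradiance):
    """Irradiance-weighted average reflectance on a common wavelength
    grid: R* = sum(r * I) / sum(I), the discrete form of the ratio of
    spectral integrals."""
    r = np.asarray(spectral_reflectance, float)
    i = np.asarray(irradiance, float)
    return (r * i).sum() / i.sum()

# Placeholder spectral grid and irradiance shape; a real calculation
# would substitute tabulated AM1GH solar spectral irradiance here.
wavelength = np.linspace(300.0, 2500.0, 1101)              # nm
irradiance = np.interp(wavelength, [300.0, 500.0, 2500.0],
                       [0.0, 1.6, 0.05])                   # W m^-2 nm^-1

flat = np.full_like(wavelength, 0.25)    # spectrally flat 25% reflector
r_star = solar_reflectance(flat, irradiance)   # exactly 0.25
```

For a spectrally flat surface the weighting is irrelevant, which makes the flat reflector a convenient sanity check.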

Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul [Heat Island Group, Environmental Energy Technologies Division, Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States)



Airbreathing hypersonic vehicle design and analysis methods  

NASA Technical Reports Server (NTRS)

The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.



Improving educational environment in medical colleges through transactional analysis practice of teachers  

PubMed Central

Context: A FAIMER (Foundation for Advancement in International Medical Education and Research) fellow organized a comprehensive faculty development program to improve faculty awareness, resulting in changed teaching practices and better teacher-student relationships, using Transactional Analysis (TA). Practicing TA tools helps develop awareness of intrapersonal and interpersonal processes. Objectives: To improve self-awareness among medical educators; to bring about self-directed change in practices among medical educators; and to assess the usefulness of TA tools for the same. Methods: An experienced trainer conducted a basic course (12 hours) in TA for faculty members. The PAC model of personality structure, the functional fluency model of personal functioning, stroke theory on motivation, and the passivity and script theories of adult functional styles were taught experientially with examples from the medical education scenario. Self-reported improvement in awareness and changes in practices were assessed immediately after training, at three months, and at one year. Findings: The mean improvement in self-awareness was 13.3% (95% CI 9.3-17.2) among nineteen participants, and it persisted one year after training. Changes in practices within a year included collecting feedback, new teaching styles, and better relationships with students. Discussion and Conclusions: These findings demonstrate sustainable and measurable improvement in self-awareness through practice of TA tools. Improvement in the self-awareness of faculty resulted in self-directed changes in teaching practices. Medical faculty judged the TA tools effective for improving self-awareness, leading to self-directed changes. PMID:24358808

Rajan, Marina



Laboratory theory and methods for sediment analysis  

USGS Publications Warehouse

The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help ensure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.

Guy, Harold P.



Applications of Automation Methods for Nonlinear Fracture Test Analysis  

NASA Technical Reports Server (NTRS)

Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: (1) allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; (2) eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; (3) lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and (4) providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: (1) ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; (2) ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; and (3) ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

Allen, Phillip A.; Wells, Douglas N.



Four hour ambulation after angioplasty is a safe practice method  

PubMed Central

BACKGROUND: During the last three decades there has been an increasing tendency toward angioplasty because of its benefits. But the procedure has acute complications, such as bleeding and formation of hematoma at the site where the arterial sheath is removed. Based on researchers' clinical experience, patients need 8-12 hours of bed rest after coronary angioplasty. Determining the desirable time for bed rest after angioplasty and for removing the arterial sheath is the foundation of related research worldwide. Getting out of bed soon after angioplasty brings greater comfort, a shorter hospitalization period, fewer side effects of prolonged bed rest, and lower hospitalization expenses. Given the case for a shorter bed-rest time after angioplasty, the aim of this study was to assess the effect of the time of getting out of bed after angioplasty on complications after sheath removal in coronary angioplasty patients. METHODS: This was an experimental clinical study conducted in one step with two groups. The sample included 124 angioplasty patients (62 in each group) who were chosen randomly from the CCU of Shahid Chamran hospital of the Isfahan University of Medical Sciences in 2007. Data were gathered by observing and evaluating the patients, using a questionnaire and a checklist. After angioplasty, patients in the intervention group were taken out of bed at 4 hours and patients in the control group at 8 hours. After getting out of bed, patients were examined for bleeding and formation of hematoma at the arterial sheath removal site. Data were analyzed using descriptive and inferential statistics via SPSS software. RESULTS: Results showed no meaningful difference between the two groups after getting out of bed (p > 0.05) regarding relative frequency of bleeding (p = 0.50), formation of hematoma (p = 0.34), or average diameter of hematoma (p = 0.39).
CONCLUSIONS: Results of this study showed that reducing the bed-rest time to 4 hours after removing a size 7 arterial sheath does not increase bleeding or formation of hematoma at the removal site. Thus, angioplasty patients who are not in critical clinical condition and whose vital signs are stable can get out of bed 4 hours after sheath removal. PMID:21589772

Moeini, Mahin; Moradpour, Fatemeh; Babaei, Sima; Rafieian, Mohsen; Khosravi, Alireza



Causal Moderation Analysis Using Propensity Score Methods  

ERIC Educational Resources Information Center

This paper builds on previous studies applying propensity score methods to multiple treatment variables to examine causal moderator effects. The propensity score methods are demonstrated in a case study examining the causal moderator effect, where the moderators are categorical and continuous variables. Moderation analysis is an…

Dong, Nianbo



Relating Actor Analysis Methods to Policy Problems  

Microsoft Academic Search

For a policy analyst, the policy problem is the starting point of the policy analysis process. During this process the policy analyst structures the policy problem and chooses an appropriate set of methods or techniques to analyze it (Goeller 1984). The methods of the policy analyst are often referred to as the toolbox or toolkit of

T. E. Van der Lei



Botulinum toxin type A treatment to the upper face: retrospective analysis of daily practice  

PubMed Central

Background Botulinum toxin type A treatment has been used for over 20 years to enhance the appearance of the face. There are several commercially available botulinum toxin type A products used in aesthetic clinical practice. The aim of this retrospective analysis was to compare the clinical efficacy of the most commonly used botulinum toxin type A preparations in daily practice. Methods Physicians from 21 centers in Germany completed questionnaires based on an inspection of subject files for subjects 18 years of age or over who had received at least two, but not more than three, consecutive treatments with incobotulinumtoxinA, onabotulinumtoxinA, or abobotulinumtoxinA within a 12-month period in the previous 2 years. Data on subject and physician satisfaction, treatment intervals, dosages, and safety were collected from 1256 subjects. Results There were no statistically significant differences between incobotulinumtoxinA and onabotulinumtoxinA with respect to physician and subject satisfaction, dosages, and adverse effects experienced. Both botulinum toxin type A preparations were well tolerated and effective in the treatment of upper facial lines. Due to low treatment numbers, abobotulinumtoxinA was not included in the statistical analysis. Conclusion The results of this retrospective analysis confirm the results of prospective clinical trials by demonstrating that, in daily practice, incobotulinumtoxinA and onabotulinumtoxinA are used at a 1:1 dose ratio and display comparable efficacy and safety. PMID:22791996

Prager, Welf; Huber-Vorlander, Jurgen; Taufig, A Ziah; Imhof, Matthias; Kuhne, Ulrich; Weissberg, Ruth; Kuhr, Lars-Peter; Rippmann, Volker; Philipp-Dormston, Wolfgang G; Proebstle, Thomas M; Roth, Claudia; Kerscher, Martina; Ulmann, Claudius; Pavicic, Tatjana



Organizational climate and hospital nurses' caring practices: a mixed-methods study.  


Organizational climate in healthcare settings influences patient outcomes, but its effect on nursing care delivery remains poorly understood. In this mixed-methods study, nurse surveys (N = 292) were combined with a qualitative case study of 15 direct-care registered nurses (RNs), nursing personnel, and managers. Organizational climate explained 11% of the variation in RNs' reported frequency of caring practices. Qualitative data suggested that caring practices were affected by the interplay of organizational climate dimensions with patient and nurse characteristics. Workload intensity and role ambiguity led RNs to leave many caring practices to practical nurses and assistive personnel. Systemic interventions are needed to improve organizational climate and to support RNs' involvement in a full range of caring practices. PMID:24729389

Roch, Geneviève; Dubois, Carl-Ardy; Clarke, Sean P



Body politics : a Foucauldian discourse analysis of physiotherapy practice.  

E-print Network

This thesis offers new insights into physiotherapy practice by asking 'how is physiotherapy discursively constructed?' Physiotherapy is a large, well-established, orthodox health profession. Recent changes…

Nicholls, David A



Transonic wing analysis using advanced computational methods  

NASA Technical Reports Server (NTRS)

This paper discusses the application of three-dimensional computational transonic flow methods to several different types of transport wing designs. The purpose of these applications is to evaluate the basic accuracy and limitations associated with such numerical methods. The use of such computational methods for practical engineering problems can only be justified after favorable evaluations are completed. The paper summarizes a study of both the small-disturbance and the full potential technique for computing three-dimensional transonic flows. Computed three-dimensional results are compared to both experimental measurements and theoretical results. Comparisons are made not only of pressure distributions but also of lift and drag forces. Transonic drag rise characteristics are compared. Three-dimensional pressure distributions and aerodynamic forces, computed from the full potential solution, compare reasonably well with experimental results for a wide range of configurations and flow conditions.

Henne, P. A.; Hicks, R. M.



Methods for Evaluating Practice Change Toward a Patient-Centered Medical Home  

PubMed Central

PURPOSE Understanding the transformation of primary care practices to patient-centered medical homes (PCMHs) requires making sense of the change process, multilevel outcomes, and context. We describe the methods used to evaluate the country’s first national demonstration project of the PCMH concept, with an emphasis on the quantitative measures and lessons for multimethod evaluation approaches. METHODS The National Demonstration Project (NDP) was a group-randomized clinical trial of facilitated and self-directed implementation strategies for the PCMH. An independent evaluation team developed an integrated package of quantitative and qualitative methods to evaluate the process and outcomes of the NDP for practices and patients. Data were collected by an ethnographic analyst and a research nurse who visited each practice, and from multiple data sources including a medical record audit, patient and staff surveys, direct observation, interviews, and text review. Analyses aimed to provide real-time feedback to the NDP implementation team and lessons that would be transferable to the larger practice, policy, education, and research communities. RESULTS Real-time analyses and feedback appeared to be helpful to the facilitators. Medical record audits provided data on process-of-care outcomes. Patient surveys contributed important information about patient-rated primary care attributes and patient-centered outcomes. Clinician and staff surveys provided important practice experience and organizational data. Ethnographic observations supplied insights about the process of practice development. Most practices were not able to provide detailed financial information. CONCLUSIONS A multimethod approach is challenging, but feasible and vital to understanding the process and outcome of a practice development process. Additional longitudinal follow-up of NDP practices and their patients is needed. PMID:20530398

Jaen, Carlos Roberto; Crabtree, Benjamin F.; Palmer, Raymond F.; Ferrer, Robert L.; Nutting, Paul A.; Miller, William L.; Stewart, Elizabeth E.; Wood, Robert; Davila, Marivel; Stange, Kurt C.



Scholarship and practice: the contribution of ethnographic research methods to bridging the gap  

Microsoft Academic Search

Information systems research methods need to contribute to the scholarly requirements of the field of knowledge but also need to develop the potential to contribute to the practical requirements of practitioners' knowledge. This leads to possible conflicts in choosing research methods. Argues that the changing world of the IS practitioner is reflected in the changing world of the IS researcher

Lynda J. Harvey; Michael D. Myers



Researching "Practiced Language Policies": Insights from Conversation Analysis  

ERIC Educational Resources Information Center

In language policy research, "policy" has traditionally been conceptualised as a notion separate from that of "practice". In fact, language practices were usually analysed with a view to evaluating whether a policy is being implemented or resisted. Recently, however, Spolsky in ("Language Policy". Cambridge University Press, Cambridge, 2004;…

Bonacina-Pugh, Florence



Practical 1P2 Young's Modulus and Stress Analysis  

E-print Network

This practical ties in with the lecture courses on elasticity. It will help you understand: 1. Hooke's law. Overview: The objectives of this practical are 1. to demonstrate Hooke's law; 2. to determine the Young's modulus. The linearity of the graphs will demonstrate the validity of Hooke's law. Convert the strain gauge readings
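The snippet above is truncated; as an illustration of the underlying calculation, Young's modulus can be estimated as the slope of a least-squares line through stress-strain readings (synthetic data and assumed names, not the practical's actual apparatus):

```python
import numpy as np

def youngs_modulus(strain, stress):
    """Hooke's law: stress = E * strain in the linear region, so E is
    the slope of a least-squares line through the stress-strain data."""
    E, _intercept = np.polyfit(strain, stress, 1)
    return E

# Synthetic readings for a steel-like specimen (E = 200 GPa, stress in Pa)
strain = np.array([0.0, 0.0005, 0.0010, 0.0015, 0.0020])
stress = 200e9 * strain
E = youngs_modulus(strain, stress)    # ≈ 2.0e11 Pa
```

In the practical, the linearity of the stress-strain plot is itself the demonstration of Hooke's law; a visibly non-linear fit would indicate yielding or gauge error.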

Paxton, Anthony T.


Qualitative Analysis of Common Definitions for Core Advanced Pharmacy Practice Experiences  

PubMed Central

Objective. To determine how colleges and schools of pharmacy interpreted the Accreditation Council for Pharmacy Education’s (ACPE’s) Standards 2007 definitions for core advanced pharmacy practice experiences (APPEs), and how they differentiated community and institutional practice activities for introductory pharmacy practice experiences (IPPEs) and APPEs. Methods. A cross-sectional, qualitative, thematic analysis was done of survey data obtained from experiential education directors in US colleges and schools of pharmacy. Open-ended responses to invited descriptions of the 4 core APPEs were analyzed using grounded theory to determine common themes. Type of college or school of pharmacy (private vs public) and size of program were compared. Results. Seventy-one schools (72%) with active APPE programs at the time of the survey responded. Lack of strong frequent themes describing specific activities for the acute care/general medicine core APPE indicated that most respondents agreed on the setting (hospital or inpatient) but the student experience remained highly variable. Themes were relatively consistent between public and private institutions, but there were differences across programs of varying size. Conclusion. Inconsistencies existed in how colleges and schools of pharmacy defined the core APPEs as required by ACPE. More specific descriptions of core APPEs would help to standardize the core practice experiences across institutions and provide an opportunity for quality benchmarking. PMID:24954931

Danielson, Jennifer; Weber, Stanley S.



Best practices: applying management analysis of excellence to immunization.  


The authors applied business management tools to analyze and promote excellence and to evaluate differences between average and above-average immunization performers in private practices. The authors conducted a pilot study of 10 private practices in Pennsylvania using tools common in management to assess practices' organizational climate and managerial style. Authoritative and coaching styles of physician leaders were common to both groups. Managerial styles emphasizing higher levels of clarity and responsibility were evident in the large practices, while rewards and flexibility styles were higher in the small above-average practices. The findings of this pilot study match results seen in high performers in other industries. The study concludes that the authoritative style appears to have the most impact on performance, which has interesting implications for training and behavior change to improve immunization rates, alongside traditional medical interventions. PMID:15921143

Wishner, Amy; Aronson, Jerold; Kohrt, Alan; Norton, Gary



Spectroscopic chemical analysis methods and apparatus  

NASA Technical Reports Server (NTRS)

Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

Hug, William F. (Inventor); Reid, Ray D. (Inventor)



Statistical and methodological issues in the analysis of complex sample survey data: Practical guidance for trauma researchers  

Microsoft Academic Search

Standard methods for the analysis of survey data assume that the data arise from a simple random sample of the target population. In practice, analysts of survey data sets collected from nationally representative probability samples often pay little attention to important properties of the survey data. Standard statistical software procedures do not allow analysts to take these properties of survey
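As a minimal illustration of why these survey properties matter, the sketch below (hypothetical weights and outcomes) contrasts a naive mean with a design-weighted mean; procedures that assume simple random sampling return the biased value:

```python
import numpy as np

def weighted_mean(y, w):
    """Design-weighted mean: each respondent's weight w_i is the number
    of population members that respondent represents."""
    y = np.asarray(y, float)
    w = np.asarray(w, float)
    return (w * y).sum() / w.sum()

# Hypothetical design: group B sampled at five times the rate of
# group A, so A respondents carry weight 5 and B respondents weight 1.
y = [20, 20, 10, 10, 10, 10, 10]   # outcome: A-group ~20, B-group ~10
w = [5, 5, 1, 1, 1, 1, 1]
naive = float(np.mean(y))          # ~12.9, biased toward group B
corrected = weighted_mean(y, w)    # ~16.7, reflects the population mix
```

A full design-based analysis would also account for stratification and clustering when computing variances, which is exactly what the abstract says standard procedures omit.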

Brady T. West



A Practical Method for Determining the Corten-Dolan Exponent and Its Application to Fatigue Life Prediction  

NASA Astrophysics Data System (ADS)

Based on the derivation and calculation of the Corten-Dolan exponent d, a practical method of determining its value is proposed. This exponent depends not only upon the material, but also upon the load spectrum; its value is therefore obtained from a function that decreases with increasing stress amplitude. The exponent was investigated through analysis of fatigue damage evolution to determine its parameters. The proposed method has been validated against experimental data from the literature. Use of the modified Corten-Dolan model significantly improves its life prediction capability compared to the conventional model, in which the exponent is assumed constant.
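The abstract does not reproduce the model itself; for orientation, the classical Corten-Dolan life prediction under a multi-level spectrum is N = N1 / Σ_i α_i (σ_i/σ_1)^d, which the sketch below (hypothetical numbers) makes concrete:

```python
def corten_dolan_life(n1, sigma_1, levels, d):
    """Classical Corten-Dolan prediction: N = N1 / sum(alpha_i *
    (sigma_i / sigma_1) ** d), where N1 is the fatigue life at the
    highest stress level sigma_1, alpha_i is the fraction of cycles at
    level sigma_i, and d is the Corten-Dolan exponent."""
    damage_per_cycle = sum(alpha * (sigma / sigma_1) ** d
                           for sigma, alpha in levels)
    return n1 / damage_per_cycle

# Hypothetical two-level spectrum: 30% of cycles at sigma_1 = 500 MPa,
# 70% at 400 MPa, with N1 = 1e5 cycles and d = 6
levels = [(500.0, 0.3), (400.0, 0.7)]
life = corten_dolan_life(n1=1e5, sigma_1=500.0, levels=levels, d=6.0)
```

The paper's modification would replace the constant `d` above with a function of stress amplitude; the structure of the damage sum is otherwise unchanged.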

Zhu, Shun-Peng; Huang, Hong-Zhong; Liu, Yu; He, Li-Ping; Liao, Qiang



A Practical, Automated Quality Assurance Method for Measuring Spatial Resolution in PET  

PubMed Central

The use of different scanners, acquisition protocols, and reconstruction algorithms has been identified as a problem that limits the use of PET in multicenter trials. The aim of this project was to aid standardization of data collection by developing a quality assurance method for measuring the spatial resolution achieved with clinical imaging protocols. Methods A commercially available 68Ge cylinder phantom (diameter, 20 cm) with a uniform activity concentration was positioned in the center of the PET field of view, and an image was acquired using typical clinical parameters. Spatial resolution was measured by artificially generating an object function (O) with uniform activity within a 20-cm-diameter cylinder, assuming no noise and perfect spatial resolution, centered on the original image (I); dividing F[I] by F[O], where F indicates a 2-dimensional Fourier transform, to produce a modulation transfer function; and taking the inverse Fourier transform of the modulation transfer function to produce a point-spread function in image space. The method was validated using data acquired on 4 different commercial PET systems. Results Spatial resolution on the Discovery LS was measured at 5.75 ±0.58 mm, compared with 5.54 ±0.19 mm from separate point source measurements. Variability of the resolution measurements differed between scanners and protocols, but the typical SD was approximately 0.15 mm when iterative reconstruction was used. The potential for predicting resolution recovery coefficients for small objects was also demonstrated. Conclusion The proposed method does not require elaborate phantom preparation and is practical to perform, and data analysis is fully automated. This approach is useful for evaluating clinical reconstruction protocols across varying scanners and reconstruction algorithms and should greatly aid standardization of data collection between centers. PMID:19617324
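The Fourier-division step described in the Methods can be sketched as follows; this is a simplified version with a small regularization epsilon, and the paper's exact noise handling is not reproduced:

```python
import numpy as np

def point_spread_function(image, ideal):
    """Recover a 2-D PSF by Fourier division: the measured image is the
    ideal object convolved with the PSF, so MTF = F[image] / F[ideal]
    and PSF = F^-1[MTF]. A tiny epsilon guards near-zero frequency
    components of the object transform (a simplification)."""
    mtf = np.fft.fft2(image) / (np.fft.fft2(ideal) + 1e-12)
    return np.fft.fftshift(np.real(np.fft.ifft2(mtf)))

# Demonstration: uniform disk "phantom" blurred by a known Gaussian.
n = 64
y, x = np.indices((n, n))
disk = (((x - n // 2) ** 2 + (y - n // 2) ** 2) <= 20 ** 2).astype(float)
gauss = np.exp(-((x - n // 2) ** 2 + (y - n // 2) ** 2) / (2 * 2.0 ** 2))
gauss /= gauss.sum()
blurred = np.real(np.fft.ifft2(np.fft.fft2(disk) *
                               np.fft.fft2(np.fft.ifftshift(gauss))))
psf = point_spread_function(blurred, disk)  # recovers the Gaussian kernel
```

On real scanner data the division must be stabilized against noise at frequencies where the object transform is small, which is where the practical engineering of the published method lies.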

Lodge, Martin A.; Rahmim, Arman; Wahl, Richard L.



Application of quality-improvement methods in a community practice: the Sandhills Pediatrics Asthma Initiative.  


This case study demonstrates the use of quality improvement methods to improve asthma care in a busy community practice. The practice used disease-management strategies, such as population identification, self-management education, and performance measurement and feedback. The practice then applied several practice-based quality improvement methods, such as PDSA cycles, to improve care. From 1998 to 2003, process measures, such as staging of asthmatics, use of long-term control medications, use of peak flow meters and spacers, and use of action plans, improved. There was also a substantial decrease in emergency department use and hospitalizations among patients with asthma. Although there have been several studies demonstrating the efficacy of disease management strategies, most lack generalizability to community practices. Often, interventions are so intensive and cumbersome that they are unlikely to be replicated in primary care settings. Researchers have been unable to determine which components of the interventions are most effective and replicable. Furthermore, many studies of disease management strategies enroll participants who lack the co-morbidities seen in community practice. There are also few studies of disadvantaged populations that face other barriers to care, such as lack of transportation, poor access to specialists, and medical illiteracy. In this case study, there were several unique factors that enabled the practice to improve care for this population. The AccessCare case manager who worked with the practice not only provided data and feedback to the practice team, but also served as an improvement "coach," often pushing the team and facilitating many of the improvement efforts. AccessCare's approach is in contrast to many of the commercial disease management companies' "carve out" models that do not sufficiently involve providers or practices in their interventions.
The other necessary ingredient for success in this project was organizational leadership and support. The leaders of the practice saw beyond the usual metrics of patient visit counts and relative value units (RVUs) to embrace the concept of population health: the notion that practices are not only responsible for providing acute, episodic care in the office, but also for improving health outcomes in the community in which they serve. Other important factors included ensuring a basic agreement among providers on the need for improvement and frequent communication about the goals of the project. Although the champions of the project tried to minimize formal meeting time, there was frequent informal communication between team members. In the future, there is a need to develop other approaches to stimulate these endeavors in community practices, such as "pay for performance" programs, continuing education credit, and tying maintenance of board certification to quality improvement initiatives. PMID:16130947

Wroth, Thomas H; Boals, Joseph C



Simplified method for nonlinear structural analysis  

NASA Technical Reports Server (NTRS)

A simplified inelastic analysis computer program was developed for predicting the stress-strain history of a thermomechanically cycled structure from an elastic solution. The program uses an iterative and incremental procedure to estimate the plastic strains from the material stress-strain properties and a simulated plasticity hardening model. The simplified method was exercised on a number of problems involving uniaxial and multiaxial loading, isothermal and nonisothermal conditions, and different materials and plasticity models. Good agreement was found between these analytical results and nonlinear finite element solutions for these problems. The simplified analysis program used less than 1 percent of the CPU time required for a nonlinear finite element analysis.
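The iterative elastic-to-plastic correction the abstract describes can be illustrated with a minimal sketch. This is not the NASA program's code: it is a uniaxial elastic-perfectly-plastic update of our own, with illustrative values for the modulus `E` and `yield_stress`, showing how a plastic strain is recovered from an elastic predictor.

```python
# Hedged sketch (not the paper's program): a uniaxial elastic-perfectly-plastic
# stress update. E and yield_stress are illustrative values (MPa).
def update_stress(total_strain, E=200e3, yield_stress=250.0):
    """Return (stress, plastic_strain) for a given total uniaxial strain."""
    trial = E * total_strain                      # elastic predictor
    if abs(trial) <= yield_stress:
        return trial, 0.0                         # purely elastic response
    stress = yield_stress if trial > 0 else -yield_stress
    plastic = total_strain - stress / E           # plastic corrector
    return stress, plastic
```

A simulated hardening model, as in the abstract, would replace the constant `yield_stress` with a state-dependent one; the predictor/corrector structure stays the same.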

Kaufman, A.



Design analysis, robust methods, and stress classification  

SciTech Connect

This special edition publication volume is comprised of papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25-29, 1993 in Denver, Colorado. The papers were prepared for presentations in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

Bees, W.J. (ed.)



Simplified Analysis Methods for Primary Load Designs at Elevated Temperatures  

SciTech Connect

The use of simplified (reference stress) analysis methods is discussed and illustrated for primary load high temperature design. Elastic methods are the basis of the ASME Section III, Subsection NH primary load design procedure. There are practical drawbacks with this approach, particularly for complex geometries and temperature gradients. The paper describes an approach which addresses these difficulties through the use of temperature-dependent elastic-perfectly plastic analysis. Correction factors are defined to address difficulties traditionally associated with discontinuity stresses, inelastic strain concentrations and multiaxiality. A procedure is identified to provide insight into how this approach could be implemented but clearly there is additional work to be done to define and clarify the procedural steps to bring it to the point where it could be adapted into code language.

Carter, Peter [Stress Engineering Services Inc.]; Jetter, Robert I. [Consultant]; Sham, Sam [ORNL]



Degradation of learned skills: Effectiveness of practice methods on visual approach and landing skill retention  

NASA Technical Reports Server (NTRS)

Flight control and procedural task skill degradation, and the effectiveness of retraining methods were evaluated for a simulated space vehicle approach and landing under instrument and visual flight conditions. Fifteen experienced pilots were trained and then tested after 4 months either without the benefits of practice or with static rehearsal, dynamic rehearsal or with dynamic warmup practice. Performance on both the flight control and procedure tasks degraded significantly after 4 months. The rehearsal methods effectively countered procedure task skill degradation, while dynamic rehearsal or a combination of static rehearsal and dynamic warmup practice was required for the flight control tasks. The quality of the retraining methods appeared to be primarily dependent on the efficiency of visual cue reinforcement.

Sitterley, T. E.; Zaitzeff, L. P.; Berge, W. A.



TECHNICAL REVIEW A practical guide to methods of parentage analysis  

E-print Network

(Jeffreys et al. 1985). This multi-locus DNA fingerprinting approach was rapidly adopted by avian … the spread of DNA fingerprinting applications outside of birds and mammals. Several years after the development of DNA fingerprinting, the discovery of microsatellite markers (Tautz 1989), also known as simple

Jones, Adam


Application of inertia methods to benthic marine ecology: Practical implications of the basic options*1  

Microsoft Academic Search

The various so-called inertia methods, the aim of which is to summarize the relationships between points by a configuration of reduced dimension, may all be considered variants of a more general method. In this regard, the links existing between Principal Component Analysis, Principal Coordinates Analysis and the Analysis of Correspondences appear clearly. Three fundamental options (inherent in all inertia
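The dimension-reducing "inertia" summarization the abstract refers to can be sketched with the simplest of the family, Principal Component Analysis via eigendecomposition of the covariance matrix. The function name and data are our own illustration, not the authors' code:

```python
import numpy as np

def principal_components(X, k=2):
    """Project points onto the k directions of greatest inertia (variance):
    a minimal PCA by eigendecomposition of the covariance matrix."""
    Xc = X - X.mean(axis=0)                 # center the cloud of points
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)        # ascending eigenvalues
    order = np.argsort(vals)[::-1][:k]      # keep the k largest
    return Xc @ vecs[:, order]              # scores in reduced dimension
```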

P. Chardy; M. Glemarec; A. Laurec



Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues  

SciTech Connect

This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

Ronald Laurids Boring



Common Goals for the Science and Practice of Behavior Analysis: A Response to Critchfield  

ERIC Educational Resources Information Center

In his scholarly and thoughtful article, "Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis," Critchfield (2011) discussed the science-practice frictions to be expected in any professional organization that attempts to combine these interests. He suggested that the Association for Behavior Analysis

Schneider, Susan M.



Internet Practices of Certified Rehabilitation Counselors and Analysis of Guidelines for Ethical Internet Practices  

ERIC Educational Resources Information Center

The Internet has become an integral part of the practice of rehabilitation counseling. To identify potential ethical issues regarding the use of the Internet by counselors, two studies were conducted. In Study 1, we surveyed a national sample of rehabilitation counselors regarding their use of technology in their work and home settings. Results…

Lehmann, Ilana S.; Crimando, William



Analysis of sourcing & procurement practices : a cross industry framework  

E-print Network

This thesis presents and analyzes the various practices in the functional area of Sourcing and Procurement. The 21 firms that are studied operate in one of the following industries: Aerospace, Apparel/ Footwear, Automotive, ...

Koliousis, Ioannis G



On exploratory factor analysis: a review of recent evidence, an assessment of current practice, and recommendations for future use.  


Exploratory factor analysis (hereafter, factor analysis) is a complex statistical method that is integral to many fields of research. Using factor analysis requires researchers to make several decisions, each of which affects the solutions generated. In this paper, we focus on five major decisions that are made in conducting factor analysis: (i) establishing how large the sample needs to be, (ii) choosing between factor analysis and principal components analysis, (iii) determining the number of factors to retain, (iv) selecting a method of data extraction, and (v) deciding upon the methods of factor rotation. The purpose of this paper is threefold: (i) to review the literature with respect to these five decisions, (ii) to assess current practices in nursing research, and (iii) to offer recommendations for future use. The literature reviews illustrate that factor analysis remains a dynamic field of study, with recent research having practical implications for those who use this statistical method. The assessment was conducted on 54 factor analysis (and principal components analysis) solutions presented in the results sections of 28 papers published in the 2012 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. The main findings from the assessment were that researchers commonly used (a) participants-to-items ratios for determining sample sizes (used for 43% of solutions), (b) principal components analysis (61%) rather than factor analysis (39%), (c) the eigenvalues greater than one rule and scree tests to decide upon the numbers of factors/components to retain (61% and 46%, respectively), (d) principal components analysis and unweighted least squares as methods of data extraction (61% and 19%, respectively), and (e) the Varimax method of rotation (44%). In general, well-established, but outdated, heuristics and practices informed decision making with respect to the performance of factor analysis in nursing studies. 
Based on the findings from factor analysis research, it seems likely that the use of such methods may have had a material, adverse effect on the solutions generated. We offer recommendations for future practice with respect to each of the five decisions discussed in this paper. PMID:24183474
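The eigenvalues-greater-than-one (Kaiser) rule the review criticises is mechanical enough to state in a few lines. A minimal sketch, with an illustrative function name of our own, applied to the correlation matrix of the raw data:

```python
import numpy as np

def kaiser_retained(data):
    """Count components retained under the eigenvalues-greater-than-one rule,
    applied to the correlation matrix (the heuristic the review criticises)."""
    corr = np.corrcoef(data, rowvar=False)
    eigvals = np.linalg.eigvalsh(corr)
    return int(np.sum(eigvals > 1.0))
```

On data built from two latent factors, the rule recovers two components; its known failure modes (sensitivity to the number of items per factor, for instance) are exactly why scree tests and parallel analysis are preferred.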

Gaskin, Cadeyrn J; Happell, Brenda



Integrated method for chaotic time series analysis  


Methods and apparatus are disclosed for automatically detecting differences between similar but different states of a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.
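The pipeline in the abstract (windowed measures, trends, comparison against a baseline state) can be sketched generically. The sketch below is our own illustration, not the patented apparatus: the nonlinear measure is passed in as a function, and the detection criterion (3-sigma departure from the baseline windows) is an assumed choice.

```python
import numpy as np

def state_change_detector(x, win, measure):
    """Split x into non-overlapping windows, apply a (nonlinear) measure to
    each, and flag windows whose measure departs from the baseline trend.
    The 3-sigma threshold and half-length baseline are illustrative choices."""
    n = len(x) // win
    vals = np.array([measure(x[i * win:(i + 1) * win]) for i in range(n)])
    base_mu, base_sd = vals[:n // 2].mean(), vals[:n // 2].std()
    return np.abs(vals - base_mu) > 3 * base_sd   # per-window change flags
```

In practice `measure` would be a chaotic-time-series statistic (correlation dimension, Kolmogorov entropy); any per-window scalar works with the same skeleton.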

Hively, L.M.; Ng, E.G.



Model validation based on residuals analysis method  

Microsoft Academic Search

A model validation approach based on residuals analysis is presented for uncertain systems with unmodelled dynamics. Because of unmodelling errors, the residual signal is induced not only by noise but also by the unmodelling error. Therefore, in the open-loop condition, model validation is first transformed into a hypothesis validation, and a new residual estimation method is proposed. Though analyzing

Zong Qun; Dou Liqian; Sun Liankun; Liu Wenjing



Methods for Chemical Analysis of Fresh Waters.  

ERIC Educational Resources Information Center

This manual, one of a series prepared for the guidance of research workers conducting studies as part of the International Biological Programme, contains recommended methods for the analysis of fresh water. The techniques are grouped in the following major sections: Sample Taking and Storage; Conductivity, pH, Oxidation-Reduction Potential,…

Golterman, H. L.


Statistical Methods in Algorithm Design and Analysis.  

ERIC Educational Resources Information Center

The use of statistical methods in the design and analysis of discrete algorithms is explored. The introductory chapter contains a literature survey and background material on probability theory. In Chapter 2, probabilistic approximation algorithms are discussed with the goal of exposing and correcting some oversights in previous work. Chapter 3…

Weide, Bruce W.


Expression Data Analysis Systems and Methods.  

National Technical Information Service (NTIS)

Systems and methods for performing rapid genomic DNA analysis of samples, such as control samples and experimental samples. In one aspect, the system makes use of genomic DNA input, rather than gene expression input such as mRNA and/or cDNA associated wit...

D. Roopenian, D. J. Shaffer, K. D. Mills, S. Akilesh



Application of Stacking Technique in ANA: Method and Practice with PKU Seismological Array  

NASA Astrophysics Data System (ADS)

Cross-correlation of ambient noise records is now routinely used to obtain dispersion curves and then perform seismic tomography; however, little attention has been paid to array techniques. We present a spatial-stacking method to obtain high-resolution dispersion curves and demonstrate it on observation data from the PKU seismological array. Empirical Green's Functions are generally obtained by correlation between two stations, and the dispersion curves are then obtained from frequency-time analysis (FTAN). The popular way to obtain high-resolution dispersion curves is to use long time records. At the same time, to obtain a usable signal, the distance between the two stations must be at least 3 times the longest wavelength, so we need both long time records and appropriately spaced stations. We use a new method, spatial stacking, which allows a shorter observation period and utilizes observations from a group of closely distributed stations to obtain fine dispersion curves. We correlate the observations of every station in the group with those of a far station, and then stack the results. However, we cannot simply stack them unless the stations in the group lie on a circle centered on the far station, owing to the dispersion characteristics of Rayleigh waves. Thus we apply an anti-dispersion correction to the observation data of every station in the array before stacking. We test the method using theoretical seismic surface-wave records, both with and without noise, computed with qseis06 by Rongjiang Wang. For the case of three imaginary stations spaced 1 degree apart with the same underground structure and without noise, the center station had the same dispersion with and without spatial stacking. We then add noise to the theoretical records; the center station's dispersion curves obtained by our method are much closer to the noise-free dispersion curve than the contaminated ones. 
We can see that our method has improved the resolution of the dispersion curve. We then use real data from the PKU array, whose station interval is about 10 km, and permanent IRIS stations far (more than 200 km) from the PKU array, to test the method. First, we compare the stacked correlation results of three consecutive stations with the unstacked ones, finding that the former give better dispersion-curve resolution. Second, we compare the stacked results with one year of the center station's traditional correlations, and find that the two fit very well.
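The core correlate-and-stack step can be sketched in a few lines. This is our own illustration and deliberately omits the anti-dispersion alignment the abstract describes; it only shows how normalized correlograms from a close-spaced group are stacked against a common far station to raise signal-to-noise:

```python
import numpy as np

def stacked_cross_correlation(far_trace, group_traces):
    """Correlate a far station with each station of a closely spaced group
    and stack the peak-normalized correlograms (anti-dispersion omitted)."""
    stack = np.zeros(2 * len(far_trace) - 1)
    for tr in group_traces:
        cc = np.correlate(tr, far_trace, mode="full")
        stack += cc / np.max(np.abs(cc))    # normalize before stacking
    return stack / len(group_traces)
```

Zero lag sits at index `len(far_trace) - 1` of the full correlogram, which is where the stack peaks when the group traces are copies of the far trace.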

Liu, J.; Tang, Y.; Ning, J.; Chen, Y. J.



Practical considerations for volumetric wear analysis of explanted hip arthroplasties  

PubMed Central

Objectives Wear debris released from bearing surfaces has been shown to provoke negative immune responses in the recipient. Excessive wear has been linked to early failure of prostheses. Analysis using coordinate measuring machines (CMMs) can provide estimates of total volumetric material loss of explanted prostheses and can help to understand device failure. The accuracy of volumetric testing has been debated, with some investigators stating that only protocols involving hundreds of thousands of measurement points are sufficient. We looked to examine this assumption and to apply the findings to the clinical arena. Methods We examined the effects on the calculated material loss from a ceramic femoral head when different CMM scanning parameters were used. Calculated wear volumes were compared with gold standard gravimetric tests in a blinded study. Results Various scanning parameters including point pitch, maximum point to point distance, the number of scanning contours or the total number of points had no clinically relevant effect on volumetric wear calculations. Gravimetric testing showed that material loss can be calculated to provide clinically relevant degrees of accuracy. Conclusions Prosthetic surfaces can be analysed accurately and rapidly with currently available technologies. Given these results, we believe that routine analysis of explanted hip components would be a feasible and logical extension to National Joint Registries. Cite this article: Bone Joint Res 2014;3:60–8. PMID:24627327

Langton, D. J.; Sidaginamale, R. P.; Holland, J. P.; Deehan, D.; Joyce, T. J.; Nargol, A. V. F.; Meek, R. D.; Lord, J. K.



Methods of quantitative fire hazard analysis  

SciTech Connect

Simplified fire hazard analysis methods have been developed as part of the FIVE risk-based fire induced vulnerability evaluation methodology for nuclear power plants. These fire hazard analyses are intended to permit plant fire protection personnel to conservatively evaluate the potential for credible exposure fires to cause critical damage to essential safe-shutdown equipment and thereby screen from further analysis spaces where a significant fire hazard clearly does not exist. This document addresses the technical bases for the fire hazard analysis methods. A separate user's guide addresses the implementation of the fire screening methodology, which has been implemented with three worksheets and a number of look-up tables. The worksheets address different locations of targets relative to exposure fire sources. The look-up tables address fire-induced conditions in enclosures in terms of three stages: a fire plume/ceiling jet period, an unventilated enclosure smoke filling period and a ventilated quasi-steady period.

Mowrer, Frederick W. (Adelphi, MD, United States)



Multiple predictor smoothing methods for sensitivity analysis.  

SciTech Connect

The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
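The smoothing-based sensitivity idea (how much of Var(y) is explained by the conditional mean E[y|x]) can be illustrated without LOESS or projection pursuit. The sketch below is a crude binned stand-in of our own, not the paper's procedure: it replaces a smoother with quantile-bin averages.

```python
import numpy as np

def binned_sensitivity(x, y, bins=20):
    """Crude nonparametric first-order sensitivity: variance of the binned
    conditional mean E[y|x] divided by Var(y). A stand-in for the paper's
    smoothing-based (LOESS, additive-model) measures."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return np.var(cond_means) / np.var(y)
```

A nonlinear but strong relationship (y = x^2) scores high even though its linear correlation is near zero, which is exactly the advantage over linear and rank regression that the abstract reports.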

Helton, Jon Craig; Storlie, Curtis B.



Digital Forensics Analysis of Spectral Estimation Methods  

E-print Network

Steganography is the art and science of writing hidden messages in such a way that no one apart from the intended recipient knows of the existence of the message; today it is widely used to secure information. In this paper, the traditional spectral estimation methods are introduced. The performance of each method is examined by comparing all of the spectral estimation methods, and from those performance analyses a brief summary of the pros and cons of each method is given. We also give a steganography demo by hiding information in a sound signal and pulling the information (i.e., the true frequency of the information signal) back out of the sound by means of the spectral estimation methods.
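The simplest of the traditional spectral estimators, the periodogram, already suffices to recover the frequency of a tone buried in a carrier. A minimal sketch (our own illustration, not the paper's demo code):

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Classical periodogram estimate: return the non-DC frequency bin
    carrying the most power."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2   # periodogram (unnormalized)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]     # skip the DC bin
```

Parametric estimators (Yule-Walker, Burg, MUSIC) would replace the FFT line; the recover-the-tone workflow is the same.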

Mataracioglu, Tolga



Modeling of 2D diffusion processes based on microscopy data: parameter estimation and practical identifiability analysis  

PubMed Central

Background Diffusion is a key component of many biological processes such as chemotaxis, developmental differentiation and tissue morphogenesis. Recently, it has become possible to assess the spatial gradients caused by diffusion in-vitro and in-vivo using microscopy-based imaging techniques. The resulting time series of two-dimensional, high-resolution images in combination with mechanistic models enable the quantitative analysis of the underlying mechanisms. However, such a model-based analysis is still challenging due to measurement noise and sparse observations, which result in uncertainties of the model parameters. Methods We introduce a likelihood function for image-based measurements with log-normal distributed noise. Based upon this likelihood function we formulate the maximum likelihood estimation problem, which is solved using PDE-constrained optimization methods. To assess the uncertainty and practical identifiability of the parameters we introduce profile likelihoods for diffusion processes. Results and conclusion As proof of concept, we model certain aspects of the guidance of dendritic cells towards lymphatic vessels, an example for haptotaxis. Using a realistic set of artificial measurement data, we estimate the five kinetic parameters of this model and compute profile likelihoods. Our novel approach for the estimation of model parameters from image data as well as the proposed identifiability analysis approach is widely applicable to diffusion processes. The profile likelihood based method provides more rigorous uncertainty bounds in contrast to local approximation methods. PMID:24267545
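Profile likelihoods themselves are model-agnostic and easy to illustrate away from the PDE setting. A toy sketch of our own (a Gaussian model, not the paper's diffusion model): fix the parameter of interest on a grid and maximize the likelihood over the nuisance parameter at each grid point.

```python
import numpy as np

def profile_negloglik(data, mu_grid):
    """Profile the Gaussian negative log-likelihood over the mean: for each
    fixed mu, the nuisance sigma^2 is maximized out in closed form."""
    n = len(data)
    prof = []
    for mu in mu_grid:
        s2 = np.mean((data - mu) ** 2)               # MLE of sigma^2 given mu
        prof.append(0.5 * n * (np.log(2 * np.pi * s2) + 1))
    return np.array(prof)
```

Thresholding the profile at its minimum plus a chi-square quantile yields the likelihood-based confidence intervals that the abstract contrasts with local (Hessian-based) approximations.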



Education Policy as a Practice of Power: Theoretical Tools, Ethnographic Methods, Democratic Options  

ERIC Educational Resources Information Center

This article outlines some theoretical and methodological parameters of a critical practice approach to policy. The article discusses the origins of this approach, how it can be uniquely adapted to educational analysis, and why it matters--not only for scholarly interpretation but also for the democratization of policy processes as well. Key to…

Levinson, Bradley A. U.; Sutton, Margaret; Winstead, Teresa



Mixed-methods research in pharmacy practice: basics and beyond (part 1).  


This is the first of two papers which explore the use of mixed-methods research in pharmacy practice. In an era of evidence-based medicine and policy, high-quality research evidence is essential for the development of effective pharmacist-led services. Over the past decade, the use of mixed-methods research has become increasingly common in healthcare, although to date its use has been relatively limited in pharmacy practice research. In this article, the basic concepts of mixed-methods research including its definition, typologies and advantages in relation to pharmacy practice research are discussed. Mixed-methods research brings together qualitative and quantitative methodologies within a single study to answer or understand a research problem. There are a number of mixed-methods designs available, but the selection of an appropriate design must always be dictated by the research question. Importantly, mixed-methods research should not be seen as a 'tool' to collect qualitative and quantitative data, rather there should be some degree of 'integration' between the two data sets. If conducted appropriately, mixed-methods research has the potential to generate quality research evidence by combining strengths and overcoming the respective limitations of qualitative and quantitative methodologies. PMID:23418918

Hadi, Muhammad Abdul; Alldred, David Phillip; Closs, S José; Briggs, Michelle



Text analysis devices, articles of manufacture, and text analysis methods  


Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.

Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C



Practical Implementation of New Particle Tracking Method to the Real Field of Groundwater Flow and Transport  

PubMed Central

Abstract In articles published in 2009 and 2010, Suk and Yeh reported the development of an accurate and efficient particle tracking algorithm for simulating a path line under complicated unsteady flow conditions, using a range of elements within finite elements in multidimensions. Here two examples, an aquifer storage and recovery (ASR) example and a landfill leachate migration example, are examined to enhance the practical implementation of the proposed particle tracking method, known as Suk's method, to a real field of groundwater flow and transport. Results obtained by Suk's method are compared with those obtained by Pollock's method. Suk's method produces superior tracking accuracy, which suggests that Suk's method can describe more accurately various advection-dominated transport problems in a real field than existing popular particle tracking methods, such as Pollock's method. To illustrate the wide and practical applicability of Suk's method to random-walk particle tracking (RWPT), the original RWPT has been modified to incorporate Suk's method. Performance of the modified RWPT using Suk's method is compared with the original RWPT scheme by examining the concentration distributions obtained by the modified RWPT and the original RWPT under complicated transient flow systems. PMID:22476629
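Pollock's method, the baseline the abstract compares against, admits a compact one-cell illustration: within a cell the face-to-face velocity varies linearly, so the travel time to the downstream face has a closed form. A minimal 1-D sketch (our own; Suk's method itself, handling general element shapes, is not reproduced here):

```python
import math

def pollock_exit_time(x, x1, x2, v1, v2):
    """Pollock's semi-analytical step in one cell: with velocity varying
    linearly from v1 at face x1 to v2 at face x2, the travel time from the
    particle position x to the downstream face is ln(v2/vp)/A."""
    A = (v2 - v1) / (x2 - x1)            # velocity gradient in the cell
    vp = v1 + A * (x - x1)               # velocity at the particle position
    if abs(A) < 1e-12:
        return (x2 - x) / vp             # uniform velocity: simple kinematics
    return math.log(v2 / vp) / A
```

For v1 = 1, v2 = 2 on a unit cell the trajectory is x(t) = e^t - 1, so a particle starting at the upstream face exits at t = ln 2, which the formula reproduces exactly.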

Suk, Heejun



Connecting Practice, Theory and Method: Supporting Professional Doctoral Students in Developing Conceptual Frameworks  

ERIC Educational Resources Information Center

From an instrumental view, conceptual frameworks that are carefully assembled from existing literature in Educational Technology and related disciplines can help students structure all aspects of inquiry. In this article we detail how the development of a conceptual framework that connects theory, practice and method is scaffolded and facilitated…

Kumar, Swapna; Antonenko, Pavlo



Communities of Practice: A Research Paradigm for the Mixed Methods Approach  

ERIC Educational Resources Information Center

The mixed methods approach has emerged as a "third paradigm" for social research. It has developed a platform of ideas and practices that are credible and distinctive and that mark the approach out as a viable alternative to quantitative and qualitative paradigms. However, there are also a number of variations and inconsistencies within the mixed…

Denscombe, Martyn



Econometric Methods for Causal Evaluation of Education Policies and Practices: A Non-Technical Guide  

ERIC Educational Resources Information Center

Education policy-makers and practitioners want to know which policies and practices can best achieve their goals. But research that can inform evidence-based policy often requires complex methods to distinguish causation from accidental association. Avoiding econometric jargon and technical detail, this paper explains the main idea and intuition…

Schlotter, Martin; Schwerdt, Guido; Woessmann, Ludger



Praxis to Practice: Putting Qualitative Methods To Work for Rural Education.  

ERIC Educational Resources Information Center

This paper examines issues and areas of concern for the educational researcher moving from the relative safety of academic research to the more perilous arena of practice-oriented or action-oriented qualitative research. The first question is one of purity or objectivity: giving credibility to research results by imposing adequate rigor on methods

Lawrence, Barbara Kent


Practical method for determining the minimum embedding dimension of a scalar time series  

Microsoft Academic Search

A practical method is proposed to determine the minimum embedding dimension from a scalar time series. It has the following advantages: (1) does not contain any subjective parameters except for the time-delay for the embedding; (2) does not strongly depend on how many data points are available; (3) can clearly distinguish deterministic signals from stochastic signals; (4) works well for
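This abstract corresponds to Cao's E1 statistic for choosing the minimum embedding dimension. A compact sketch of the core quantity, under the Chebyshev norm (an assumed choice; variable names are ours):

```python
import numpy as np

def cao_E(x, d, tau=1):
    """Cao's E(d): mean ratio of nearest-neighbour distances when the
    delay embedding grows from d to d+1 dimensions (Chebyshev norm)."""
    n = len(x) - d * tau
    emb_d  = np.column_stack([x[i * tau : i * tau + n] for i in range(d)])
    emb_d1 = np.column_stack([x[i * tau : i * tau + n] for i in range(d + 1)])
    ratios = []
    for i in range(n):
        dist = np.max(np.abs(emb_d - emb_d[i]), axis=1)
        dist[i] = np.inf                       # exclude the self-match
        j = int(np.argmin(dist))               # nearest neighbour in d dims
        if dist[j] > 0:
            ratios.append(np.max(np.abs(emb_d1[i] - emb_d1[j])) / dist[j])
    return float(np.mean(ratios))

def E1(x, d, tau=1):
    """E1(d) = E(d+1)/E(d); it saturates near 1 once d reaches the
    minimum embedding dimension."""
    return cao_E(x, d + 1, tau) / cao_E(x, d, tau)
```

One scans d and looks for the plateau of E1 near 1; since the (d+1)-dimensional distance can only grow, E(d) is always at least 1.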

Cao Liangyue



Using the Patient as Teacher: A Training Method for Family Practice Residents in Behavioral Science  

Microsoft Academic Search

Since the inception of family medicine as a specialty in allopathy and osteopathy in 1969 and 1973, respectively, there has been a need to develop integrative approaches of teaching behavioral science concepts without violating the scope of practice limitations between the fields. We describe a collaborative training method by which we attempt to achieve this balance. Residents referring patients for

Janis L. Lewis; DeVon R. Stokes; Lawrence R. Fischetti; Aaron L. Rutledge



Urinary tract infection in general practice: Direct antibiotic sensitivity testing as a potential diagnostic method  

Microsoft Academic Search

Direct Antibiotic Sensitivity Testing (DST) is a rapid means of diagnosing urinary tract infection (UTI) and obtaining antibiotic sensitivity patterns of the infecting organisms. In this study 227 urine samples from general practice were analysed using this technique and the results obtained were compared with those obtained using the standard laboratory method. DST was shown to be 94.6% sensitive, and
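The 94.6% figure is the standard diagnostic sensitivity of DST against the laboratory reference. A minimal sketch of the two accuracy measures involved (the counts in the usage test are hypothetical, chosen only to reproduce a 94.6% sensitivity, and are not the study's data):

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Standard diagnostic accuracy measures against a reference method:
    sensitivity = TP/(TP+FN), specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)
```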

P. G. Scully; B. O’Shea; K. P. Flanagan; F. R. Falkiner



Stabilization of Peat by Deep Mixing Method: A Critical Review of the State of Practice  

Microsoft Academic Search

The purpose of this paper is to advance knowledge of peat soil stabilization by critically examining and documenting the current state of practice. The emphasis is on column-type Deep Mixing techniques using lime/cement. This paper is essentially a comprehensive review of the available academic literature on deep soil stabilization utilizing this approach. Deep mixing with lime or lime-cement column

Shahidul Islam; Roslan Hashim


Optimisation of Lime-Soda process parameters for reduction of hardness in aqua-hatchery practices using Taguchi methods.  


This paper presents the optimisation of Lime-Soda process parameters for the reduction of hardness in aqua-hatchery practices in the context of M. rosenbergii. Fresh water used in the development of fisheries needs to be of suitable quality, and lack of desirable quality in available fresh water is generally the confronting restraint. On the Indian subcontinent, groundwater is the only source of raw water; it has a varying degree of hardness and is thus unsuitable for fresh-water prawn hatchery practices (M. rosenbergii). In order to make use of hard water in the context of aqua-hatchery, the Lime-Soda process has been recommended, and the efficacy of the various process parameters, such as lime, soda ash and detention time, on the reduction of hardness needs to be examined. This paper determines the parameter settings for the CIFE well water, which is quite hard, using the Taguchi experimental design method. Taguchi orthogonal arrays, the signal-to-noise ratio and analysis of variance (ANOVA) have been applied to determine the dosages and to analyse their effect on hardness reduction. Tests carried out with optimal levels of the Lime-Soda process parameters confirmed the efficacy of the Taguchi optimisation method. Emphasis has been placed on optimising the chemical doses required to reduce total hardness using the Taguchi method and ANOVA, to suit the available raw-water quality for aqua-hatchery practices, especially for the fresh-water prawn M. rosenbergii. PMID:24749379
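The signal-to-noise ratio at the heart of the Taguchi analysis is a one-line formula. Since residual hardness should be as low as possible, the smaller-the-better form applies; the sketch below is the standard textbook definition, not the paper's code:

```python
import numpy as np

def sn_smaller_is_better(y):
    """Taguchi smaller-the-better signal-to-noise ratio (dB):
    S/N = -10 * log10(mean(y_i^2)). Residual hardness should be low,
    so the parameter level with the highest S/N is preferred."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))
```

In a full Taguchi analysis one averages these S/N values per factor level across the orthogonal-array runs and picks the level with the largest mean; ANOVA then apportions the variation among lime, soda ash and detention time.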

Yavalkar, S P; Bhole, A G; Babu, P V Vijay; Prakash, Chandra



Comparison of analysis methods for airway quantification  

NASA Astrophysics Data System (ADS)

Diseased airways have been known for several years as a possible contributing factor to airflow limitation in Chronic Obstructive Pulmonary Diseases (COPD). Quantification of disease severity through the evaluation of airway dimensions - wall thickness and lumen diameter - has gained increased attention, thanks to the availability of multi-slice computed tomography (CT). Novel approaches have focused on automated methods of measurement as a faster and more objective means than the visual assessment routinely employed in the clinic. Since the Full-Width Half-Maximum (FWHM) method of airway measurement was introduced two decades ago [1], several new techniques for quantifying airways have been detailed in the literature, but no approach has truly become a standard for such analysis. Our own research group has presented two alternative approaches for determining airway dimensions, one involving a minimum path and the other active contours [2, 3]. With an increasing number of techniques dedicated to the same goal, we decided to take a step back and analyze the differences between these methods. We consequently put our two methods of analysis and the FWHM approach to the test. We first measured a set of 5 airways from a phantom of known dimensions. Then we compared measurements from the three methods to those of two independent readers, performed on 35 airways in 5 patients. We elaborate on the differences of each approach and suggest conclusions on which could be defined as the best one.
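The FWHM baseline the abstract refers to reduces to locating the half-maximum crossings of an intensity profile sampled along a ray through the airway wall. A minimal 1-D sketch (our own; it assumes the profile rises from below half-maximum at both ends):

```python
import numpy as np

def fwhm_width(profile, xs=None):
    """Full-Width Half-Maximum of a 1-D intensity profile: linearly
    interpolate the two half-maximum crossings and return their distance."""
    p = np.asarray(profile, dtype=float)
    xs = np.arange(len(p)) if xs is None else np.asarray(xs, dtype=float)
    half = p.min() + (p.max() - p.min()) / 2.0
    above = p >= half
    i = np.argmax(above)                         # first sample at/above half-max
    j = len(p) - 1 - np.argmax(above[::-1])      # last sample at/above half-max
    left = xs[i - 1] + (half - p[i - 1]) * (xs[i] - xs[i - 1]) / (p[i] - p[i - 1])
    right = xs[j] + (half - p[j]) * (xs[j + 1] - xs[j]) / (p[j + 1] - p[j])
    return right - left
```

Applied to a CT wall profile, `right - left` is the wall-thickness estimate; the minimum-path and active-contour alternatives in [2, 3] replace this per-ray rule with a global boundary model.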

Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.



Measurement methods for human exposure analysis.  

PubMed Central

The general methods used to complete measurements of human exposures are identified, and illustrations are provided for the cases of indirect and direct methods used for exposure analysis. The application of the techniques for external measurements of exposure, microenvironmental and personal monitors, is placed in the context of the need to test hypotheses concerning the biological effects of concern. The linkage of external measurements to measurements made in biological fluids is explored for a suite of contaminants. This information is placed in the context of the scientific framework used to conduct exposure assessment. Examples are taken from research on volatile organics and from a large-scale problem: hazardous waste sites. PMID:7635110

Lioy, P J



A Comparative Analysis of Ethnomedicinal Practices for Treating Gastrointestinal Disorders Used by Communities Living in Three National Parks (Korea)  

PubMed Central

The purpose of this study is to comparatively analyze the ethnomedicinal practices on gastrointestinal disorders within communities in Jirisan National Park, Gayasan National Park, and Hallasan National Park of Korea. Data was collected through participant observations and in-depth interviews with semi-structured questionnaires. Methods for comparative analysis included the informant consensus factor, fidelity level, and internetwork analysis. A total of 490 ethnomedicinal practices recorded from the communities were classified into 110 families, 176 genera, and 220 species that included plants, animals, fungi, and algae. The informant consensus factor values in the disorder categories were highest for enteritis and gastralgia (1.0), followed by indigestion (0.94), constipation (0.93), and abdominal pain and gastroenteric trouble (0.92). In terms of fidelity levels, 71 plant species showed fidelity levels of 100%. In the internetwork analysis between disorders and all medicinal species, the species are grouped in the center by the four categories of indigestion, diarrhea, abdominal pain, and gastroenteric trouble, respectively. Regarding the research method of this study, the comparative analysis methods will contribute to the availability of orally transmitted ethnomedicinal knowledge. Among the methods of analysis, the use of internetwork analysis as a tool in this study provides informative internetwork maps between gastrointestinal disorders and medicinal species. PMID:25202330

Kim, Hyun; Song, Mi-Jang; Brian, Heldenbrand; Choi, Kyoungho



Cost management practices for supply chain management: an exploratory analysis  

Microsoft Academic Search

Cost management within a supply chain management domain has lately received a great deal of interest from academics and practitioners; however, the literature is still dominated by conceptual and anecdotal work. The major issue is that it is difficult at best to draw conclusions with any level of confidence concerning the actual degree of usage of various cost management practices.

Stephan M. Wagner



New HRM Practices and Exploitative Innovation: A Shopfloor Level Analysis  

Microsoft Academic Search

Extant research documents a positive relationship between the adoption of new human resource management (HRM) practices at the managerial and shopfloor level, and innovation performance, respectively. However, studies focusing on the managerial level distinguish between different types of innovation, while studies at the shopfloor level regard innovation as a homogenous activity. No previous studies have explicitly accounted for innovation heterogeneity

Grazia D. Santangelo; Paolo Pini



Mentoring Beginning Teachers in Secondary Schools: An Analysis of Practice  

ERIC Educational Resources Information Center

The conditions that promote best practice in the mentoring of beginning teachers in secondary schools are explored in this paper in relation to the experiential model of learning put forward by Kolb [(1984). "Experiential learning: Experience as the source of learning and development." New York: Prentice-Hall]. The underpinning processes of this…

Harrison, Jennifer; Dymoke, Sue; Pell, Tony



Professional Learning in Rural Practice: A Sociomaterial Analysis  

ERIC Educational Resources Information Center

Purpose: This paper aims to examine the professional learning of rural police officers. Design/methodology/approach: This qualitative case study involved interviews and focus groups with 34 police officers in Northern Scotland. The interviews and focus groups were transcribed and analysed, drawing on practice-based and sociomaterial learning…

Slade, Bonnie



Honesty in Critically Reflective Essays: An Analysis of Student Practice  

ERIC Educational Resources Information Center

In health professional education, reflective practice is seen as a potential means for self-improvement from everyday clinical encounters. This study aims to examine the level of student honesty in critical reflection, and barriers and facilitators for students engaging in honest reflection. Third year physiotherapy students, completing summative…

Maloney, Stephen; Tai, Joanna Hong-Meng; Lo, Kristin; Molloy, Elizabeth; Ilic, Dragan



A Practical Guide to Data Analysis for Physical Science Students  

Microsoft Academic Search

This textbook is intended for undergraduates who are carrying out laboratory experiments in the physical sciences for the first time. It is a practical guide on how to analyze data and estimate errors. The necessary formulas for performing calculations are given, and the ideas behind them are explained, although this is not a formal text on statistics. Specific examples are

Louis Lyons



Initial Public Offerings: An Analysis of Theory and Practice  

Microsoft Academic Search

We survey 336 chief financial officers (CFOs) to compare practice to theory in the areas of initial public offering (IPO) motivation, timing, underwriter selection, underpricing, signaling, and the decision to remain private. We find the primary motivation for going public is to facilitate acquisitions. CFOs base IPO timing on overall market conditions, are well informed regarding expected underpricing, and feel




Primary methods of measurement in chemical analysis  

Microsoft Academic Search

Primary methods of measurement have a central function in metrology. They are an essential component in the realisation of the SI units and therefore are indispensable for establishing traceability of measurements of all kinds of physical quantities to the corresponding SI units. This is also true for chemical analysis. Gravimetry, titrimetry, coulometry, and isotope dilution mass spectrometry (IDMS) are evaluated

W. Richter



Finite Volume Methods: Foundation and Analysis  

NASA Technical Reports Server (NTRS)

Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semiconductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the attainment of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
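As a concrete instance of the building blocks listed above, here is a minimal slope-limited (minmod) MUSCL update for the linear advection equation u_t + a·u_x = 0 with a > 0 and periodic boundaries. The scheme is conservative by construction (a telescoping flux difference) and, for CFL numbers c = a·Δt/Δx ≤ 1, satisfies the discrete TVD property. This is a textbook sketch, not code from the article, and the function names are ours:

```python
def minmod(a, b):
    """Minmod slope limiter: zero at extrema, smallest-magnitude slope otherwise."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def advect_step(u, c):
    """One MUSCL step for u_t + a u_x = 0, a > 0, on a periodic grid.

    `c` is the CFL number a*dt/dx (must be <= 1). Each cell gets a
    limited slope, the upwind (left) interface state is evaluated, and
    the conservative flux-difference update is applied.
    """
    n = len(u)
    slopes = [minmod(u[i] - u[i - 1], u[(i + 1) % n] - u[i]) for i in range(n)]
    # Upwind interface values at i+1/2.
    face = [u[i] + 0.5 * (1.0 - c) * slopes[i] for i in range(n)]
    return [u[i] - c * (face[i] - face[i - 1]) for i in range(n)]
```

The minmod limiter drops to first-order upwinding at local extrema, which is exactly the mechanism that prevents the second-order reconstruction from creating new oscillations.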

Barth, Timothy; Ohlberger, Mario



Spectroscopic chemical analysis methods and apparatus  

NASA Technical Reports Server (NTRS)

Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)



A Mixed Methods Content Analysis of the Research Literature in Science Education  

ERIC Educational Resources Information Center

In recent years, more and more researchers in science education have been turning to the practice of combining qualitative and quantitative methods in the same study. This approach of using mixed methods creates possibilities to study the various issues that science educators encounter in more depth. In this content analysis, I evaluated 18…

Schram, Asta B.



Data Analysis Methods for Library Marketing  

NASA Astrophysics Data System (ADS)

Our society is rapidly changing into an information society, in which people's needs and requests regarding information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information. Libraries have to know the profiles of their patrons in order to fulfil such a role. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. Then we demonstrate its usefulness through some examples of analysis methods applied to the circulation records in Kyushu University and Guacheon Library, and some implications obtained from the results of these methods. Our research is a first step towards a future in which library marketing is an indispensable tool.

Minami, Toshiro; Kim, Eunja


Review of Computational Stirling Analysis Methods  

NASA Technical Reports Server (NTRS)

Nuclear thermal to electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in current designs could be better understood. However, they are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of Stirling loss understanding may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra Hi-Fi technique, is presented in detail.

Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.



Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis  

PubMed Central

Neither practitioners nor scientists appear to be fully satisfied with the world's largest behavior-analytic membership organization. Each community appears to believe that initiatives that serve the other will undermine the association's capacity to serve their own needs. Historical examples suggest that such discord is predicted when practitioners and scientists cohabit the same association. This is true because all professional associations exist to address guild interests, and practice and science are different professions with different guild interests. No association, therefore, can succeed in being all things to all people. The solution is to assure that practice and science communities are well served by separate professional associations. I comment briefly on how this outcome might be promoted. PMID:22532750

Critchfield, Thomas S



An International Analysis of the Role of Religion and Spirituality in Social Work Practice  

E-print Network

assess the importance of religion and spirituality in their practices? As an opportunity to reflect on our own practices, what can Norwegian and U.S. social workers learn from each other regarding the integration of religion and spirituality? ABSTRACT: As service populations have...

Furman, Leola Dyrud; Zahl, Mari-Anne; Benson, Perry W.; Canda, Edward R.



Economic Analysis of Integrated Crop Management Practices of 'Navel' Oranges  

Microsoft Academic Search

The effect of various integrated crop management practices on productivity (fruit yield, grade, and size) and returns of 'Washington Navel' oranges (Citrus sinensis (L.) Osbeck) was determined in the San Joaquin Valley of California. Seventy-two combinations of treatments comprised of three irrigation levels (80%, 100%, and 120% evapotranspiration demand (ETc)), three N fertilizer levels (low, medium, and high based on 2.3%, 2.5%,

John A. Menge; John E. Pehrson; Jewell L. Meyer; Charles W. Coggins


Regulating forest practices in Texas: a problem analysis  

E-print Network

application in future research efforts dealing with impacts of various forest practices. Dr. R. G. Merrifield, Head, Department of Forest Science, served as a lighthouse throughout the problem selection and research processes. His experience and judgement... organizations. 2) Implementation through public ownership and control. Public forest managers may be likened to the trustees of a trust (the forest) and the public to its beneficiaries. Viewed in this light these managers have an obligation to be cognizant...

Dreesen, Alan D



Physical activity assessment in practice: a mixed methods study of GPPAQ use in primary care  

PubMed Central

Background Insufficient physical activity (PA) levels which increase the risk of chronic disease are reported by almost two-thirds of the population. More evidence is needed about how PA promotion can be effectively implemented in general practice (GP), particularly in socio-economically disadvantaged communities. One tool recommended for the assessment of PA in GP and supported by NICE (National Institute for Health and Care Excellence) is The General Practice Physical Activity Questionnaire (GPPAQ) but details of how it may be used and of its acceptability to practitioners and patients are limited. This study aims to examine aspects of GPPAQ administration in non-urgent patient contacts using different primary care electronic recording systems and to explore the views of health professionals regarding its use. Methods Four general practices, selected because of their location within socio-economically disadvantaged areas, were invited to administer GPPAQs to patients, aged 35-75 years, attending non-urgent consultations, over two-week periods. They used different methods of administration and different electronic medical record systems (EMIS, Premiere, Vision). Participants' (general practitioners (GPs), nurses and receptionists) views regarding GPPAQ use were explored via questionnaires and focus groups. Results Of 2,154 eligible consultations, 192 (8.9%) completed GPPAQs; of these 83 (43%) were categorised as inactive. All practices were located within areas ranked as being in the tertile of greatest socio-economic deprivation in Northern Ireland. GPs/nurses in two practices invited completion of the GPPAQ, receptionists did so in two. One practice used an electronic template; three used paper copies of the questionnaires. End-of-study questionnaires, completed by 11 GPs, 3 nurses and 2 receptionists, and two focus groups, with GPs (n = 8) and nurses (n = 4), indicated that practitioners considered the GPPAQ easy to use but not in every consultation. 
Its use extended consultation time, particularly for patients with complex problems who could potentially benefit from PA promotion. Conclusions GPs and nurses reported that the GPPAQ itself was an easy tool with which to assess PA levels in general practice and feasible to use in a range of electronic record systems but integration within routine practice is constrained by time and complex consultations. Further exploration of ways to facilitate PA promotion into practice is needed. PMID:24422666






Comparison and Cost Analysis of Drinking Water Quality Monitoring Requirements versus Practice in Seven Developing Countries  

PubMed Central

Drinking water quality monitoring programs aim to support provision of safe drinking water by informing water quality management. Little evidence or guidance exists on best monitoring practices for low resource settings. Lack of financial, human, and technological resources reduce a country's ability to monitor water supply. Monitoring activities were characterized in Cambodia, Colombia, India (three states), Jordan, Peru, South Africa, and Uganda according to water sector responsibilities, monitoring approaches, and marginal cost. The seven study countries were selected to represent a range of low resource settings. The focus was on monitoring of microbiological parameters, such as E. coli, coliforms, and H2S-producing microorganisms. Data collection involved qualitative and quantitative methods. Across seven study countries, few distinct approaches to monitoring were observed, and in all but one country all monitoring relied on fixed laboratories for sample analysis. Compliance with monitoring requirements was highest for operational monitoring of large water supplies in urban areas. Sample transport and labor for sample collection and analysis together constitute approximately 75% of marginal costs, which exclude capital costs. There is potential for substantive optimization of monitoring programs by considering field-based testing and by fundamentally reconsidering monitoring approaches for non-piped supplies. This is the first study to look quantitatively at water quality monitoring practices in multiple developing countries. PMID:25046632

Crocker, Jonny; Bartram, Jamie



A practical method to determine the heating and cooling curves of x-ray tube assemblies  

SciTech Connect

A practical method to determine the heating and cooling curves of x-ray tube assemblies with a rotating anode x-ray tube is proposed. Available procedures to obtain these curves as described in the literature are performed during operation of the equipment, and the precision of the method depends on the knowledge of the total energy applied in the system. In the present work we describe procedures which use a calorimetric system and do not require the operation of the x-ray equipment. The method was applied successfully to an x-ray tube assembly that was under test in our laboratory.

Bottaro, M.; Moralles, M.; Viana, V.; Donatiello, G. L.; Silva, E. P. [Instituto de Eletrotecnica e Energia da Universidade de Sao Paulo, Av. Prof. Luciano Gualberto, 1289, CEP 05508-010, Sao Paulo, SP (Brazil); Instituto de Pesquisas Energeticas e Nucleares, Av. Prof. Lineu Prestes, 2.242, CEP 05508-000 Sao Paulo, SP (Brazil); Instituto de Eletrotecnica e Energia da Universidade de Sao Paulo, Av. Prof. Luciano Gualberto, 1289, CEP 05508-010, Sao Paulo, SP (Brazil)



Practice characteristics and prior authorization costs: secondary analysis of data collected by SALT-Net in 9 central New York primary care practices  

PubMed Central

Background An increase in prior authorization (PA) requirements from health insurance companies is placing administrative and financial burdens on primary care offices across the United States. As time allocation for these cases continues to grow, physicians are concerned with additional workload and inefficiency in the workplace. The objective is to estimate the effects of practice characteristics on time spent per prior authorization request in primary care practices. Methods Secondary analysis was performed using data on nine primary care practices in Central New York. Practice characteristics and demographics were collected at the onset of the study. In addition, participants were instructed to complete an "event form" (EF) to document each prior authorization event during a 4–6 week period; prior authorizations included requests for medication as well as other health care services. Stepwise Ordinary Least Squares (OLS) Regression was used to model Time in Minutes of each event as an outcome of various factors. Results Prior authorization events (N = 435) took roughly 20 minutes to complete (beta = 20.017, p



Searching Usenet for Virtual Communities of Practice: Using Mixed Methods to Identify the Constructs of Wenger's Theory  

ERIC Educational Resources Information Center

Introduction: This research set out to determine whether communities of practice can be entirely Internet-based by formally applying Wenger's theoretical framework to Internet collectives. Method: A model of a virtual community of practice was developed which included the constructs Wenger identified in co-located communities of practice: mutual…

Murillo, Enrique



A high-efficiency aerothermoelastic analysis method  

NASA Astrophysics Data System (ADS)

In this paper, a high-efficiency aerothermoelastic analysis method based on unified hypersonic lifting surface theory is established. The method adopts a two-way coupling form that links the structure, the aerodynamic force, aerodynamic heating, and heat conduction. The aerodynamic force is first calculated based on unified hypersonic lifting surface theory, and the Eckert reference temperature method is then used to solve the temperature field, where the transient heat conduction is solved using Fourier's law and the modal method is used for the aeroelastic correction. Finally, flutter is analyzed based on the p-k method. The aerothermoelastic behavior of a typical hypersonic low-aspect-ratio wing is then analyzed, and the results indicate the following: (1) the combined effects of the aerodynamic load and the thermal load both deform the wing, and these effects increase with the flexibility, size, and flight time of the hypersonic aircraft; (2) the effect of heat accumulation should be noted, and the trajectory parameters should therefore be considered in the design of hypersonic flight vehicles to avoid hazardous conditions, such as flutter.

Wan, ZhiQiang; Wang, YaoKun; Liu, YunZhen; Yang, Chao



Evaluating Practical Negotiating Agents: Results and Analysis of the 2011 International Competition  

E-print Network

Evaluating Practical Negotiating Agents: Results and Analysis of the 2011 International Competition — to develop successful automated negotiation agents for scenarios where there is no information about the opponent, to advance the state-of-the-art in the area of practical bilateral multi-issue negotiations, and to encourage the design of agents

Ito, Takayuki


A Practical Ion Trap Mass Spectrometer for the Analysis of Peptides by Matrix-Assisted Laser  

E-print Network

A practical ion trap mass spectrometer for the analysis of peptides by matrix-assisted laser desorption/ionization of peptide ions. The new instrument is demonstrated to be a highly practical tool for analyzing proteins. In particular, mixtures containing as many as 30 peptide components can be rapidly and sensitively analyzed

Chait, Brian T.


3D scanning technology as a standard archaeological tool for pottery analysis: practice and theory  

E-print Network

3D scanning technology as a standard archaeological tool for pottery analysis: practice and theory. Its applications as a practical tool to accompany and serve archaeological projects did not reach
Avshalom Karasik (The Institute of Archaeology, The Hebrew University, Mount Scopus), Uzy Smilansky

Smilansky, Uzy


Bridging Work Practice and System Design: Integrating Systemic Analysis, Appreciative Intervention and Practitioner Participation  

Microsoft Academic Search

This article discusses the integration of work practice and system design. By scrutinising the unfolding discourse of workshop participants, the co-construction of work practice issues as relevant design considerations is described. Through a mutual exploration of ethnography and participatory design, the contributing constituents to the co-construction process are identified and put forward as elements in the integration of 'systemic analysis'

Helena Karasti



Stirling Analysis Comparison of Commercial vs. High-Order Methods  

NASA Technical Reports Server (NTRS)

Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's Compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.

Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako



Results from three years of the world's largest interlaboratory comparison for total mercury and methylmercury: Method performance and best practices  

NASA Astrophysics Data System (ADS)

Brooks Rand Instruments has conducted the world's largest interlaboratory comparison study for total mercury and methylmercury in natural waters annually for three years. Each year, roughly 50 laboratories registered to participate and the majority of participants submitted results. Each laboratory was assigned a performance score based on the distance between its results and the consensus mean, as well as the precision of its replicate analyses. Participants were also asked to provide detailed data on their analytical methodology and equipment. We used the methodology data and performance scores to assess the performance of the various methods reported and equipment used. Although the majority of methods in use show no systematic trend toward poor analytical performance, there are noteworthy exceptions. We present results from each of the three years of the interlaboratory comparison exercise, as well as aggregated method performance data. We compare the methods used in this study to methods from other published interlaboratory comparison studies and present a list of recommended best practices. Our goals in creating a list of best practices are to maximize participation, ensure inclusiveness, minimize non-response bias, guarantee high data quality, and promote transparency of analysis. We seek to create a standardized methodology for interlaboratory comparison exercises for total mercury and methylmercury analysis in water, which will lead to more directly comparable results between studies. We show that in most cases, the coefficient of variation between labs measuring replicates of the same sample is greater than 20% after the removal of outlying data points (e.g. Figure 1). It is difficult to make comparisons between studies and ecosystems with such a high variability between labs. We highlight the need for regular participation in interlaboratory comparison studies and continuous analytical method improvement in order to ensure accurate data.
Figure 1. Results from one sample analyzed in the 2013 Interlaboratory Comparison Study.
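The between-lab statistic quoted above is the coefficient of variation after outlier removal. A minimal sketch of how such a figure can be computed follows; the z-score outlier criterion and the sample values are our assumptions for illustration, not the study's documented procedure:

```python
import statistics

def cv_after_outlier_removal(values, z_cut=2.0):
    """Coefficient of variation (%) after discarding z-score outliers.

    Illustrative only: points with |z| > z_cut against the initial
    mean/stdev are dropped, then CV = 100 * stdev / mean is computed on
    the remainder. Needs at least two surviving values.
    """
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    kept = [v for v in values if sd == 0 or abs(v - mean) / sd <= z_cut]
    return 100.0 * statistics.stdev(kept) / statistics.mean(kept)

# e.g. cv_after_outlier_removal([1.0, 1.2, 0.9, 1.1, 5.0], z_cut=1.5)
# drops the 5.0 reading before computing the CV of the remaining labs.
```

Even with an aggressive outlier rule, inter-lab CVs above 20% (as the study reports) imply that absolute concentrations from different labs are hard to compare without shared reference exercises.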

Creswell, J. E.; Engel, V.; Carter, A.; Davies, C.



A practical method for the detection of freezing of gait in patients with Parkinson's disease  

PubMed Central

Purpose Freezing of gait (FOG), which increases fall risk and limits quality of life, is common at the advanced stage of Parkinson’s disease, typically at older ages. A simple and unobtrusive FOG detection system with a small calculation load would make fast, on-demand cueing possible. The purpose of this study was to find a practical FOG detection system. Patients and methods A sole-mounted sensor system was developed for an unobtrusive measurement of acceleration during gait. Twenty patients with Parkinson’s disease participated in this study. A simple and fast time-domain method for FOG detection was suggested and compared with the conventional frequency-domain method. The parameters used in FOG detection were optimized for each patient. Results The calculation load was 1,154 times less in the time-domain method than in the conventional method, and the FOG detection performance was comparable between the two domains (P=0.79) and depended on the window length (P<0.01) and the dimension of sensor information (P=0.03). Conclusion A minimally constraining sole-mounted sensor system was developed, and the suggested time-domain method showed FOG detection performance comparable to that of the conventional frequency-domain method. Three-dimensional sensor information and a 3–4-second window length were desirable. The suggested system is expected to have more practical clinical applications. PMID:25336936
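The conventional frequency-domain approach mentioned above is commonly a "freeze index": the ratio of spectral power in a freeze band to power in a locomotor band. The sketch below contrasts it with a cheap time-domain surrogate; the band edges, sampling rate, and the surrogate feature are our assumptions from the general FOG literature, not this paper's exact method:

```python
import numpy as np

FS = 100  # sampling rate in Hz -- an assumption; the abstract gives none

def freeze_index(acc, fs=FS):
    """Frequency-domain feature: ratio of spectral power in a 'freeze'
    band (3-8 Hz) to a locomotor band (0.5-3 Hz). Band edges follow
    common FOG literature, not necessarily this paper."""
    spec = np.abs(np.fft.rfft(acc - acc.mean())) ** 2
    f = np.fft.rfftfreq(len(acc), d=1.0 / fs)
    freeze = spec[(f >= 3) & (f <= 8)].sum()
    loco = spec[(f >= 0.5) & (f < 3)].sum()
    return freeze / max(loco, 1e-12)

def window_std(acc):
    """A cheap time-domain surrogate: the amplitude (std) of the window,
    which collapses when stepping stops -- illustrating why a time-domain
    rule can be orders of magnitude cheaper than an FFT per window."""
    return acc.std()

# synthetic 3 s windows: 2 Hz 'stepping' vs. faster, smaller 'trembling'
t = np.arange(0, 3, 1.0 / FS)
walk = np.sin(2 * np.pi * 2 * t)
tremble = 0.4 * np.sin(2 * np.pi * 6 * t)
```

On these synthetic windows the freeze index is below 1 for the stepping signal and well above 1 for the trembling signal, while the time-domain feature separates them with no transform at all.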

Kwon, Yuri; Park, Sang Hoon; Kim, Ji-Won; Ho, Yeji; Jeon, Hyeong-Min; Bang, Min-Jung; Jung, Gu-In; Lee, Seon-Min; Eom, Gwang-Moon; Koh, Seong-Beom; Lee, Jeong-Whan; Jeon, Heung Seok



A Practical Approach for Performance Analysis of Shared-Memory Programs  

E-print Network

and methodologies, as well as the availability of multicore systems. However, performance analysis of parallel pro… % for UMA and 11% on NUMA. Our analysis shows that speedup loss is dominated by memory contention.

Teo, Yong-Meng


A Practical Blended Analysis for Dynamic Features in JavaScript  

E-print Network

Framework is designed to perform a general-purpose, practical combined static/dynamic analysis of JavaScript. The idea of blended analysis is to focus static analysis on a dynamic calling structure collected at runtime in a lightweight manner, and to refine the static analysis using additional dynamic information

Ryder, Barbara G.


Language Ideology or Language Practice? An Analysis of Language Policy Documents at Swedish Universities  

ERIC Educational Resources Information Center

This article presents an analysis and interpretation of language policy documents from eight Swedish universities with regard to intertextuality, authorship and content analysis of the notions of language practices and English as a lingua franca (ELF). The analysis is then linked to Spolsky's framework of language policy, namely language…

Björkman, Beyza



Practice of Physical Activity among Future Doctors: A Cross Sectional Analysis  

PubMed Central

Background: Non-communicable diseases (NCDs) will account for 73% of deaths and 60% of the global disease burden by 2020. Physical activity plays a major role in the prevention of these non-communicable diseases. The stress involved in meeting the responsibilities of becoming a physician may adversely affect the exercise habits of students. So, the current study aimed to study the practice of physical activity among undergraduate medical students. Methods: A cross-sectional study was conducted among 240 undergraduate medical students. A quota sampling method was used to identify 60 students from each of the four even semesters. A pre-tested, semi-structured questionnaire was used to collect the data. Statistical Package for Social Sciences (SPSS) version 16 was used for data entry and analysis, and results are expressed as percentages and proportions. Results: In our study, 55% were 20 to 22 years old. Over half of the students were utilizing the sports facilities provided by the university on the campus. The majority of students, 165 (69%), had a normal body mass index (BMI); 51 (21%) were overweight, while 7 (3%) were obese. Sixty-two percent were currently exercising, and the practice of physical activity was more common among boys than girls (62% vs. 38%). Lack of time (46; 60.5%), laziness (61.8%), and exhaustion from academic activities (42%) were identified as important hindering factors among medical students who did not exercise. Conclusion: A longitudinal study to follow up student behavior throughout their academic life is needed to identify the factors promoting the practice of physical activity among students. PMID:22708033

Rao, Chythra R; Darshan, BB; Das, Nairita; Rajan, Vinaya; Bhogun, Meemansha; Gupta, Aditya




Microsoft Academic Search

Office analysis is a technique for supporting the first stage in modern systems analysis and design, the invention phase. The process involves first describing the activities that take place in a given office, focusing not on who is doing what with an object, but rather on the high level information processing activities that change or move the object's information content. After having described the activities, office

William C. Sasso; Judith Reitman Olson; Alan C. Merten




EPA Science Inventory

A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...


Thermal Analysis Methods for Aerobraking Heating  

NASA Technical Reports Server (NTRS)

As NASA begins exploration of other planets, a method of non-propulsively slowing vehicles at the planet, aerobraking, may become a valuable technique for managing vehicle design mass and propellant. An example of this is Mars Reconnaissance Orbiter (MRO), which will launch in late 2005 and reach Mars in March of 2006. In order to save propellant, MRO will use aerobraking to modify the initial orbit at Mars. The spacecraft will dip into the atmosphere briefly on each orbit, and during the drag pass, the atmospheric drag on the spacecraft will slow it, thus lowering the orbit apoapsis. The largest area on the spacecraft, and that most affected by the heat generated during the aerobraking process, is the solar arrays. A thermal analysis of the solar arrays was conducted at NASA Langley, to simulate their performance throughout the entire roughly 6-month period of aerobraking. Several interesting methods were used to make this analysis more rapid and robust. Two separate models were built for this analysis, one in Thermal Desktop for radiation and orbital heating analysis, and one in MSC.Patran for thermal analysis. The results from the radiation model were mapped in an automated fashion to the Patran thermal model that was used to analyze the thermal behavior during the drag pass. A high degree of automation in file manipulation as well as other methods for reducing run time were employed, since toward the end of the aerobraking period the orbit period is short, and in order to support flight operations the runs must be computed rapidly. All heating within the Patran Thermal model was combined in one section of logic, such that data mapped from the radiation model and aeroheating model, as well as skin temperature effects on the aeroheating and surface radiation, could be incorporated easily. This approach calculates the aeroheating at any given node, based on its position and temperature as well as the density and velocity at that trajectory point. 
Run times on several different processors, computer hard drives, and operating systems (Windows versus Linux) were evaluated.

Amundsen, Ruth M.; Gasbarre, Joseph F.; Dec, John A.



Analysis method for Fourier transform spectroscopy  

NASA Technical Reports Server (NTRS)

A fast Fourier transform technique is given for the simulation of those distortion effects in the instrument line shape of the interferometric spectrum that are due to errors in the measured interferogram. The technique is applied to analyses of atmospheric absorption spectra and laboratory spectra. It is shown that the nonlinear least squares method can retrieve the correct information from the distorted spectrum. Analyses of HF absorption spectra obtained in a laboratory and solar CO absorption spectra gathered by a balloon-borne interferometer indicate that the retrieved amount of absorbing gas is less than the correct value in most cases, if the interferogram distortion effects are not included in the analysis.

Park, J. H.



Method and apparatus for chromatographic quantitative analysis  


An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

Fritz, James S. (Ames, IA); Gjerde, Douglas T. (Ames, IA); Schmuckler, Gabriella (Haifa, IL)



Semiquantitative fluorescence method for bioconjugation analysis.  


Quantum dots (QDs) have been used as fluorescent probes in biological and medical fields such as bioimaging, bioanalytical, and immunofluorescence assays. For these applications, it is important to characterize the QD-protein bioconjugates. This chapter provides details on a versatile method to confirm quantum dot-protein conjugation including the required materials and instrumentation in order to perform the step-by-step semiquantitative analysis of the bioconjugation efficiency by using fluorescence plate readings. Although the protocols to confirm the QD-protein attachment shown here were developed for CdTe QDs coated with specific ligands and proteins, the principles are the same for other QDs-protein bioconjugates. PMID:25103803

Brasil, Aluízio G; Carvalho, Kilmara H G; Leite, Elisa S; Fontes, Adriana; Santos, Beate Saegesser



Numerical analysis method for linear induction machines.  

NASA Technical Reports Server (NTRS)

A numerical analysis method has been developed for linear induction machines such as liquid metal MHD pumps and generators and linear motors. Arbitrary phase currents or voltages can be specified and the moving conductor can have arbitrary velocity and conductivity variations from point to point. The moving conductor is divided into a mesh and coefficients are calculated for the voltage induced at each mesh point by unit current at every other mesh point. Combining the coefficients with the mesh resistances yields a set of simultaneous equations which are solved for the unknown currents.
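The final step the abstract describes, combining the induced-voltage coefficients with the mesh resistances into simultaneous equations for the unknown currents, reduces to a dense linear solve. A toy sketch; the coefficient values are invented for illustration:

```python
import numpy as np

# Hypothetical 3-node mesh. Off-diagonal entries play the role of the
# abstract's induced-voltage coefficients (voltage at node i per unit
# current at node j); the diagonal adds each mesh resistance. All values
# are invented for illustration.
Z = np.array([[2.0, 0.3, 0.1],
              [0.3, 2.0, 0.3],
              [0.1, 0.3, 2.0]])
v = np.array([1.0, 0.0, 0.0])  # applied voltages at the mesh nodes

currents = np.linalg.solve(Z, v)  # the unknown mesh currents
```

In the real machine model the mesh is much larger and the coefficients vary with the conductor's local velocity and conductivity, but the solve step has this same shape.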

Elliott, D. G.



Child survivorship estimation: methods and data analysis.  


"The past 20 years have seen extensive elaboration, refinement, and application of the original Brass method for estimating infant and child mortality from child survivorship data. This experience has confirmed the overall usefulness of the methods beyond question, but it has also shown that...estimates must be analyzed in relation to other relevant information before useful conclusions about the level and trend of mortality can be drawn.... This article aims to illustrate the importance of data analysis through a series of examples, including data for the Eastern Malaysian state of Sarawak, Mexico, Thailand, and Indonesia. Specific maneuvers include plotting completed parity distributions and 'time-plotting' mean numbers of children ever born from successive censuses. A substantive conclusion of general interest is that data for older women are not so widely defective as generally supposed." PMID:12343438

Feeney, G



Breastfeeding practices in a public health field practice area in Sri Lanka: a survival analysis  

Microsoft Academic Search

BACKGROUND: Exclusive breastfeeding up to the completion of the sixth month of age is the national infant feeding recommendation for Sri Lanka. The objective of the present study was to collect data on exclusive breastfeeding up to six months and to describe the association between exclusive breastfeeding and selected socio-demographic factors. METHODS: A clinic based cross-sectional study was conducted in

Suneth B Agampodi; Thilini C Agampodi; Udage Kankanamge D Piyaseeli



Methods of periodicity analysis - Relationship between the Rayleigh analysis and a maximum likelihood method  

NASA Technical Reports Server (NTRS)

For periodicity analysis of occurrence rates of discrete events, one can use the maximum likelihood method or the 'Rayleigh analysis'. In a maximum likelihood analysis using a sinusoidal distribution, one tries various values of amplitude A and phase angle Theta(0) of the distribution function. We show that these two methods are essentially equivalent to one another in spite of their different mathematical origins. Using the Rayleigh analysis, therefore, we can simply calculate A and Theta(0) which maximize the likelihood. Using the cumulative nature of the logarithmic likelihood, we can identify time intervals during which the periodicity is in operation. When a periodicity operates only in certain time intervals, it is important to identify these intervals. Mathematically, the above technique is applicable only to discrete events. However, with slight modifications we can apply this technique to general cases - periodicity analysis of measurements of continuously varying quantities.
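The equivalence described above can be made concrete: the Rayleigh vector sums yield, in closed form, the amplitude and phase that maximize the sinusoidal likelihood to first order. A sketch in our notation, following the abstract's A and Theta(0):

```python
import math

def rayleigh_fit(phases):
    """Rayleigh analysis of event phases (radians): returns the amplitude A
    and phase theta0 of the sinusoidal rate distribution
    f(theta) = (1 / (2 * pi)) * (1 + A * cos(theta - theta0)).
    To first order in A, these are also the maximum-likelihood estimates."""
    n = len(phases)
    c = sum(math.cos(p) for p in phases)
    s = sum(math.sin(p) for p in phases)
    return 2.0 * math.hypot(c, s) / n, math.atan2(s, c)

# uniform phases -> A near 0 (no periodicity); clustered -> A well above 0
uniform = [2 * math.pi * k / 100 for k in range(100)]
clustered = [0.04 * k for k in range(100)]  # phases bunched over ~4 rad
a_flat, _ = rayleigh_fit(uniform)
a_peak, theta0 = rayleigh_fit(clustered)
```

Because the log-likelihood accumulates event by event, evaluating these sums over sliding subsets of the data is what lets one identify the time intervals during which the periodicity operates.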

Bai, T.



Practical Analysis of a New Type Radiant Heating Technology in a Large Space Building  

E-print Network

ICEBO2006, Shenzhen, China. Heating technologies for energy efficiency, Vol. III-3-4. Practical Analysis of a New Type Radiant Heating Technology in a Large Space Building. Guohui Feng, Guangyu Cao, Li Gang, Ph.D.

Feng, G.; Cao, G.; Gang, L.



Influence of Analysis Methods on Interpretation of Hazard Maps  

PubMed Central

Exposure or hazard mapping is becoming increasingly popular among industrial hygienists. Direct-reading instruments used for data collection in hazard mapping are steadily increasing in reliability and portability while decreasing in cost. Exposure measurements made with these instruments generally require no laboratory analysis although hazard mapping can be a time-consuming process. To inform decision making by industrial hygienists and management, it is crucial that the maps generated from mapping data are as accurate and representative as possible. Currently, it is unclear how many sampling locations are necessary to produce a representative hazard map. As such, researchers typically collect as many points as can be sampled in several hours and interpolation methods are used to produce higher resolution maps. We have reanalyzed hazard-mapping data sets from three industrial settings to determine which interpolation methods yield the most accurate results. The goal is to provide practicing industrial hygienists with some practical guidelines to generate accurate hazard maps with ‘off-the-shelf’ mapping software. Visually verifying the fit of the variogram model is crucial for accurate interpolation. Exponential and spherical variogram models performed better than Gaussian models. It was also necessary to diverge from some of the default interpolation parameters such as the number of bins used for the experimental variogram and whether or not to allow for a nugget effect to achieve reasonable accuracy of the interpolation for some data sets. PMID:23258453
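The three variogram families compared above have standard closed forms. A sketch using the common "practical range" convention (the factor of 3); a given package's parameterization may differ:

```python
import math

def spherical(h, nugget, sill, rng):
    """Spherical model: reaches nugget + sill exactly at the range."""
    if h >= rng:
        return nugget + sill
    r = h / rng
    return nugget + sill * (1.5 * r - 0.5 * r ** 3)

def exponential(h, nugget, sill, rng):
    """Exponential model: approaches the sill asymptotically; the factor 3
    makes `rng` the practical range (~95% of the sill)."""
    return nugget + sill * (1 - math.exp(-3.0 * h / rng))

def gaussian(h, nugget, sill, rng):
    """Gaussian model: parabolic near the origin -- the extra smoothness
    that, per the abstract, fit these workplace data sets less well."""
    return nugget + sill * (1 - math.exp(-3.0 * (h / rng) ** 2))
```

Fitting one of these curves to the experimental variogram (and deciding whether the nugget term is allowed to be nonzero) is the step the abstract flags as needing visual verification.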

Koehler, Kirsten A.



Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2  

NASA Technical Reports Server (NTRS)

The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.

Johnson, Kenneth L.; White, K. Preston, Jr.



How nurse teachers keep up-to-date: their methods and practices  

Microsoft Academic Search

A total of 316 nurse teachers completed a self-administered questionnaire in an investigation of the methods and practices they use to keep up-to-date. The study was in two stages. In the first stage, 240 respondents were asked to rate 13 activities as indispensable, useful or of minimal value for updating. Responses were not found to depend significantly on qualifications or

Christine Love



78 FR 24691 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...  

Federal Register 2010, 2011, 2012, 2013

...Manufacturing Practice and Hazard Analysis and Risk- Based Preventive...Manufacturing Practice and Hazard Analysis and Risk-Based Preventive...Center for Food Safety and Applied Nutrition (HFS-300), Food...Manufacturing Practice and Hazard Analysis and Risk-Based...



A practical valence bond method: a configuration interaction method approach with perturbation theoretic facility.  


The previously developed valence bond configuration interaction (VBCI) method (Wu, W.; Song, L.; Cao, Z.; Zhang, Q.; Shaik, S., J. Phys. Chem. A, 2002, 105, 2721), which borrows the general CI philosophy of MO theory, is further extended in this article, and its methodological features are improved, resulting in three accurate and cost-effective procedures: (a) the effect of quadruple excitations is incorporated using the Davidson correction, such that the new procedure reduces size-consistency problems, with a corresponding improvement in the quality of the computational results. (b) A cost-effective procedure, named VBCI(D, S), is introduced. It includes doubly excited structures for active electrons and singly excited structures for inactive pairs. The computational results of VBCI(D, S) match those of VBCISD with much less computational effort. (c) Finally, second-order perturbation theory is utilized for configuration selection, leading to a considerable reduction of the computational cost with little or no loss in accuracy. Applications of the new procedures to bond energies and barriers of chemical reactions are presented and discussed. PMID:14735567
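The Davidson correction referred to in (a) has a simple closed form as usually stated for CISD-type energies; its transfer to VBCISD is the abstract's analogy, and the numbers below are illustrative only:

```python
def davidson_correction(e_cisd, e_ref, c0):
    """Standard Davidson size-consistency correction for quadruple
    excitations: dE_Q = (1 - c0**2) * (E_CISD - E_ref), where c0 is the
    coefficient of the reference configuration in the normalized CI
    wavefunction."""
    return (1.0 - c0 ** 2) * (e_cisd - e_ref)

# illustrative energies in hartree (invented, not from the paper)
delta_q = davidson_correction(-76.32, -76.02, 0.97)
```

The correction vanishes as c0 approaches 1 (a single dominant reference) and grows with the correlation energy recovered at the CISD level, which is why it mitigates the size-consistency error of truncated CI.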

Song, Lingchun; Wu, Wei; Zhang, Qianer; Shaik, Sason



Cleanup standards and pathways analysis methods  

SciTech Connect

Remediation of a radioactively contaminated site requires that certain regulatory criteria be met before the site can be released for unrestricted future use. Since the ultimate objective of remediation is to protect the public health and safety, residual radioactivity levels remaining at a site after cleanup must be below certain preset limits or meet acceptable dose or risk criteria. Release of a decontaminated site requires proof that the radiological data obtained from the site meet the regulatory criteria for such a release. Typically, release criteria consist of a composite of acceptance limits that depend on the radionuclides, the media in which they are present, and federal and local regulations. In recent years, the US Department of Energy (DOE) has developed a pathways analysis model to determine site-specific soil activity concentration guidelines for radionuclides that do not have established generic acceptance limits. The DOE pathways analysis computer code (developed by Argonne National Laboratory for the DOE) is called RESRAD (Gilbert et al. 1989). Similar efforts have been initiated by the US Nuclear Regulatory Commission (NRC) to develop and use dose-related criteria based on generic pathways analyses rather than simplistic numerical limits on residual radioactivity. The focus of this paper is radionuclide contaminated soil. Cleanup standards are reviewed, pathways analysis methods are described, and an example is presented in which RESRAD was used to derive cleanup guidelines.

Devgun, J.S.



The evolution of nursing in Australian general practice: a comparative analysis of workforce surveys ten years on  

PubMed Central

Background Nursing in Australian general practice has grown rapidly over the last decade in response to government initiatives to strengthen primary care. There are limited data about how this expansion has impacted on the nursing role, scope of practice and workforce characteristics. This study aimed to describe the current demographic and employment characteristics of Australian nurses working in general practice and explore trends in their role over time. Methods In the nascence of the expansion of the role of nurses in Australian general practice (2003–2004) a national survey was undertaken to describe nurse demographics, clinical roles and competencies. This survey was repeated in 2009–2010 and comparative analysis of the datasets undertaken to explore workforce changes over time. Results Two hundred eighty four nurses employed in general practice completed the first survey (2003/04) and 235 completed the second survey (2009/10). Significantly more participants in Study 2 were undertaking follow-up of pathology results, physical assessment and disease specific health education. There was also a statistically significant increase in the participants who felt that further education/training would augment their confidence in all clinical tasks (p?practice decreased between the two time points, more participants perceived lack of space, job descriptions, confidence to negotiate with general practitioners and personal desire to enhance their role as barriers. Access to education and training as a facilitator to nursing role expansion increased between the two studies. The level of optimism of participants for the future of the nurses’ role in general practice was slightly decreased over time. Conclusions This study has identified that some of the structural barriers to nursing in Australian general practice have been addressed over time. However, it also identifies continuing barriers that impact practice nurse role development. 
Understanding and addressing these issues is vital to optimise the effectiveness of the primary care nursing workforce. PMID:24666420



Chapter 11. Community analysis-based methods  

SciTech Connect

Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.



A concise method for mine soils analysis  

SciTech Connect

A large number of abandoned hard rock mines exist in Colorado and other mountain west states, many on public property. Public pressure and resulting policy changes have become a driving force in the reclamation of these sites. Two of the key reclamation issues for these sites are the occurrence of acid-forming materials (AFMs) in mine soils and acid mine drainage (AMD) issuing from mine adits. An AMD treatment system design project for the Forest Queen mine in Colorado's San Juan mountains raised the need for a simple, usable method for analysis of mine land soils, both for suitability as a construction material and to determine the AFM content and potential for acid release. The authors have developed a simple, stepwise go/no-go test for the analysis of mine soils. Samples were collected from a variety of sites in the Silverton, CO area, and subjected to three tiers of tests including: paste pH, Eh, and a 10% HCl fizz test; then total digestion in HNO3/HCl, neutralization potential, exposure to meteoric water, and the toxicity characteristic leaching procedure (TCLP). All elemental analyses were performed with an inductively coupled plasma (ICP) spectrometer. Elimination of samples via the first two testing tiers left two remaining samples, which were subsequently subjected to column and sequential batch tests, with further elemental analysis by ICP. Based on these tests, one sample was chosen as suitable for use as a construction material for the Forest Queen treatment system basins. Further simplification, and testing on two pairs of independent soil samples, has resulted in a final analytical method suitable for general use.
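The tiered go/no-go sequence might be sketched as follows. The pH cutoff and decision rules here are placeholders, since the abstract does not give the paper's actual criteria:

```python
def tier1_screen(paste_ph, fizz_positive):
    """First-tier go/no-go screen patterned on the abstract's sequence
    (paste pH, Eh, 10% HCl fizz test). The pH cutoff and the decision
    rules are placeholders -- the paper's actual criteria are not given
    in the abstract."""
    if paste_ph < 5.0:      # acidic paste suggests acid-forming material
        return "no-go"
    if fizz_positive:       # carbonate fizz implies neutralization capacity
        return "go"
    return "tier 2"         # ambiguous: continue to digestion/NP testing

verdicts = [tier1_screen(3.8, False),
            tier1_screen(7.2, True),
            tier1_screen(7.2, False)]
```

The point of such a staged design is exactly what the abstract reports: cheap first-tier measurements eliminate most candidate soils before the expensive column and sequential batch tests are run.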

Winkler, S.; Wildeman, T.; Robinson, R.; Herron, J.



Analysis of methods. [information systems evolution environment  

NASA Technical Reports Server (NTRS)

Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. 
Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity Relationship, Data Flow Diagrams, and Structure Charts, for inclusion in an integrated development support environment.

Mayer, Richard J. (editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.



Practical method for PCB degradation using Pd/C-H2-Mg system.  


Polychlorinated biphenyls (PCBs) were mainly used as lubricants and coolants in electrical equipment. However, their chemical stability and hydrophobicity have caused persistent environmental pollution and, through bioaccumulation, damage to human health. PCBs are currently targeted for worldwide elimination and should be disposed of by 2028 under the Stockholm Convention on Persistent Organic Pollutants. The conventional PCB degradation methods require high-heat, high-pressure or/and strongly basic conditions. The development of a safer and more practical method, therefore, is desired. We have reported a catalytic degradation method for PCBs based on a palladium on carbon (Pd/C)-catalyzed dechlorination in the presence of Et(3)N under ambient hydrogen pressure and temperature. In this study, we demonstrate a more practical system using magnesium metal instead of Et(3)N for the dechlorination of a variety of aromatic chlorides. The method was applicable to the complete degradation of a variety of PCB mixtures, such as Aroclor 1242, 1248, and 1254 and PCBs removed from a capacitor, producing only biphenyl and magnesium chloride as the final products, both of which are less toxic and easily separable. Moreover, the Pd/C could be recovered and reused at least five times without any loss of catalytic activity. The present Pd/C-Mg-H(2) system is a simple, safe, inexpensive, and environmentally benign degradation method for PCBs. PMID:22939897

Ido, Akiko; Ishihara, Shinji; Kume, Akira; Nakanishi, Tsuyoshi; Monguchi, Yasunari; Sajiki, Hironao; Nagase, Hisamitsu



Analysis Method for Quantifying Vehicle Design Goals  

NASA Technical Reports Server (NTRS)

A document discusses a method for using Design Structure Matrices (DSM), coupled with high-level tools representing important life-cycle parameters, to comprehensively conceptualize a flight/ground space transportation system design by dealing with such variables as performance, up-front costs, downstream operations costs, and reliability. This approach also weighs operational approaches based on their effect on upstream design variables so that it is possible to readily, yet defensively, establish linkages between operations and these upstream variables. To avoid the large range of problems that have defeated previous methods of dealing with the complex problems of transportation design, and to cut down the inefficient use of resources, the method described in the document identifies those areas that are of sufficient promise and that provide a higher grade of analysis for those issues, as well as the linkages at issue between operations and other factors. Ultimately, the system is designed to save resources and time, and allows for the evolution of operable space transportation system technology, and design and conceptual system approach targets.

Fimognari, Peter; Eskridge, Richard; Martin, Adam; Lee, Michael



Using Mixed-Methods Sequential Explanatory Design: From Theory to Practice  

Microsoft Academic Search

This article discusses some procedural issues related to the mixed-methods sequential explanatory design, which implies collecting and analyzing quantitative and then qualitative data in two consecutive phases within one study. Such issues include deciding on the priority or weight given to the quantitative and qualitative data collection and analysis in the study, the sequence of the data collection and analysis,

Nataliya V. Ivankova; John W. Creswell; Sheldon L. Stick



Analysis of post audits for Gulf of Mexico completions leads to continuous improvement in completion practices  

SciTech Connect

Final production rate alone is not an adequate measure of the success of a well completion. Rather, we must estimate the "potential" of a reservoir and judge the ultimate success of a completion by how close we come to achieving this potential. Specific productivity indexes (SPIs - BFPD/(PSI*FT)), specific injectivity indexes (SIIs - BFPD/(PSI*FT)), and completion efficiencies (CEs - percent of Darcy radial flow) can be calculated at various times throughout a well completion. Analysis of these data quantifies the efficiency of the completion after each individual completion operation, allowing a determination of the effects of each completion practice to be made. In addition to completion efficiency data, a comparison of gravel placement volumes behind casing helps quantify optimum gravel packing procedures. Twenty-two Gulf of Mexico completions have been analyzed using this technique. This paper details the results of this analysis, in particular the productivity effects of various methods of underbalanced perforating, gravel packing, and well control. Items of discussion include: the effects of underbalanced perforating on well performance, the effects of flowback after perforating on perforation tunnel cleaning, productivity impacts of various types of well control methods following perforating and gravel packing, and comparisons of gravel pack design parameters and gravel placement behind casing.

Pashen, M.A.; McLeod, H.O. Jr.



The Analysis of Athletic Performance: Some Practical and Philosophical Considerations  

ERIC Educational Resources Information Center

This article presents a hypothetical dialogue between a notational analyst (NA) recently schooled in the positivistic assessment of athletic performance, an "old-school" traditional coach (TC) who favours subjective analysis, and a pragmatic educator (PE). The conversation opens with NA and TC debating the respective value of quantitative and…

Nelson, Lee J.; Groom, Ryan



Modular, Higher-Order Cardinality Analysis in Theory and Practice  

E-print Network

A compiler like GHC can now use short-cut deforestation to fuse the two maps into one, eliminating … that it can be shared. That would be good for wurble1 but bad for wurble2. What is needed is an analysis

Jones, Simon Peyton


Practical Analysis of Almost Circular Tensile Cracks under Uniform Loading  

Microsoft Academic Search

We study two different approaches to the evaluation of the stress intensity factors (SIF) for circular cracks one of which is based on the Rice integral formula and the second (Panasyuk) approach deals with the analysis of the stressed state outside an imaginary circular crack. It is shown that the Panasyuk approach is more accurate from the mathematical point of

I. V. Orynyak; D. O. Harris; A. V. Kamenchuk



An Analysis of Ethical Considerations in Programme Design Practice  

ERIC Educational Resources Information Center

Ethical considerations are inherent to programme design decision-making, but not normally explicit. Nonetheless, they influence whose interests are served in a programme and who benefits from it. This paper presents an analysis of ethical considerations made by programme design practitioners in the context of a polytechnic in Aotearoa/New Zealand.…

Govers, Elly



Regional Economic Development: An Analysis of Practices, Resources and Outcomes  

E-print Network

The Thomas Jefferson Program in Public Policy … a Strengths/Weaknesses/Opportunities/Threats (SWOT) analysis of economic development organizations, which have long existed in the United States. Just after the Revolution, these simplistic government and non-government

Lewis, Robert Michael


Newborn Hearing Screening: An Analysis of Current Practices  

ERIC Educational Resources Information Center

State coordinators of early hearing detection and intervention (EHDI) programs completed a strengths, weaknesses, opportunities, and threats, or SWOT, analysis that consisted of 12 evaluative areas of EHDI programs. For the newborn hearing screening area, a total of 293 items were listed by 49 EHDI coordinators, and themes were identified within…

Houston, K. Todd; Bradham, Tamala S.; Munoz, Karen F.; Guignard, Gayla Hutsell



Knowledge, attitude and practice related to infant feeding among women in rural Papua New Guinea: a descriptive, mixed method study  

PubMed Central

Background Despite the well-recognized effectiveness of exclusive breastfeeding for the first six months of an infant's life for reducing infant mortality, adherence to this practice is not widespread in the developing world. Although several studies on infant nutrition practices have been conducted in urban settings of Papua New Guinea (PNG), there is only scant information on infant feeding practices in rural settings. Therefore, this study aimed to investigate knowledge, attitude and practice associated with exclusive breastfeeding in various locations in rural PNG. Methods A mixed method study using interviews based on a semi-structured questionnaire (n = 140) and Focus Group Discussions (FGDs) was conducted among mothers in rural PNG between August and September 2012. Participants were selected using convenience sampling. Included in the study were both primiparous and multiparous mothers with a child below the age of two years. Content analysis was used for qualitative data and descriptive statistics were used for quantitative data. Results Whereas most women indicated breastfeeding as a better way to feed babies, knowledge of the reasons for its superiority over infant formula was generally poor. Only 17% of mothers practiced exclusive breastfeeding for the first six months postpartum. Our study showed that the size of the gap between exclusive breastfeeding practice and global recommendations was striking. Taking into account the low educational profile of the participants, the disparity may be explained by the fact that most of the mothers in this study had no formal education on infant feeding. Conclusions This study showed a lack of understanding of the importance of and poor adherence to exclusive breastfeeding for the first six months postpartum among rural mothers.
As exclusive breastfeeding promotion has been proved to be one of the most effective ways to improve infant survival, more attention should be given to it, especially targeting the large proportion of women who missed formal education on infant feeding in school. A proper community-based program, including tools for monitoring its implementation and effectiveness, needs to be developed to transform policy recommendations into action in rural PNG. PMID:24257483



Deriving a practical analytical-probabilistic method to size flood routing reservoirs  

NASA Astrophysics Data System (ADS)

In engineering practice, routing reservoir sizing is commonly performed using the design storm method, although its effectiveness has long been debated. Conversely, continuous simulations and direct statistical analyses of recorded hydrographs are considered more reliable and comprehensive, but are complex or seldom practicable. In this paper a handier tool is provided by the analytical-probabilistic approach to construct probability functions of peak discharges issuing from natural watersheds or routed through on-line and off-line reservoirs. A simplified routing scheme and a rainfall-runoff model based on a few essential hydrological parameters were implemented. To validate the proposed design methodology, on-line and off-line routing reservoirs were first sized by means of a conventional design storm method for a test watershed located in northern Italy. Their routing efficiencies were then estimated by both analytical-probabilistic models and benchmarking continuous simulations. Bearing in mind practical design purposes, the adopted models showed satisfactory consistency.
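The simplified routing scheme mentioned in the abstract can be illustrated with a level-pool continuity balance, dS/dt = I(t) − O(S). The sketch below assumes a linear outlet law O = S/k and invented numbers; it is not the authors' model, only a demonstration of why a routing reservoir attenuates and delays the peak discharge.

```python
def route(inflow, dt, k, s0=0.0):
    """Level-pool routing with a linear outlet O = S/k (continuity dS/dt = I - O),
    integrated with a simple explicit Euler step."""
    s, out = s0, []
    for i in inflow:
        s += dt * (i - s / k)   # update storage from the continuity equation
        out.append(s / k)       # outflow follows the assumed linear outlet law
    return out

# Synthetic triangular inflow hydrograph; all numbers are illustrative.
dt, k = 60.0, 1800.0                                   # time step [s], reservoir constant [s]
inflow = [float(min(t, 60 - t)) for t in range(60)]    # peak 30 m^3/s at step 30
outflow = route(inflow, dt, k)
print(max(inflow), round(max(outflow), 2))             # outflow peak is attenuated
```

The outflow peak is both smaller than the inflow peak and shifted later in time, which is the behavior the reservoir sizing methods above are designed to quantify probabilistically.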

Balistrocchi, Matteo; Grossi, Giovanna; Bacchi, Baldassare



Are parents' knowledge and practice regarding immunization related to pediatrics' immunization compliance? a mixed method study  

PubMed Central

Background Immunization rate is one of the best public health outcome and service indicators of the last 100 years. Parental decisions regarding immunization are very important to improve immunization rates. The aim of this study was to evaluate the correlation between parental knowledge-practices (KP) and children's immunization completeness. Methods A mixed method was utilized in this study: a retrospective cohort study was used to evaluate immunization completeness, and a prospective cross-sectional study was used to evaluate the immunization KP of parents. 528 children born between 1 January 2003 and 31 June 2008 were randomly selected from five public health clinics in Mosul, Iraq. The immunization history of each child was collected retrospectively from their immunization record/card. Results About half of the studied children (n = 286, 56.3%) were immunized with all vaccination doses; these children were considered as having had complete immunization. 66.1% of the parents were found to have adequate KP scores. A significant association was found between immunization completeness and total KP scores and practice. The study results reinforce recommendations for the periodic assessment of immunization rate and the use of educational programmes to improve the immunization rate, knowledge and practice. PMID:24460878



Computational Neuroimaging: Analysis methods W12-PSYCH-204B-01  

E-print Network

Basic MR physics and BOLD signals; experimental design and analysis methods. Advanced methods: high-resolution fMRI, multivoxel pattern analyses. The course focuses on understanding analysis methods for neuroimaging data using real and simulated data sets. Topics

Grill-Spector, Kalanit


Limitations in simulator time-based human reliability analysis methods  

SciTech Connect

Developments in human reliability analysis (HRA) methods have evolved slowly. Current methods are little changed from those of almost a decade ago, particularly in the use of time-reliability relationships. While these methods were suitable as an interim step, the time (and the need) has come to specify the next evolution of HRA methods. As with any performance-oriented data source, power plant simulator data have no direct connection to HRA models. Errors reported in data are normal deficiencies observed in human performance; failures are events modeled in probabilistic risk assessments (PRAs). Not all errors cause failures; not all failures are caused by errors. Second, the times at which actions are taken provide no measure of the likelihood of failures to act correctly within an accident scenario. Inferences can be made about human reliability, but they must be made with great care. Specific limitations are discussed. Simulator performance data are useful in providing qualitative evidence of the variety of error types and their potential influences on operating systems. More work is required to combine recent developments in the psychology of error with the qualitative data collected at simulators. Until data become openly available, however, such an advance will not be practical.

Wreathall, J.



Practical guidance for statistical analysis of operational event data  

SciTech Connect

This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.
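One of the report's topics, moment formulas for combinations of failure modes, can be illustrated for the simplest case: independent failure modes combined in series, where the system fails if any single mode occurs. The probabilities below are invented for illustration and are not taken from the report.

```python
def series_failure(probs):
    """Probability that a system with independent failure modes fails
    (series logic: the system fails if any single mode occurs)."""
    p_ok = 1.0
    for p in probs:
        p_ok *= (1.0 - p)       # survive every mode independently
    return 1.0 - p_ok

modes = [0.01, 0.02, 0.005]     # illustrative per-demand failure probabilities
print(round(series_failure(modes), 6))  # 0.034651
```

Note that the combined probability is slightly below the naive sum 0.035, and the gap grows as the individual probabilities grow; that difference is exactly what the combination formulas account for.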

Atwood, C.L.



Gap analysis: Concepts, methods, and recent results  

USGS Publications Warehouse

Rapid progress is being made in the conceptual, technical, and organizational requirements for generating synoptic multi-scale views of the earth's surface and its biological content. Using the spatially comprehensive data that are now available, researchers, land managers, and land-use planners can, for the first time, quantitatively place landscape units - from general categories such as 'Forests' or 'Cold-Deciduous Shrubland Formation' to more specific categories such as 'Picea glauca-Abies balsamea-Populus spp. Forest Alliance' - in their large-area contexts. The National Gap Analysis Program (GAP) has developed the technical and organizational capabilities necessary for the regular production and analysis of such information. This paper provides a brief overview of concepts and methods as well as some recent results from the GAP projects. Clearly, new frameworks for biogeographic information and organizational cooperation are needed if we are to have any hope of documenting the full range of species occurrences and ecological processes in ways meaningful to their management. The GAP experience provides one model for achieving these new frameworks.

Jennings, M.D.



Measuring Racial/Ethnic Disparities in Health Care: Methods and Practical Issues  

PubMed Central

Objective To review methods of measuring racial/ethnic health care disparities. Study Design Identification and tracking of racial/ethnic disparities in health care will be advanced by application of a consistent definition and reliable empirical methods. We have proposed a definition of racial/ethnic health care disparities based in the Institute of Medicine's (IOM) Unequal Treatment report, which defines disparities as all differences except those due to clinical need and preferences. After briefly summarizing the strengths and critiques of this definition, we review methods that have been used to implement it. We discuss practical issues that arise during implementation and expand these methods to identify sources of disparities. We also situate the focus on methods to measure racial/ethnic health care disparities (an endeavor predominant in the United States) within a larger international literature in health outcomes and health care inequality. Empirical Application We compare different methods of implementing the IOM definition on measurement of disparities in any use of mental health care and mental health care expenditures using the 2004–2008 Medical Expenditure Panel Survey. Conclusion Disparities analysts should be aware of multiple methods available to measure disparities and their differing assumptions. We prefer a method concordant with the IOM definition. PMID:22353147

Cook, Benjamin Le; McGuire, Thomas G; Zaslavsky, Alan M



Image, measure, figure: a critical discourse analysis of nursing practices that develop children.  


Motivated by discourses that link early child development and health, nurses engage in seemingly benign surveillance of children. These practices are based on knowledge claims and technologies of developmental science, which remain anchored in assumptions of the child body as an incomplete form with a universal developmental trajectory and inherent potentiality. This paper engages in a critical discursive analysis, drawing on Donna Haraway's conceptualizations of technoscience and figuration. Using a contemporary developmental screening tool from nursing practice, this analysis traces the effects of this tool through production, transformation, distribution, and consumption. It reveals how the techniques of imaging, abstraction, and measurement collide to fix the open, transformative child body in a figuration of the developing child. This analysis also demonstrates how technobiopower infuses nurses' understandings of children and structures developmentally appropriate expectations for children, parents, and nurses. Furthermore, it describes how practices that claim to facilitate healthy child development may instead deprive children of agency and foster the production of normal or ideal children. An alternative ontological perspective is offered as a challenge to the individualism of developmental models and other dominant ideologies of development, as well as practices associated with these ideologies. In summary, this analysis argues that nurses must pay closer attention to how technobiopower infuses practices that monitor and promote child development. Fostering a critical understanding of the harmful implications of these practices is warranted and offers the space to conceive of human development in alternate and exciting ways. PMID:23745662

Einboden, Rochelle; Rudge, Trudy; Varcoe, Colleen



Method and apparatus for frequency spectrum analysis  

NASA Technical Reports Server (NTRS)

A method for frequency spectrum analysis of an unknown signal in real-time is discussed. The method is based upon integration of 1-bit samples of signal voltage amplitude corresponding to sine or cosine phases of a controlled center frequency clock which is changed after each integration interval to sweep the frequency range of interest in steps. Integration of samples during each interval is carried out over a number of cycles of the center frequency clock spanning a number of cycles of an input signal to be analyzed. The invention may be used to detect the frequency of at least two signals simultaneously. By using a reference signal of known frequency and voltage amplitude (added to the two signals for parallel processing in the same way, but in a different channel with a sampling at the known frequency and phases of the reference signal), the absolute voltage amplitude of the other two signals may be determined by squaring the sine and cosine integrals of each channel and summing the squares to obtain relative power measurements in all three channels and, from the known voltage amplitude of the reference signal, obtaining an absolute voltage measurement for the other two signals by multiplying the known voltage of the reference signal with the ratio of the relative power of each of the other two signals to the relative power of the reference signal.
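The 1-bit integration scheme described above can be sketched as follows: quantize the input to its sign, correlate it with the signs of sine and cosine references at each candidate center frequency, and take I² + Q² as the relative power. This is a simplified illustration of the idea, not the patented implementation; the sampling rate and frequencies are invented.

```python
import math

def one_bit_power(samples, dt, freq):
    """Correlate 1-bit (sign) samples with the signs of sine and cosine
    references at a candidate center frequency; return relative power I^2 + Q^2."""
    i_sum = q_sum = 0
    for n, v in enumerate(samples):
        bit = 1 if v >= 0 else -1
        i_sum += bit * (1 if math.sin(2 * math.pi * freq * n * dt) >= 0 else -1)
        q_sum += bit * (1 if math.cos(2 * math.pi * freq * n * dt) >= 0 else -1)
    return i_sum ** 2 + q_sum ** 2

dt = 1e-3                                     # 1 kHz sampling, illustrative
sig = [math.sin(2 * math.pi * 40 * n * dt) for n in range(2000)]  # "unknown" 40 Hz tone
powers = {f: one_bit_power(sig, dt, f) for f in range(10, 100, 10)}
print(max(powers, key=powers.get))            # the sweep peaks at the true frequency
```

Stepping the candidate frequency, as the invention does with its controlled center frequency clock, turns this correlation into a coarse power spectrum built entirely from 1-bit arithmetic.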

Cole, Steven W. (inventor)



Alignment of patient and primary care practice member perspectives of chronic illness care: a cross-sectional analysis  

PubMed Central

Background Little is known as to whether primary care teams’ perceptions of how well they have implemented the Chronic Care Model (CCM) correspond with their patients’ own experience of chronic illness care. We examined the extent to which practice members’ perceptions of how well they organized to deliver care consistent with the CCM were associated with their patients’ perceptions of the chronic illness care they have received. Methods Analysis of baseline measures from a cluster randomized controlled trial testing a practice facilitation intervention to implement the CCM in small, community-based primary care practices. All practice “members” (i.e., physician providers, non-physician providers, and staff) completed the Assessment of Chronic Illness Care (ACIC) survey and adult patients with 1 or more chronic illnesses completed the Patient Assessment of Chronic Illness Care (PACIC) questionnaire. Results Two sets of hierarchical linear regression models accounting for nesting of practice members (N = 283) and patients (N = 1,769) within 39 practices assessed the association between practice member perspectives of CCM implementation (ACIC scores) and patients’ perspectives of CCM (PACIC). The ACIC summary score was not significantly associated with the PACIC summary score or most of the PACIC subscale scores, but four of the ACIC subscales, including Self-management Support, were. Conclusions These findings suggest the value of considering both patient and practice member perspectives when evaluating quality of chronic illness care. Trial registration NCT00482768 PMID:24678983



A Method and Its Practice for Teaching the Fundamental Technology of Communication Protocols and Coding  

NASA Astrophysics Data System (ADS)

Education in information and communication technologies is important for engineering, and covers terminals, communication media, transmission, switching, software, communication protocols, coding, etc. The proposed teaching method for protocols is based on the HDLC (High-level Data Link Control) procedures, using our newly developed software “HDLC trainer”, and includes extensions for understanding other protocols such as TCP/IP. For teaching the coding theory applied to error control in protocols, we use both a mathematical programming language and a general-purpose programming language. We have practiced and evaluated the proposed teaching method in our college, and the results show that the method is remarkably effective for understanding the fundamental technology of protocols and coding.
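Error control in HDLC-like protocols rests on a CRC frame check sequence. As a classroom-style illustration of the kind of coding exercise described above, here is a bitwise CRC-16 in the common CCITT-FALSE configuration (polynomial 0x1021, initial value 0xFFFF); note that the on-the-wire HDLC FCS uses a reflected variant of the same polynomial, so this sketch shows the principle rather than the exact HDLC computation.

```python
def crc16_ccitt(data: bytes, poly=0x1021, init=0xFFFF):
    """Bitwise CRC-16 (CCITT-FALSE): MSB-first, no reflection, no final XOR."""
    crc = init
    for byte in data:
        crc ^= byte << 8          # bring the next byte into the top of the register
        for _ in range(8):
            if crc & 0x8000:      # top bit set: shift and subtract the polynomial
                crc = ((crc << 1) ^ poly) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

print(hex(crc16_ccitt(b"123456789")))  # standard check value: 0x29b1
```

Flipping any bit of the frame changes the CRC, which is exactly the property a receiver exploits to detect transmission errors.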

Kobayashi, Tetsuji


Degradation of learned skills. Effectiveness of practice methods on simulated space flight skill retention  

NASA Technical Reports Server (NTRS)

Manual flight control and emergency procedure task skill degradation was evaluated after time intervals of from 1 to 6 months. The tasks were associated with a simulated launch through the orbit insertion flight phase of a space vehicle. The results showed that acceptable flight control performance was retained for 2 months, rapidly deteriorating thereafter by a factor of 1.7 to 3.1 depending on the performance measure used. Procedural task performance showed unacceptable degradation after only 1 month, and exceeded an order of magnitude after 4 months. The effectiveness of static rehearsal (checklists and briefings) and dynamic warmup (simulator practice) retraining methods were compared for the two tasks. Static rehearsal effectively countered procedural skill degradation, while some combination of dynamic warmup appeared necessary for flight control skill retention. It was apparent that these differences between methods were not solely a function of task type or retraining method, but were a function of the performance measures used for each task.

Sitterley, T. E.; Berge, W. A.



Homotopy analysis method for quadratic Riccati differential equation  

Microsoft Academic Search

In this paper, the quadratic Riccati differential equation is solved by means of an analytic technique, namely the homotopy analysis method (HAM). Comparisons are made among Adomian’s decomposition method (ADM), the homotopy perturbation method (HPM), the homotopy analysis method, and the exact solution. The results reveal that the proposed method is very effective and simple.
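A quadratic Riccati test problem commonly used in such comparisons, y' = 2y − y² + 1 with y(0) = 0, has a closed-form solution, so any numerical or series approximation can be checked against it. The sketch below uses a plain Runge-Kutta integrator as the stand-in approximation; the specific test problem is an assumption, not necessarily the one solved in this paper.

```python
import math

def f(x, y):
    """Quadratic Riccati test problem: y' = 2y - y^2 + 1, y(0) = 0."""
    return 2 * y - y * y + 1

def rk4(fun, x0, y0, h, steps):
    """Classical 4th-order Runge-Kutta integrator."""
    x, y = x0, y0
    for _ in range(steps):
        k1 = fun(x, y)
        k2 = fun(x + h / 2, y + h * k1 / 2)
        k3 = fun(x + h / 2, y + h * k2 / 2)
        k4 = fun(x + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        x += h
    return y

def exact(x):
    """Closed form: y = 1 + sqrt(2)*tanh(sqrt(2)*x + c), with c chosen so y(0) = 0."""
    c = 0.5 * math.log((math.sqrt(2) - 1) / (math.sqrt(2) + 1))
    return 1 + math.sqrt(2) * math.tanh(math.sqrt(2) * x + c)

y_num = rk4(f, 0.0, 0.0, 0.01, 100)   # integrate to x = 1
print(abs(y_num - exact(1.0)))        # agreement to many digits
```

The same exact solution is what the HAM, HPM, and ADM series in the paper are judged against.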

Yue Tan; Saeid Abbasbandy



Differential method of analysis of luminescence spectra of semiconductors  

SciTech Connect

A method for analyzing the luminescence spectra of semiconductors is suggested. The method is based on differentiation of the spectra. The potentialities of the method are demonstrated for luminescence in the region of the fundamental absorption edge of Si and SiGe alloy single crystals. The method is superior in accuracy to previously known luminescence methods of determining the band gap of indirect-gap semiconductors and is practically insensitive to different conditions of outputting radiation from the sample.

Emel'yanov, A. M. (Russian Academy of Sciences, Ioffe Physical Technical Institute, Russian Federation)



Primary prevention in general practice - views of German general practitioners: a mixed-methods study  

PubMed Central

Background Policy efforts focus on a reorientation of health care systems towards primary prevention. To guide such efforts, we analyzed the role of primary prevention in general practice and general practitioners’ (GPs) attitudes toward primary prevention. Methods Mixed-method study including a cross-sectional survey of all community-based GPs and focus groups in a sample of GPs who collaborated with the Institute of General Practice in Berlin, Germany in 2011. Of 1168 GPs, 474 returned the mail survey. Fifteen GPs participated in focus group discussions. Survey and interview guidelines were developed and tested to assess and discuss beliefs, attitudes, and practices regarding primary prevention. Results Most respondents considered primary prevention within their realm of responsibility (70%). Primary prevention, especially physical activity, healthy eating, and smoking cessation, was part of the GPs’ health care recommendations if they thought it was indicated. Still, a quarter of survey respondents discussed reduction of alcohol consumption with their patients infrequently even when they thought it was indicated. Similarly, 18% claimed that they discuss smoking cessation only sometimes. The focus groups revealed that GPs were concerned about the detrimental effects an uninvited health behavior suggestion could have on patients and were hesitant to take on the role of “health policing”. GPs saw primary prevention as the responsibility of multiple actors in a network of societal and municipal institutions. Conclusions The mixed-method study showed that primary prevention approaches such as lifestyle counseling are not well established in primary care. GPs used a selective approach to offer preventive advice based upon indication. GPs had a strong sense that a universal prevention approach carried the potential to destroy a good patient-physician relationship. Other approaches to public health may be warranted, such as a multisectoral approach to population health.
This type of restructuring of the health care sector may benefit patients who are unable to afford specific prevention programmes and who have competing demands that hinder their ability to focus on behavior change. PMID:24885100



A situated practice of ethics for participatory visual and digital methods in public health research and practice: a focus on digital storytelling.  


This article explores ethical considerations related to participatory visual and digital methods for public health research and practice, through the lens of an approach known as "digital storytelling." We begin by briefly describing the digital storytelling process and its applications to public health research and practice. Next, we explore 6 common challenges: fuzzy boundaries, recruitment and consent to participate, power of shaping, representation and harm, confidentiality, and release of materials. We discuss their complexities and offer some considerations for ethical practice. We hope this article serves as a catalyst for expanded dialogue about the need for high standards of integrity and a situated practice of ethics wherein researchers and practitioners reflexively consider ethical decision-making as part of the ongoing work of public health. PMID:23948015

Gubrium, Aline C; Hill, Amy L; Flicker, Sarah



The influence of deliberate practice on musical achievement: a meta-analysis.  


Deliberate practice (DP) is a task-specific structured training activity that plays a key role in understanding skill acquisition and explaining individual differences in expert performance. Relevant activities that qualify as DP have to be identified in every domain. For example, for training in classical music, solitary practice is a typical training activity during skill acquisition. To date, no meta-analysis on the quantifiable effect size of deliberate practice on attained performance in music has been conducted. Yet the identification of a quantifiable effect size could be relevant for the current discussion on the role of various factors on individual difference in musical achievement. Furthermore, a research synthesis might enable new computational approaches to musical development. Here we present the first meta-analysis on the role of deliberate practice in the domain of musical performance. A final sample size of 13 studies (total N = 788) was carefully extracted to satisfy the following criteria: reported durations of task-specific accumulated practice as predictor variables and objectively assessed musical achievement as the target variable. We identified an aggregated effect size of r_c = 0.61 (95% CI [0.54, 0.67]) for the relationship between task-relevant practice (which by definition includes DP) and musical achievement. Our results corroborate the central role of long-term (deliberate) practice for explaining expert performance in music. PMID:25018742
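Aggregating correlation coefficients across studies, as in this meta-analysis, is commonly done via Fisher's z transformation with weights n − 3. A minimal fixed-effect sketch follows; the (r, n) pairs are invented for illustration and are not the 13 studies analyzed here.

```python
import math

def pooled_r(studies):
    """Fixed-effect pooling of correlation coefficients via Fisher's z,
    with the usual weights w = n - 3; returns (pooled r, 95% CI)."""
    num = den = 0.0
    for r, n in studies:
        num += (n - 3) * math.atanh(r)   # Fisher z-transform of each r
        den += (n - 3)
    z_bar = num / den
    se = 1.0 / math.sqrt(den)            # standard error of the pooled z
    lo, hi = z_bar - 1.96 * se, z_bar + 1.96 * se
    return math.tanh(z_bar), (math.tanh(lo), math.tanh(hi))

# Invented (r, n) pairs for illustration only
studies = [(0.55, 60), (0.65, 120), (0.60, 80)]
r_pooled, ci = pooled_r(studies)
print(round(r_pooled, 3), tuple(round(v, 3) for v in ci))
```

Working in z-space before back-transforming keeps the sampling distribution approximately normal, which is why the confidence interval around the pooled r is asymmetric.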

Platz, Friedrich; Kopiez, Reinhard; Lehmann, Andreas C; Wolf, Anna



International Commercial Remote Sensing Practices and Policies: A Comparative Analysis  

NASA Astrophysics Data System (ADS)

In recent years, there has been much discussion about U.S. commercial remote sensing policies and how effectively they address U.S. national security, foreign policy, commercial, and public interests. This paper will provide an overview of U.S. commercial remote sensing laws, regulations, and policies, and describe recent NOAA initiatives. It will also address related foreign practices, and the overall legal context for trade and investment in this critical industry.
Licensing and Regulation The 1992 Land Remote Sensing Policy Act ("the Act"), and the 1994 policy on Foreign Access to Remote Sensing Space Capabilities (known as Presidential Decision Directive-23, or PDD-23) put into place an ambitious legal and policy framework for the U.S. Government's licensing of privately-owned, high-resolution satellite systems. Under the Act, the Secretary of Commerce licenses the operations of private U.S. remote sensing satellite systems, in consultation with the Secretaries of Defense, State, and Interior. PDD-23 provided further details concerning the operation of advanced systems, as well as criteria for the export of turnkey systems and/or components. In July 2000, pursuant to the authority delegated to it by the Secretary of Commerce, NOAA issued new regulations for the industry. License conditions require operators to: preserve national security and observe the international obligations of the United States; maintain positive control of spacecraft operations; maintain a tasking record in conjunction with other record-keeping requirements; provide U.S. Government access to and use of data when required for national security or foreign policy purposes; provide for U.S. Government review of all significant foreign agreements; obtain U.S. Government approval for any encryption devices used; make available unenhanced data to a "sensed state" as soon as such data are available and on reasonable cost terms and conditions; make available unenhanced data as requested by the U.S. Government Archive; and obtain a priori U.S. Government approval of all plans and procedures to deal with safe disposition of the satellite. Further information on NOAA's regulations and NOAA's licensing program is available at
Monitoring and Enforcement NOAA's enforcement mission is focused on the legislative mandate which states that the Secretary of Commerce has a continuing obligation to ensure that licensed imaging systems are operated lawfully to preserve the national security and foreign policies of the United States. NOAA has constructed an end-to-end monitoring and compliance program to review the activities of licensed companies. This program includes a pre-launch review, an operational baseline audit, and an annual comprehensive national security audit. If at any time there is suspicion or concern that a system is being operated unlawfully, a no-notice inspection may be initiated. Despite setbacks, three U.S. companies are now operational, with more firms expected to become so in the future. While NOAA does not disclose specific systems capabilities for proprietary reasons, its current licensing resolution thresholds for general commercial availability are as follows: 0.5 meter Ground Sample Distance (GSD) for panchromatic systems, 2 meter GSD for multi-spectral systems, 3 meter Impulse Response (IPR) for Synthetic Aperture Radar systems, and 20 meter GSD for hyperspectral systems (with certain 8-meter hyperspectral derived products also licensed for commercial distribution). These thresholds are subject to change based upon foreign availability and other considerations.
It should also be noted that license applications are reviewed and granted on a case-by-case basis, pursuant to each system's technology and concept of operations. In 2001, NOAA, along with the Department of Commerce's International Trade Administration, commissioned a study by the RAND Corporation to assess the risks faced by the U.S. commercial remote sensing satellite industry. In commissioning this study, NOAA's goal was to better

Stryker, Timothy


Bearing capacity analysis using the method of characteristics  

NASA Astrophysics Data System (ADS)

Using the method of characteristics, the bearing capacity for a strip footing is analyzed. The method of characteristics leads to an exact true limit load when the calculations of the three terms in the bearing capacity formula are consistent with one collapse mechanism and the soil satisfies the associated flow rule. At the same time, the method of characteristics avoids the assumption of arbitrary slip surfaces, and produces zones within which equilibrium and plastic yield are simultaneously satisfied for given boundary stresses. The exact solution without superposition approximation can still be expressed by Terzaghi's equation of bearing capacity, in which the bearing capacity factor N_γ is dependent on the dimensionless parameter λ and the friction angle φ. The influence of groundwater on the bearing capacity of the shallow strip footing is considered, which indicates that when the groundwater effect is taken into account, the error induced by the superposition approximation can be reduced compared with the dry soil condition. The results are presented in the form of charts which give the modified value (N_γ^W / N_γ) of the bearing capacity factor. Finally, an approximated analytical expression, which provides results in close agreement with those obtained by numerical analysis in this paper, has been suggested for practical application purposes.
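For orientation, the classical closed-form factors can be sketched in a few lines. This is a minimal illustration, not the paper's numerical computation: Nq and Nc below are the exact Prandtl/Reissner values that the method of characteristics reproduces, while N_γ (which has no closed form) uses Vesić's commonly cited approximation.

```python
import math

def bearing_capacity_factors(phi_deg):
    """Classical bearing capacity factors for a strip footing.

    Nq and Nc are the exact Prandtl/Reissner values; Ngamma uses the
    Vesic (1973) approximation, standing in for the paper's exact,
    numerically computed N_gamma.
    """
    phi = math.radians(phi_deg)
    nq = math.exp(math.pi * math.tan(phi)) * math.tan(math.pi / 4 + phi / 2) ** 2
    nc = (nq - 1.0) / math.tan(phi) if phi_deg > 0 else 2.0 + math.pi
    ngamma = 2.0 * (nq + 1.0) * math.tan(phi)  # Vesic approximation
    return nc, nq, ngamma

def terzaghi_qu(c, q, gamma, b, phi_deg):
    """Superposed ultimate bearing capacity:
    q_u = c*Nc + q*Nq + 0.5*gamma*B*Ngamma."""
    nc, nq, ngamma = bearing_capacity_factors(phi_deg)
    return c * nc + q * nq + 0.5 * gamma * b * ngamma
```

For phi = 30 degrees this yields the familiar textbook values Nq ≈ 18.4 and Nc ≈ 30.1; the paper's point is precisely that the superposition in terzaghi_qu is an approximation relative to the exact characteristics solution.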

Sun, Jian-Ping; Zhao, Zhi-Ye; Cheng, Yi-Pik



A qualitative analysis of immigrant population health practices in the Girona Healthcare Region  

PubMed Central

Background The research we present here forms part of a two-phase project - one quantitative and the other qualitative - assessing the use of primary health care services. This paper presents the qualitative phase of said research, which is aimed at ascertaining the needs, beliefs, barriers to access and health practices of the immigrant population in comparison with the native population, as well as the perceptions of healthcare professionals. The qualitative phase was specifically addressed to Moroccan and sub-Saharan African immigrants. The aims of this paper are as follows: to analyse any possible implications of family organisation in the health practices of the immigrant population; to ascertain social practices relating to illness; to understand the significances of sexual and reproductive health practices; and to ascertain the ideas and perceptions of immigrants, local people and professionals regarding health and the health system. Methods Qualitative research based on discursive analysis. Data gathering techniques consisted of discussion groups with health system users and semi-structured individual interviews with healthcare professionals. The sample was taken from the Basic Healthcare Areas of Salt and Banyoles (belonging to the Girona Healthcare Region), the discussion groups being comprised of (a) 6 immigrant Moroccan women, (b) 7 immigrant sub-Saharan African women and (c) 6 immigrant and native population men (2 native men, 2 Moroccan men and 2 sub-Saharan men); and the semi-structured interviews being conducted with the following healthcare professionals: (a) 3 gynaecologists, (b) 3 nurses, and (c) 1 administrative staff member. Results Use of the healthcare system is linked to the perception of not being well, knowledge of the healthcare system, length of time resident in Spain and interiorization of traditional Western medicine as a cure mechanism.
The divergences found among the groups of immigrants, local people and healthcare professionals with regard to healthcare education, use of the healthcare service, sexual and reproductive healthcare and reticence with regard to being attended by healthcare personnel of the opposite sex demonstrate a need to work with the immigrant population as a heterogeneous group. Conclusions The results we have obtained support the idea that feeling unwell is a psycho-social process, as it takes place within a specific socio-cultural situation and spans a range of beliefs, perceptions and ideas regarding symptomology and how to treat it. PMID:20587020



Comprehensive cosmographic analysis by Markov chain method  

NASA Astrophysics Data System (ADS)

We study the possibility of extracting model independent information about the dynamics of the Universe by using cosmography. We intend to explore it systematically, to learn about its limitations and its real possibilities. Here we are sticking to the series expansion approach on which cosmography is based. We apply it to different data sets: Supernovae type Ia (SNeIa), Hubble parameter extracted from differential galaxy ages, gamma ray bursts, and the baryon acoustic oscillations data. We go beyond past results in the literature extending the series expansion up to the fourth order in the scale factor, which implies the analysis of the deceleration q0, the jerk j0, and the snap s0. We use the Markov chain Monte Carlo method (MCMC) to analyze the data statistically. We also try to relate direct results from cosmography to dark energy (DE) dynamical models parametrized by the Chevallier-Polarski-Linder model, extracting clues about the matter content and the dark energy parameters. The main results are: (a) even if relying on a mathematical approximate assumption such as the scale factor series expansion in terms of time, cosmography can be extremely useful in assessing dynamical properties of the Universe; (b) the deceleration parameter clearly confirms the present acceleration phase; (c) the MCMC method can help giving narrower constraints in parameter estimation, in particular for higher order cosmographic parameters (the jerk and the snap), with respect to the literature; and (d) both the estimation of the jerk and the DE parameters reflect the possibility of a deviation from the ΛCDM cosmological model.
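As a rough illustration of the approach (not the authors' pipeline or data), the sketch below writes the flat-universe luminosity distance as a cosmographic series to third order in redshift in terms of H0, q0, and j0, and fits (q0, j0) to noiseless mock distances with a minimal Metropolis sampler; all numbers are invented.

```python
import math
import random

C_KMS = 299792.458  # speed of light [km/s]

def dl_series(z, h0, q0, j0):
    """Flat-universe luminosity distance from the cosmographic series,
    truncated at third order in redshift (Visser-style expansion)."""
    return (C_KMS * z / h0) * (1.0
            + 0.5 * (1.0 - q0) * z
            - (1.0 - q0 - 3.0 * q0 ** 2 + j0) * z ** 2 / 6.0)

def log_like(params, data, sigma=10.0):
    """Gaussian log-likelihood of mock distances [Mpc], H0 held fixed."""
    q0, j0 = params
    return -0.5 * sum(((d - dl_series(z, 70.0, q0, j0)) / sigma) ** 2
                      for z, d in data)

def metropolis(data, n_steps=6000, step=0.05, seed=1):
    """Minimal random-walk Metropolis sampler over (q0, j0)."""
    rng = random.Random(seed)
    cur = (0.0, 0.0)
    cur_ll = log_like(cur, data)
    chain = []
    for _ in range(n_steps):
        prop = (cur[0] + rng.gauss(0.0, step), cur[1] + rng.gauss(0.0, step))
        prop_ll = log_like(prop, data)
        if prop_ll >= cur_ll or rng.random() < math.exp(prop_ll - cur_ll):
            cur, cur_ll = prop, prop_ll
        chain.append(cur)
    return chain

# noiseless mock data generated with q0 = -0.55, j0 = 1.0 (LCDM-like values)
mock = [(0.05 * k, dl_series(0.05 * k, 70.0, -0.55, 1.0)) for k in range(1, 21)]
chain = metropolis(mock)
q0_est = sum(q for q, _ in chain[len(chain) // 2:]) / (len(chain) // 2)
```

With real, noisy data one would marginalize over H0 and extend the series to include the snap s0, as the paper does.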

Capozziello, S.; Lazkoz, R.; Salzano, V.



A practical approach to object based requirements analysis  

NASA Technical Reports Server (NTRS)

Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain oriented analysis and development concept is based on entity relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object oriented design is also promoted by having requirements described in terms of objects. Presented is a five-step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a baseline requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model and a brief discussion on how this approach might be used in a large scale development effort.

Drew, Daniel W.; Bishop, Michael



Face Recognition Technique Using Symbolic Linear Discriminant Analysis Method  

Microsoft Academic Search

Techniques that can introduce low dimensional feature representation with enhanced discriminatory power are important in face recognition systems. This paper presents a symbolic factor analysis method, i.e., the symbolic Linear Discriminant Analysis (symbolic LDA) method, for face representation and recognition. Classical factor analysis methods extract features, which are single valued in nature to represent face images. These single valued
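The classical (single-valued) LDA baseline that the abstract contrasts with can be sketched as a two-class Fisher discriminant; the symbolic variant generalizes this to interval-valued features, which is not reproduced here. The data below are synthetic stand-ins for low-dimensional face features.

```python
import numpy as np

def fisher_lda(x0, x1):
    """Classical two-class Fisher LDA direction: w ~ Sw^-1 (m1 - m0),
    maximising between-class over within-class scatter."""
    m0, m1 = x0.mean(axis=0), x1.mean(axis=0)
    sw = (np.cov(x0, rowvar=False) * (len(x0) - 1)
          + np.cov(x1, rowvar=False) * (len(x1) - 1))  # pooled scatter
    w = np.linalg.solve(sw, m1 - m0)
    return w / np.linalg.norm(w)

# synthetic two-class data standing in for extracted face features
rng = np.random.default_rng(0)
a = rng.normal([0.0, 0.0], 0.5, size=(50, 2))
b = rng.normal([2.0, 1.0], 0.5, size=(50, 2))
w = fisher_lda(a, b)
pa, pb = a @ w, b @ w   # 1-D projections should separate the two classes
```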

P. S. Hiremath; C. J. Prabhakar



Visual cluster analysis and pattern recognition methods  


A method of clustering using a novel template to define a region of influence. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable to, and improve, pattern recognition techniques.

Osbourn, Gordon Cecil (Albuquerque, NM); Martinez, Rubel Francisco (Albuquerque, NM)



Visceral fat estimation method by bioelectrical impedance analysis and causal analysis  

NASA Astrophysics Data System (ADS)

It has been clarified that abdominal visceral fat accumulation is closely associated with lifestyle diseases and metabolic syndrome. The gold standard in medical fields is visceral fat area measured by an X-ray computed tomography (CT) scan or magnetic resonance imaging. However, these measurements are highly invasive and costly; in particular, a CT scan causes X-ray exposure. For these reasons, medical fields need an instrument for visceral fat measurement that is minimally invasive, easy to use, and low cost. The article proposes a simple and practical method of visceral fat estimation employing bioelectrical impedance analysis and causal analysis. In the method, abdominal shape and dual impedances of the abdominal surface and the total body are measured to estimate visceral fat area based on a cause-effect structure. The structure is designed according to the nature of abdominal body composition and fine-tuned by statistical analysis. Experiments were conducted to investigate the proposed model: 180 subjects were measured by both a CT scan and the proposed method. The acquired model explained the measurement principle well, and the correlation coefficient with the CT scan measurements is 0.88.

Nakajima, Hiroshi; Tasaki, Hiroshi; Tsuchiya, Naoki; Hamaguchi, Takehiro; Shiga, Toshikazu



A Practical Method for Multi-Objective Scheduling through Soft Computing Approach  

NASA Astrophysics Data System (ADS)

Due to diversified customer demands and global competition, scheduling has been increasingly recognized as an important problem-solving task in manufacturing. Since scheduling is considered at a stage close to practical operation in production planning, flexibility and agility in decision making are most important in real world applications. In addition, since the final goal of such scheduling has many attributes, and their relative importance is likely to change depending on the decision environment, it is of great significance to derive a flexible schedule through a plain multi-objective optimization method. To derive such a rational schedule, in this paper we have applied a novel multi-objective optimization method named MOON2R (MOON2 of radial basis function), incorporating simulated annealing as a solution algorithm. Finally, illustrative examples are provided to outline and verify the effectiveness of the proposed method.
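The general flavor of the approach can be sketched as a weighted-sum scalarization of two scheduling criteria solved by simulated annealing. This is an illustrative simplification: the paper's MOON2R value function uses radial basis functions and interactive preference information, which are not reproduced; the job data and weights below are invented.

```python
import math
import random

JOBS = [(4, 9), (3, 5), (7, 12), (2, 4), (5, 20), (6, 8)]  # (processing time, due date)

def objectives(seq):
    """Two criteria for a single-machine job sequence:
    total completion (flow) time and total tardiness."""
    t = flow = tard = 0
    for j in seq:
        p, d = JOBS[j]
        t += p
        flow += t
        tard += max(0, t - d)
    return flow, tard

def scalar_cost(seq, w=(0.5, 0.5)):
    """Weighted-sum scalarization; the weights encode the decision
    maker's (changeable) relative priorities."""
    f, td = objectives(seq)
    return w[0] * f + w[1] * td

def anneal(w=(0.5, 0.5), t0=20.0, cooling=0.995, steps=3000, seed=7):
    """Simulated annealing over job permutations with swap moves."""
    rng = random.Random(seed)
    cur = list(range(len(JOBS)))
    rng.shuffle(cur)
    best = cur[:]
    temp = t0
    for _ in range(steps):
        i, k = rng.sample(range(len(cur)), 2)
        cand = cur[:]
        cand[i], cand[k] = cand[k], cand[i]        # swap two jobs
        delta = scalar_cost(cand, w) - scalar_cost(cur, w)
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            cur = cand
        if scalar_cost(cur, w) < scalar_cost(best, w):
            best = cur[:]
        temp *= cooling
    return best, objectives(best)
```

Re-running anneal with different weight vectors traces out different trade-offs between flow time and tardiness, which is the sense in which the scalarized schedule stays flexible.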

Shimizu, Yoshiaki; Tanaka, Yasutsugu


A method for obtaining practical flutter-suppression control laws using results of optimal control theory  

NASA Technical Reports Server (NTRS)

The results of optimal control theory are used to synthesize a feedback filter. The feedback filter is used to force the output of the filtered frequency response to match that of a desired optimal frequency response over a finite frequency range. This matching is accomplished by employing a nonlinear programing algorithm to search for the coefficients of the feedback filter that minimize the error between the optimal frequency response and the filtered frequency response. The method is applied to the synthesis of an active flutter-suppression control law for an aeroelastic wind-tunnel model. It is shown that the resulting control law suppresses flutter over a wide range of subsonic Mach numbers. This is a promising method for synthesizing practical control laws using the results of optimal control theory.
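The matching idea can be shown in miniature: choose filter coefficients that minimize the squared error between a candidate frequency response and a target "optimal" response over a finite frequency grid. The sketch below uses a one-pole filter and a crude coordinate search in place of the paper's nonlinear programming algorithm; all parameter values are hypothetical.

```python
def freq_response(gain, tau, omegas):
    """One-pole lag filter H(jw) = gain / (1 + j*w*tau)."""
    return [gain / (1 + 1j * w * tau) for w in omegas]

def response_error(params, target, omegas):
    """Sum of squared response mismatches over the frequency grid --
    the quantity the nonlinear programme drives toward zero."""
    h = freq_response(params[0], params[1], omegas)
    return sum(abs(a - b) ** 2 for a, b in zip(h, target))

OMEGAS = [0.1 * k for k in range(1, 101)]      # finite frequency range
TARGET = freq_response(2.0, 0.5, OMEGAS)       # stand-in "optimal" response

def fit(n_iters=60):
    """Crude coordinate search over (gain, tau); a real synthesis would
    run a nonlinear programming algorithm over many filter coefficients."""
    best = (1.0, 1.0)
    best_err = response_error(best, TARGET, OMEGAS)
    step = 0.5
    for _ in range(n_iters):
        improved = False
        for dg, dt in ((step, 0.0), (-step, 0.0), (0.0, step), (0.0, -step)):
            cand = (best[0] + dg, best[1] + dt)
            err = response_error(cand, TARGET, OMEGAS)
            if err < best_err:
                best, best_err, improved = cand, err, True
        if not improved:
            step *= 0.5   # refine the search once no move helps
    return best, best_err
```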

Newson, J. R.



Assessing Scientific Practices Using Machine-Learning Methods: How Closely Do They Match Clinical Interview Performance?  

NASA Astrophysics Data System (ADS)

The landscape of science education is being transformed by the new Framework for Science Education (National Research Council, A framework for K-12 science education: practices, crosscutting concepts, and core ideas. The National Academies Press, Washington, DC, 2012), which emphasizes the centrality of scientific practices—such as explanation, argumentation, and communication—in science teaching, learning, and assessment. A major challenge facing the field of science education is developing assessment tools that are capable of validly and efficiently evaluating these practices. Our study examined the efficacy of a free, open-source machine-learning tool for evaluating the quality of students' written explanations of the causes of evolutionary change relative to three other approaches: (1) human-scored written explanations, (2) a multiple-choice test, and (3) clinical oral interviews. A large sample of undergraduates (n = 104) exposed to varying amounts of evolution content completed all three assessments: a clinical oral interview, a written open-response assessment, and a multiple-choice test. Rasch analysis was used to compute linear person measures and linear item measures on a single logit scale. We found that the multiple-choice test displayed poor person and item fit (mean square outfit >1.3), while both oral interview measures and computer-generated written response measures exhibited acceptable fit (average mean square outfit for interview: person 0.97, item 0.97; computer: person 1.03, item 1.06). Multiple-choice test measures were more weakly associated with interview measures (r = 0.35) than the computer-scored explanation measures (r = 0.63). 
Overall, Rasch analysis indicated that computer-scored written explanation measures (1) have the strongest correspondence to oral interview measures; (2) are capable of capturing students' normative scientific and naive ideas as accurately as human-scored explanations; and (3) more validly detect understanding than the multiple-choice assessment. These findings demonstrate the great potential of machine-learning tools for assessing key scientific practices highlighted in the new Framework for Science Education.

Beggrow, Elizabeth P.; Ha, Minsu; Nehm, Ross H.; Pearl, Dennis; Boone, William J.



Method Development for Analysis of Aspirin Tablets.  

ERIC Educational Resources Information Center

Develops a lab experiment for introductory instrumental analysis that requires interference studies and optimizing of conditions. Notes the analysis of the aspirin is by visible spectrophotometric assay. Gives experimental details and discussion. (MVL)
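A visible spectrophotometric assay of this kind typically reduces to a Beer's-law calibration line (absorbance proportional to concentration). The sketch below is generic and uses invented calibration standards, not data from the experiment described.

```python
def linear_fit(x, y):
    """Ordinary least-squares line y = m*x + c, i.e., a Beer's-law
    calibration of absorbance against concentration."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    c = (sy - m * sx) / n
    return m, c

# hypothetical calibration standards: concentration (mM) vs. absorbance
conc = [0.0, 0.2, 0.4, 0.6, 0.8]
absb = [0.01, 0.21, 0.40, 0.62, 0.80]
m, c = linear_fit(conc, absb)
unknown = (0.50 - c) / m   # concentration of an unknown with A = 0.50
```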

Street, Kenneth W., Jr.



Methods of Phylogenetic Analysis: New Improvements on Old Methods.  

E-print Network

The purpose of any phylogenetic analysis is to estimate the evolutionary relationships between a set of homologous taxa, which can be anything from … because evolutionary history can never be known with certainty. … represent the topological relationship between the nodes (reviewed in Saitou, 1996). While many different


Delivering stepped care: an analysis of implementation in routine practice  

PubMed Central

Background In the United Kingdom, clinical guidelines recommend that services for depression and anxiety should be structured around a stepped care model, where patients receive treatment at different 'steps,' with the intensity of treatment (i.e., the amount and type) increasing at each step if they fail to benefit at previous steps. There are very limited data available on the implementation of this model, particularly on the intensity of psychological treatment at each step. Our objective was to describe patient pathways through stepped care services and the impact of this on patient flow and management. Methods We recorded service design features of four National Health Service sites implementing stepped care (e.g., the types of treatments available and their links with other treatments), together with the actual treatments received by individual patients and their transitions between different treatment steps. We computed the proportions of patients accessing, receiving, and transiting between the various steps and mapped these proportions visually to illustrate patient movement. Results We collected throughput data on 7,698 patients referred. Patient pathways were highly complex and very variable within and between sites. The ratio of low (e.g., self-help) to high-intensity (e.g., cognitive behaviour therapy) treatments delivered varied between sites from 22:1, through 2.1:1, 1.4:1 to 0.5:1. The numbers of patients allocated directly to high-intensity treatment varied from 3% to 45%. Rates of stepping up from low-intensity treatment to high-intensity treatment were less than 10%. Conclusions When services attempt to implement the recommendation for stepped care in the National Institute for Health and Clinical Excellence guidelines, there were significant differences in implementation and consequent high levels of variation in patient pathways. 
Evaluations driven by the principles of implementation science (such as targeted planning, defined implementation strategies, and clear activity specification around service organisation) are required to improve evidence on the most effective, efficient, and acceptable stepped care systems. PMID:22248385



Practical estimates of field-saturated hydraulic conductivity of bedrock outcrops using a modified bottomless bucket method  

NASA Astrophysics Data System (ADS)

The bottomless bucket (BB) approach (Nimmo et al., 2009a) is a cost-effective method for rapidly characterizing field-saturated hydraulic conductivity Kfs of soils and alluvial deposits. This practical approach is of particular value for quantifying infiltration rates in remote areas with limited accessibility. A similar approach for bedrock outcrops is also of great value for improving quantitative understanding of infiltration and recharge in rugged terrain. We develop a simple modification to the BB method for application to bedrock outcrops, which uses a nontoxic, quick-drying silicone gel to seal the BB to the bedrock. These modifications to the field method require only minor changes to the analytical solution for calculating Kfs on soils. We investigate the reproducibility of the method with laboratory experiments on a previously studied calcarenite rock and conduct a sensitivity analysis to quantify uncertainty in our predictions. We apply the BB method on both bedrock and soil for sites on Pahute Mesa, which is located in a remote area of the Nevada National Security Site. The bedrock BB tests may require monitoring over several hours to days, depending on infiltration rates, which necessitates a cover to prevent evaporative losses. Our field and laboratory results compare well to Kfs values inferred from independent reports, which suggests the modified BB method can provide useful estimates and facilitate simple hypothesis testing. The ease with which the bedrock BB method can be deployed should facilitate more rapid in situ data collection than is possible with alternative methods for quantitative characterization of infiltration into bedrock.
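A generic falling-head analysis conveys the flavor of the Kfs calculation. This is a simplification, not the exact Nimmo et al. (2009) solution or its bedrock modification; the capillary-length parameter lam is an assumed soil/rock property and all values are hypothetical.

```python
import math

def kfs_falling_head(d0, d1, t, lam):
    """Field-saturated conductivity from a falling-head ponded test.

    Simplified model: ponded depth D decays as
        dD/dt = -Kfs * (D + lam) / lam,
    which integrates to
        Kfs = (lam / t) * ln((d0 + lam) / (d1 + lam)).
    d0, d1: initial/final ponded depths [m]; t: elapsed time [s];
    lam: macroscopic capillary length [m] (material dependent).
    """
    return (lam / t) * math.log((d0 + lam) / (d1 + lam))
```

For a given elapsed time, a larger drop in ponded depth implies a larger Kfs, which matches the intuition that low-permeability bedrock tests must run for hours to days.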

Mirus, Benjamin B.; Perkins, Kim S.






Chromium speciation by different methods of practical use for routine in situ measurement  

NASA Astrophysics Data System (ADS)

Simple, sensitive, low-cost, and relatively rapid methods for the detection of Cr (III) and Cr (VI) species in natural waters are needed for monitoring and regulatory purposes. Conventional acidification and storage of filtered samples can be a major cause of chromium losses from the `dissolved' phase. In situ monitoring is thus of paramount importance. The practical usefulness of selected chromium speciation methods was assessed in the laboratory and in the field. Significant discrepancies were found in the Cr (VI) detection efficiency by a selective ion meter based on the diphenylcarbazide method when compared with conventional Zeeman graphite furnace AAS. The efficiency of the DGT (Diffusion gradients in thin films) method, based on the deployment in situ of gel/resin units capable of separating labile species of Cr (III) and Cr (VI), looks promising, but is limited by cost considerations and by potential complications in the presence of complexing substances. The method based on the Sephadex DEAE A-25 ion exchange resins is quite effective in the separation of Cr species, though it requires on-site facilities, is relatively time-consuming and is potentially affected by complexing substances.

Barakat, S.; Giusti, L.



Methods of Motor Current Signature Analysis  

Microsoft Academic Search

Recently a technique for monitoring and diagnosing mechanical problems, associated with rotating machines driven by electric motors, has been proposed and is now being offered by several commercial suppliers. This technique, known as “Motor Current Signature Analysis” or MCSA, seeks to apply much of the long experience in vibration signature analysis to the analysis of motor current, in effect using

G. B. Kliman; J. Stein



Thermal Analysis Methods For Earth Entry Vehicle  

NASA Technical Reports Server (NTRS)

Thermal analysis of a vehicle designed to return samples from another planet, such as the Earth Entry vehicle for the Mars Sample Return mission, presents several unique challenges. The Earth Entry Vehicle (EEV) must contain Martian material samples after they have been collected and protect them from the high heating rates of entry into the Earth's atmosphere. This requirement necessitates inclusion of detailed thermal analysis early in the design of the vehicle. This paper will describe the challenges and solutions for a preliminary thermal analysis of an Earth Entry Vehicle. The aeroheating on the vehicle during entry would be the main driver for the thermal behavior, and is a complex function of time, spatial position on the vehicle, vehicle temperature, and trajectory parameters. Thus, the thermal analysis must be closely tied to the aeroheating analysis in order to make accurate predictions. Also, the thermal analysis must account for the material response of the ablative thermal protection system (TPS). For the exo-atmospheric portion of the mission, the thermal analysis must include the orbital radiation fluxes on the surfaces. The thermal behavior must also be used to predict the structural response of the vehicle (the thermal stress and strains) and whether they remain within the capability of the materials. Thus, the thermal analysis requires ties to the three-dimensional geometry, the aeroheating analysis, the material response analysis, the orbital analysis, and the structural analysis. The goal of this paper is to describe to what degree that has been achieved.

Amundsen, Ruth M.; Dec, John A.; Lindell, Michael C.



Analysis of runoff change trend using hydrological time series method  

Microsoft Academic Search

This paper gives an introduction to a number of hydrological time series methods, including the run test method, the Mann-Kendall statistical testing method, the cumulative anomaly method and the cloud model. These methods are applied in the runoff change trend analysis at Yichang station, which is located on the main stream of the upper reaches of the Yangtze River. The result shows that the Mann-Kendall method
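The Mann-Kendall test mentioned above can be sketched directly; the version below omits the tie correction for brevity and uses made-up series.

```python
import math

def mann_kendall(series):
    """Mann-Kendall trend test: returns the S statistic and the
    standard normal score Z (no tie correction, for brevity)."""
    n = len(series)
    # S counts concordant minus discordant pairs over all i < k
    s = sum((series[k] > series[i]) - (series[k] < series[i])
            for i in range(n - 1) for k in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)    # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# |Z| > 1.96 indicates a significant monotonic trend at the 5% level
up = [1, 2, 3, 5, 4, 6, 8, 7, 9, 10]      # noisy increasing series
s_up, z_up = mann_kendall(up)
```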

Tian Yu; Ma Liya; Lei Xiaohui; Jiang Yunzhong



[Place of reflexotherapy and some other methods of alternative medicine in modern medical practice].  


An assessment of the role and place of nontraditional methods of treatment, including reflexotherapy, that are widely applied in hospitals is presented in the article. We also raise concerns about the insufficiently serious attitude of some scientists and health service managers towards reflexotherapy as a whole and towards one of its methods, acupuncture. An analysis is given of the legislative situation concerning the training of reflexotherapy specialists over the last 15-20 years, not only in Ukraine but also abroad. The article draws a historical parallel between the use of medicamentous and nonmedicamentous methods of treatment. PMID:20608024

Boĭchak, M P; Sobetskiĭ, V V



Practice Makes Perfect: Improving Students' Skills in Understanding and Avoiding Plagiarism with a Themed Methods Course  

ERIC Educational Resources Information Center

To address the issue of plagiarism, students in two undergraduate Research Methods and Analysis courses conducted, analyzed, and wrote up original research on the topic of plagiarism. Students in an otherwise identical course completed the same assignments but examined a different research topic. At the start and end of the semester, all students…

Estow, Sarah; Lawrence, Eva K.; Adams, Kathrynn A.



A Practical Look at the Lack of Cohesion in Methods Metric Letha Etzkorn, Carl Davis, and Wei Li  

E-print Network

…development paradigm have been extensively studied. Metrics such as McCabe's cyclomatic complexity metric

Etzkorn, Letha Hughes


Pre-service elementary science teaching self-efficacy and teaching practices: A mixed-methods, dual-phase, embedded case study  

NASA Astrophysics Data System (ADS)

This mixed-method, dual-phase, embedded-case study employed Social Cognitive Theory and the construct of self-efficacy to examine the contributors to science teaching self-efficacy and science teaching practices across different levels of efficacy in six pre-service elementary teachers during their science methods course and student teaching experiences. Data sources included the Science Teaching Efficacy Belief Instrument (STEBI-B) for pre-service teachers, questionnaires, journals, reflections, student teaching lesson observations, and lesson debriefing notes. Results from the STEBI-B show that all participants measured an increase in efficacy throughout the study. The ANOVA analysis of the STEBI-B revealed a statistically significant increase in level of efficacy during the methods course, during student teaching, and from the beginning of the study to the end. Of interest in this study was the examination of the participants' science teaching practices across different levels of efficacy. Results of this analysis revealed how the pre-service elementary teachers in this study contextualized their experiences in learning to teach science and its influences on their science teaching practices. A key implication involves the value of exploring how pre-service teachers interpret their learning to teach experiences and how their interpretations influence the development of their science teaching practices.

Sangueza, Cheryl Ramirez


Using Semantic Workflows to Disseminate Best Practices and Accelerate Discoveries in Multi-Omic Data Analysis  

E-print Network

The goal of our work is to enable omics analyses to be easily contextualized and interpreted. We present a framework where common omics analysis methods are easy to reuse and analytic results are reproducible, using semantic workflows to capture multi-step omic analysis methods and annotate them with constraints that express

Gil, Yolanda


A practical method for estimating non-isothermal and formation damage skin factors for cold water injection wells  

E-print Network


Warland, Arild



Design of a practical model-observer-based image quality assessment method for CT imaging systems  

NASA Astrophysics Data System (ADS)

The channelized Hotelling observer (CHO) is a powerful method for quantitative image quality evaluations of CT systems and their image reconstruction algorithms. It has recently been used to validate the dose reduction capability of iterative image-reconstruction algorithms implemented on CT imaging systems. The use of the CHO for routine and frequent system evaluations is desirable both for quality assurance evaluations as well as further system optimizations. The use of channels substantially reduces the amount of data required to achieve accurate estimates of observer performance. However, the number of scans required is still large even with the use of channels. This work explores different data reduction schemes and designs a new approach that requires only a few CT scans of a phantom. For this work, the leave-one-out likelihood (LOOL) method developed by Hoffbeck and Landgrebe is studied as an efficient method of estimating the covariance matrices needed to compute CHO performance. Three different kinds of approaches are included in the study: a conventional CHO estimation technique with a large sample size, a conventional technique with fewer samples, and the new LOOL-based approach with fewer samples. The mean value and standard deviation of the area under the ROC curve (AUC) are estimated by a shuffle method. Both simulation and real data results indicate that an 80% data reduction can be achieved without loss of accuracy. This data reduction makes the proposed approach a practical tool for routine CT system assessment.
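A toy one-dimensional CHO computation (not the paper's CT data or its LOOL covariance estimator; the channel profiles, dimensions, and noise model are all invented) shows the three steps: channelize the images, form the Hotelling template in channel space, and score detectability.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy setup: 1-D "images" with 256 pixels; 4 smooth Gaussian channels
N_PIX, N_IMG = 256, 200
x = np.linspace(-1.0, 1.0, N_PIX)
channels = np.stack([np.exp(-x**2 / (2.0 * s**2)) for s in (0.05, 0.1, 0.2, 0.4)])
signal = np.exp(-x**2 / 0.02)                      # known signal profile

img_a = rng.normal(0.0, 1.0, (N_IMG, N_PIX))            # signal-absent
img_p = rng.normal(0.0, 1.0, (N_IMG, N_PIX)) + signal   # signal-present

va, vp = img_a @ channels.T, img_p @ channels.T    # channel outputs

# Hotelling template in channel space: w = S^-1 (mean_p - mean_a)
s_mat = 0.5 * (np.cov(va, rowvar=False) + np.cov(vp, rowvar=False))
w = np.linalg.solve(s_mat, vp.mean(axis=0) - va.mean(axis=0))

ta, tp = va @ w, vp @ w                            # decision variables
snr = (tp.mean() - ta.mean()) / np.sqrt(0.5 * (ta.var() + tp.var()))
```

The point of the paper is precisely that s_mat above needs many samples to estimate well; the LOOL estimator is a drop-in replacement for the sample covariance step that tolerates far fewer scans.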

Tseng, Hsin-Wu; Fan, Jiahua; Cao, Guangzhi; Kupinski, Matthew A.; Sainath, Paavana



Undergraduate physiotherapy students' competencies, attitudes and perceptions after integrated educational pathways in evidence-based practice: a mixed methods study.  


This mixed methods study aimed to explore the perceptions and attitudes, to evaluate the knowledge and skills, and to investigate the clinical behaviours of undergraduate physiotherapy students exposed to a composite education curriculum on evidence-based practice (EBP). Students' knowledge and skills were assessed before and after integrated learning activities, using the Adapted Fresno test, whereas their behaviour in EBP was evaluated by examining their internship documentation. Students' perceptions and attitudes were explored through four focus groups. Sixty-two students agreed to participate in the study. The within group mean differences (Adapted Fresno test) were 34.2 (95% CI 24.4 to 43.9) in the first year and 35.1 (95% CI 23.2 to 47.1) in the second year; no statistically significant change was observed in the third year. Seventy-six percent of the second year and 88% of the third year students reached the pass score. Internship documentation gave evidence of PICOs and database searches (95-100%), critical appraisal of internal validity (25-75%) but not of external validity (5-15%). The correct application of these items ranged from 30 to 100%. Qualitative analysis of the focus groups indicated students valued EBP, but perceived many barriers, with clinicians being both an obstacle and a model. Key elements for changing students' behaviours seem to be the internship environment and the possibility of continuous practice and feedback. PMID:24766584

Bozzolan, M; Simoni, G; Balboni, M; Fiorini, F; Bombardi, S; Bertin, N; Da Roit, M



Perceptions and Attitudes of Medical Students towards Two Methods of Assessing Practical Anatomy Knowledge  

PubMed Central

Objectives: Traditionally, summative practical examination in anatomy takes the form of ‘spotters’ consisting of a stream of prosections, radiological images and dissections with pins indicating specific structures. Recently, we have started to administer similar examinations online using the quiz facility in Moodle™ (a free, open-source web application for producing modular internet-based courses) in addition to the traditional format. This paper reports on an investigation into students’ perceptions of each assessment environment. Methods: Over a 3-year period, practical assessment in anatomy was conducted either in traditional format or online via learning management software called Moodle™. All students exposed to the two examination formats at the College of Medicine & Health Sciences, Sultan Qaboos University, Oman, were divided into two categories: junior (Year 3) and senior (Year 4). An evaluation of their perception of both examination formats was conducted using a self-administered questionnaire consisting of restricted and free response items. Results: More than half of all students expressed a clear preference for the online environment and believed it was more exam-friendly. This preference was higher amongst senior students. Compared to females, male students preferred the online environment. Senior students were less likely to study on cadavers when the examination was conducted online. Specimen quality, ability to manage time, and seating arrangements were major advantages identified by students who preferred the online format. Conclusion: Computer-based practical examinations in anatomy appeared to be generally popular with our students. The students adopted a different approach to study when the exam was conducted online as compared to the traditional ‘steeplechase’ format. PMID:22087381

Inuwa, Ibrahim M; Taranikanti, Varna; Al-Rawahy, Maimouna; Habbal, Omar



Second harmonic generation by micropowders: a revision of the Kurtz-Perry method and its practical application  

NASA Astrophysics Data System (ADS)

We theoretically study the second harmonic generation by powder crystal monolayers and by thick samples of crystalline powder with particle size in the range of microns. Contrary to usual treatments, the light scattering by the particles is explicitly introduced in the model. The cases of powder in air and in an index-matching liquid under the most common experimental geometries are considered. Special attention is paid to the possibility of determining the value of some nonlinear optical coefficients from the experiments. The limitations and shortcomings of the classical Kurtz and Perry method (Kurtz and Perry in J Appl Phys 39:3798, 1968) and the most common practical misuses of it are discussed. It is argued that many of the experimental works based on that method oversimplify the technique and contain important errors. In order to obtain reliable values of the nonlinear coefficients, an appropriate experimental configuration and analysis of the data are pointed out. The analysis is especially simple in the case of uniaxial phase-matchable materials for which simple analytical expressions are derived.

Aramburu, I.; Ortega, J.; Folcia, C. L.; Etxebarria, J.



A practical review of methods for measuring the dynamic characteristics of industrial pressure transmitters.  


Three methods exist for testing the response times of pressure transmitters in situ: the power interrupt test, the noise analysis technique, and the pink noise technique. The noise (or random signal) analysis technique is a passive in situ technique that does not interfere with plant operation, uses already existing plant sensors and instrumentation, accounts for the effects of process conditions on plant equipment performance, and includes any response-time delays caused by transmitter sensing lines. The power interrupt test is a simpler and less-time-consuming test than noise analysis for measuring the response time of force-balance pressure transmitters. The pink noise test is useful for pressure transmitters where process fluctuations do not normally exist or they are inadequate for using the noise analysis technique to test response time. PMID:19854438

Hashemian, H M; Jiang, Jin
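The passive noise-analysis idea above can be illustrated with a toy model: a transmitter behaving as a first-order lag driven by process noise produces an AR(1) output whose lag-one autocorrelation encodes the response time constant. This is a simplified sketch with hypothetical dt and tau, not the plant test procedure described in the review:

```python
import math
import random

random.seed(1)

# Toy first-order sensor: x[k] = a*x[k-1] + (1-a)*w[k], with a = exp(-dt/tau).
# dt and tau are hypothetical values chosen for illustration.
dt, tau = 0.01, 0.25
a = math.exp(-dt / tau)
x, xs = 0.0, []
for _ in range(100000):
    x = a * x + (1.0 - a) * random.gauss(0.0, 1.0)
    xs.append(x)

# For an AR(1) process the lag-one autocorrelation equals a, so the
# time constant can be recovered as tau = -dt / ln(rho1).
m = sum(xs) / len(xs)
c0 = sum((v - m) ** 2 for v in xs)
c1 = sum((xs[i] - m) * (xs[i + 1] - m) for i in range(len(xs) - 1))
rho1 = c1 / c0
tau_hat = -dt / math.log(rho1)
print(round(tau_hat, 3))
```

The recovered tau_hat should sit close to the true 0.25 s, which is the sense in which normal process fluctuations alone can reveal response time without disturbing plant operation.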



Skinner Meets Piaget on the Reggio Playground: Practical Synthesis of Applied Behavior Analysis and Developmentally Appropriate Practice Orientations  

ERIC Educational Resources Information Center

We focus on integrating developmentally appropriate practices, the project approach of Reggio Emilia, and a behavior analytic model to support a quality preschool environment. While the above practices often are considered incompatible, we have found substantial overlap and room for integration of these perspectives in practical application. With…

Warash, Bobbie; Curtis, Reagan; Hursh, Dan; Tucci, Vicci




Non-communicable diseases and implications for medical practice in Australia: a framework for analysis.  


Non-communicable diseases (NCDs) have become leading causes of mortality and morbidity as part of historical epidemiological, demographic and nutritional transitions. There has been considerable historical analysis of the immediate and underlying causes of this change in the impacts of communicable diseases and NCDs, but far less historical analysis of how this transition has shaped medical practice. We lay out a framework for future historical analysis by proposing four domains of inquiry into key areas of change: changes in the concept of disease; evolution of medical technology; changes in workforce, including variation in roles and emerging areas of specialisation; and changes in health care structures including models of care, government responses and transitioning health systems. Our aim is to encourage analysis that takes into account key features in each of the four domains, thus enabling a more complete understanding of why, how and under what circumstances NCDs have had an effect on medical practice. PMID:25047773

McNab, Justin; Huckel Schneider, Carmen; Leeder, Stephen



A Quantitative Analysis and Natural History of B. F. Skinner's Coauthoring Practices  

PubMed Central

This paper describes and analyzes B. F. Skinner's coauthoring practices. After identifying his 35 coauthored publications and 27 coauthors, we analyze his coauthored works by their form (e.g., journal articles) and kind (e.g., empirical); identify the journals in which he published and their type (e.g., data-type); describe his overall and local rates of publishing with his coauthors (e.g., noting breaks in the latter); and compare his coauthoring practices with his single-authoring practices (e.g., form, kind, journal type) and with those in the scientometric literature (e.g., majority of coauthored publications are empirical). We address these findings in the context of describing the natural history of Skinner's coauthoring practices. Finally, we describe some limitations in our methods and offer suggestions for future research. PMID:22532732

McKerchar, Todd L; Morris, Edward K; Smith, Nathaniel G



An Analysis of Texas Superintendents' Bilingual/ESL Teacher Recruitment and Retention Practices  

E-print Network

Running Head: BILINGUAL/ESL TEACHER RECRUITMENT/RETENTION. An Analysis of Texas Superintendents' Bilingual/ESL Teacher Recruitment and Retention Practices. Rafael Lara-Alecio, Ph... Huntsville, Texas 77341. A paper presented at the Annual Meeting of the American Educational Research Association, San Diego, CA, April 13, 2004.

Lara-Alecio, Rafael; Galloway, Martha; Irby, Beverly J.; Brown, Genevieve



A Longitudinal Analysis of Parenting Practices, Couple Satisfaction, and Child Behavior Problems  

Microsoft Academic Search

This longitudinal study examined the relationship between couple relationship satisfaction, parenting practices, parent depression, and child problem behaviors. The study participants (n = 148) were part of a larger experimental study that examined the effectiveness of a brief family-centered intervention, the Family Check-Up model. Regression analysis results indicated that our proposed model accounted for 38% of the variance in

Deanna Linville; Krista Chronister; Tom Dishion; Jeff Todahl; John Miller; Daniel Shaw; Francis Gardner; Melvin Wilson



Issues management and organizational accounts: An analysis of corporate responses to accusations of unethical business practices  

Microsoft Academic Search

When external groups accuse a business organization of unethical practices, managers of the accused organization usually offer a communicative response to attempt to protect their organization's public image. Even though many researchers readily concur that analysis of these communicative responses is important to our understanding of business and society conflict, few investigations have focused on developing a theoretical framework for

Dennis E. Garrett; Jeffrey L. Bradford; Renee A. Meyers; Joy Becker



Australian corporate environmental reporting: a comparative analysis of disclosure practices across voluntary and mandatory disclosure systems  

Microsoft Academic Search

Purpose – This paper extends the literature in the environmental disclosure area by examining annual report disclosure practices of Australian companies within the combined voluntary and mandatory environmental disclosure system. Design/methodology/approach – Content analysis was used to investigate the environmental disclosures over three consecutive years in the annual reports of companies that would be subject to environmental regulation and/or perceived

Stacey Cowan



The State, Legal Rigor and the Poor: The Daily Practice of Welfare Control Social Analysis, forthcoming.  

E-print Network

1 The State, Legal Rigor and the Poor: The Daily Practice of Welfare Control Social Analysis on ethnographic fieldwork on the control of welfare recipients by public welfare agencies in France. I consider investigations conducted in the homes of welfare recipients as a form of bureaucratic interrogation

Paris-Sud XI, Université de


A Secondary Analysis of the Impact of School Management Practices on School Performance  

ERIC Educational Resources Information Center

The purpose of this study was to conduct a secondary analysis of the impact of school management practices on school performance utilizing a survey design of School and Staffing (SASS) data collected by the National Center for Education Statistics (NCES) of the U.S. Department of Education, 1999-2000. The study identifies those school management…

Talbert, Dale A.



Using Performance Analysis for Training in an Organization Implementing ISO-9000 Manufacturing Practices: A Case Study.  

ERIC Educational Resources Information Center

This case study examines the application of the Performance Analysis for Training (PAT) Model in an organization that was implementing ISO-9000 (International Standards Organization) processes for manufacturing practices. Discusses the interaction of organization characteristics, decision maker characteristics, and analyst characteristics to…

Kunneman, Dale E.; Sleezer, Catherine M.



AAMFT Master Series Tapes: An Analysis of the Inclusion of Feminist Principles into Family Therapy Practice.  

ERIC Educational Resources Information Center

Content analysis of 23 American Association for Marriage and Family Therapy Master Series tapes was used to determine how well feminist behaviors have been incorporated into ideal family therapy practice. Feminist behaviors were infrequent, being evident in fewer than 3% of time blocks in event sampling and 10 of 39 feminist behaviors of the…

Haddock, Shelley A.; MacPhee, David; Zimmerman, Toni Schindler



Integration of Pharmacy Practice and Pharmaceutical Analysis: Quality Assessment of Laboratory Performance.  

ERIC Educational Resources Information Center

Laboratory portions of courses in pharmacy practice and pharmaceutical analysis at the Medical University of South Carolina are integrated and coordinated to provide feedback on student performance in compounding medications. Students analyze the products they prepare, with early exposure to compendia requirements and other references. Student…

McGill, Julian E.; Holly, Deborah R.




SciTech Connect

Recent evaluations of neutron cross section covariances in the resolved resonance region reveal the need for further research in this area. Major issues include declining uncertainties in multigroup representations and proper treatment of scattering radius uncertainty. To address these issues, the present work introduces a practical method based on kernel approximation using resonance parameter uncertainties from the Atlas of Neutron Resonances. Analytical expressions derived for average cross sections in broader energy bins, along with their sensitivities, provide a transparent tool for determining cross section uncertainties. The role of resonance-resonance and bin-bin correlations is specifically studied. As an example, we apply this approach to estimate (n,{gamma}) and (n,el) covariances for the structural material {sup 55}Mn.

Cho, Y.S.; Oblozinsky, P.; Mughabghab,S.F.; Mattoon,C.M.; Herman,M.
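The structure of such a propagation from parameter uncertainties to binned cross-section covariances is the generic "sandwich rule": cov_sigma = S · cov_p · S^T, with S the matrix of sensitivities. The numbers below are entirely hypothetical, and the sketch shows only the shape of the calculation, not the paper's kernel expressions:

```python
import math

# Generic sandwich-rule propagation of parameter covariances to binned
# cross-section covariances. All values are hypothetical placeholders.

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(a):
    return [list(col) for col in zip(*a)]

S = [[0.8, 0.1],          # sensitivities d(sigma_bin)/d(parameter), 2 bins x 2 params
     [0.2, 0.9]]
cov_p = [[0.04, 0.01],    # parameter covariance matrix (relative units)
         [0.01, 0.09]]

cov_sigma = matmul(matmul(S, cov_p), transpose(S))
# bin-bin correlation induced by the shared parameter uncertainties
rho = cov_sigma[0][1] / math.sqrt(cov_sigma[0][0] * cov_sigma[1][1])
print([[round(v, 4) for v in row] for row in cov_sigma], round(rho, 3))
```

The off-diagonal term of cov_sigma is what the abstract calls bin-bin correlation: it is nonzero whenever two energy bins share sensitivity to the same resonance parameters.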



The National Criminal Justice Treatment Practices survey: Multilevel survey methods and procedures?  

PubMed Central

The National Criminal Justice Treatment Practices (NCJTP) survey provides a comprehensive inquiry into the nature of programs and services provided to adult and juvenile offenders involved in the justice system in the United States. The multilevel survey design covers topics such as the mission and goals of correctional and treatment programs; organizational climate and culture for providing services; organizational capacity and needs; opinions of administrators and staff regarding rehabilitation, punishment, and services provided to offenders; treatment policies and procedures; and working relationships between correctional and other agencies. The methodology generates national estimates of the availability of programs and services for offenders. This article details the methodology and sampling frame for the NCJTP survey, response rates, and survey procedures. Prevalence estimates of juvenile and adult offenders under correctional control are provided with externally validated comparisons to illustrate the veracity of the methodology. Limitations of the survey methods are also discussed. PMID:17383548

Taxman, Faye S.; Young, Douglas W.; Wiersema, Brian; Rhodes, Anne; Mitchell, Suzanne



Who's in and why? A typology of stakeholder analysis methods for natural resource management.  


Stakeholder analysis means many things to different people. Various methods and approaches have been developed in different fields for different purposes, leading to confusion over the concept and practice of stakeholder analysis. This paper asks how and why stakeholder analysis should be conducted for participatory natural resource management research. This is achieved by reviewing the development of stakeholder analysis in business management, development and natural resource management. The normative and instrumental theoretical basis for stakeholder analysis is discussed, and a stakeholder analysis typology is proposed. This consists of methods for: i) identifying stakeholders; ii) differentiating between and categorising stakeholders; and iii) investigating relationships between stakeholders. The range of methods that can be used to carry out each type of analysis is reviewed. These methods and approaches are then illustrated through a series of case studies funded through the Rural Economy and Land Use (RELU) programme. These case studies show the wide range of participatory and non-participatory methods that can be used, and discuss some of the challenges and limitations of existing methods for stakeholder analysis. The case studies also propose new tools and combinations of methods that can more effectively identify and categorise stakeholders and help understand their inter-relationships. PMID:19231064

Reed, Mark S; Graves, Anil; Dandy, Norman; Posthumus, Helena; Hubacek, Klaus; Morris, Joe; Prell, Christina; Quinn, Claire H; Stringer, Lindsay C




EPA Science Inventory

Three different methods of analysis of panels were compared using asthma panel data from a 1970-1971 study done by EPA in Riverhead, New York. The methods were (1) regression analysis using raw attack rates; (2) regression analysis using the ratio of observed attacks to expected ...



E-print Network

MONTE CARLO ANALYSIS: ESTIMATING GPP WITH THE CANOPY CONDUCTANCE METHOD. 1. Overview. A novel method ... We performed a Monte Carlo analysis to investigate the power of our statistical approach: i.e. what ... and Assumptions. The Monte Carlo analysis was performed as follows: · Natural variation. The only study to date

DeLucia, Evan H.


Analysis of an inquiry-oriented inservice program in affecting science teaching practices  

NASA Astrophysics Data System (ADS)

This study was an examination of how science teachers' teaching abilities---content and pedagogical knowledge and skills---were affected by an inquiry-oriented science education professional development program. The study researched the characteristics of an inservice program, Microcosmos, designed to equip teachers with new perspectives on how to stimulate students' learning and to promote a self-reflective approach for the implementation of instructional practices leading to improving teachers' and students' roles in the science classroom. The Microcosmos Inservice Program, which focused on the use of microorganisms as a vehicle to teach science for middle and high school grades, was funded by the National Science Foundation and developed by the Microcosmos Project based at the School of Education, Boston University. The teacher-training program had as its main objective to show teachers and other educators how the smallest life forms---the microbes---can be a usable and dynamic way to stimulate science interest in students of all ages. It combines and integrates a number of training components that appear to be consistent with the recommendations listed in the major reform initiatives. The goal of the study was to explore whether the program provoked any change(s) in the pedagogical practices of teachers over time, and if these changes fostered inquiry-based practices in the classroom. The exploratory analysis used a qualitative methodology that followed a longitudinal design for the collection of the data gathered from a sample of 31 participants. The data was collected in two phases. Phase One, the Case History group, involved 5 science teachers over a period of seven years. Phase Two, the Expanded Teacher sample, involved 26 teachers---22 new teachers plus four teachers from Phase One---contacted at two different points in time during the study.
Multiple data sources allowed for the collection of a varied and rigorous set of data for each individual in the sample. The primary data source was semi-structured interviews. Secondary data sources included pre- and post- on-site visits, classroom observations, teachers' self-report protocols and questionnaires, and documents and examples of teacher work developed during the inservice training. The data was examined for evidence of change in: teachers' self-reported content-specific gains, teachers' self-reported and observed changes in their teaching methods and approach to curriculum, and teachers' self-reported and observed changes in classroom practices as a result of the content and the pedagogy acting together and supplementing each other. A major finding of the study confirmed the benefits of inservice activities with an integral focus on science content and pedagogy for enhancing teachers' approach to instruction. The findings give renewed emphasis to the importance that inquiry-based practices for working with teachers, combined with a specific subject-matter focus, have in designing effective professional development. This combined approach, in some instances, contributed to important gains in the pedagogical content knowledge that teachers needed in order to effectively implement the Microcosmos learning experiences.

Santamaria Makang, Doris


A comparative analysis of image fusion methods  

Microsoft Academic Search

There are many image fusion methods that can be used to produce high-resolution multispectral images from a high-resolution panchromatic image and low-resolution multispectral images. Starting from the physical principle of image formation, this paper presents a comprehensive framework, the general image fusion (GIF) method, which makes it possible to categorize, compare, and evaluate the existing image fusion methods. Using the

Zhijun Wang; Djemel Ziou; Costas Armenakis; Deren Li; Qingquan Li



Physical methods for intracellular delivery: practical aspects from laboratory use to industrial-scale processing.  


Effective intracellular delivery is a significant impediment to research and therapeutic applications at all processing scales. Physical delivery methods have long demonstrated the ability to deliver cargo molecules directly to the cytoplasm or nucleus, and the mechanisms underlying the most common approaches (microinjection, electroporation, and sonoporation) have been extensively investigated. In this review, we discuss established approaches, as well as emerging techniques (magnetofection, optoinjection, and combined modalities). In addition to operating principles and implementation strategies, we address applicability and limitations of various in vitro, ex vivo, and in vivo platforms. Importantly, we perform critical assessments regarding (1) treatment efficacy with diverse cell types and delivered cargo molecules, (2) suitability to different processing scales (from single cell to large populations), (3) suitability for automation/integration with existing workflows, and (4) multiplexing potential and flexibility/adaptability to enable rapid changeover between treatments of varied cell types. Existing techniques typically fall short in one or more of these criteria; however, introduction of micro-/nanotechnology concepts, as well as synergistic coupling of complementary method(s), can improve performance and applicability of a particular approach, overcoming barriers to practical implementation. For this reason, we emphasize these strategies in examining recent advances in development of delivery systems. PMID:23813915

Meacham, J Mark; Durvasula, Kiranmai; Degertekin, F Levent; Fedorov, Andrei G



A Qualitative Analysis of an Advanced Practice Nurse-Directed Transitional Care Model Intervention  

PubMed Central

Purpose: The purpose of this study was to describe barriers and facilitators to implementing a transitional care intervention for cognitively impaired older adults and their caregivers led by advanced practice nurses (APNs). Design and Methods: APNs implemented an evidence-based protocol to optimize transitions from hospital to home. An exploratory, qualitative directed content analysis examined 15 narrative case summaries written by APNs and fieldnotes from biweekly case conferences. Results: Three central themes emerged: patients and caregivers having the necessary information and knowledge, care coordination, and the caregiver experience. An additional category was also identified, APNs going above and beyond. Implications: APNs implemented individualized approaches and provided care that exceeds the type of care typically staffed and reimbursed in the American health care system by applying a Transitional Care Model, advanced clinical judgment, and doing whatever was necessary to prevent negative outcomes. Reimbursement reform as well as more formalized support systems and resources are necessary for APNs to consistently provide such care to patients and their caregivers during this vulnerable time of transition. PMID:21908805

Bradway, Christine; Trotta, Rebecca; Bixby, M.Brian; McPartland, Ellen; Wollman, M. Catherine; Kapustka, Heidi; McCauley, Kathleen; Naylor, Mary D.



The homotopy analysis method and the Lienard equation  

E-print Network

In this work, Lienard equations are considered. The limit cycles of these systems are studied by applying the homotopy analysis method. The amplitude and frequency obtained with this methodology are in good agreement with those calculated by computational methods. This puts in evidence that the homotopy analysis method is a useful tool to solve nonlinear differential equations.

Abbasbandy, Saied; Lopez-Ruiz, Ricardo
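As background, the homotopy analysis method embeds the target equation in a one-parameter family of problems. For a Lienard equation, the standard zeroth-order deformation equation reads as follows (Liao's notation; here L is an auxiliary linear operator, h-bar the convergence-control parameter, and x_0 an initial guess; the specific operator choices used by the authors are not given in the abstract):

```latex
% Lienard equation
\ddot{x} + f(x)\,\dot{x} + g(x) = 0,
% zeroth-order deformation equation of the homotopy analysis method
(1-q)\,\mathcal{L}\!\left[\phi(t;q) - x_0(t)\right]
  = q\,\hbar\,\mathcal{N}\!\left[\phi(t;q)\right],
\qquad q \in [0,1],
% with \phi(t;0) = x_0(t) and \phi(t;1) = x(t),
% so deforming q from 0 to 1 carries the initial guess into the solution.
```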



Symbolic analysis methods for averaged modeling of switching power converters  

Microsoft Academic Search

Symbolic analysis methods for the averaged modeling of switching power converters are presented in this paper. A general averaging method suitable for computer-aided modeling is discussed first. Then, a symbolic analysis package that uses this averaging method to automatically generate an analytical averaged model for a switching power converter is described. The package is implemented using the computer algebra system

Jian Sun; Horst Grotstollen
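The averaging idea behind such packages can be sketched numerically. For an ideal buck converter in continuous conduction, the two switched configurations are combined with duty-ratio weights, and the DC solution of the averaged model recovers Vout = D·Vin. Component values below are hypothetical, and this numeric sketch stands in for the symbolic derivation the paper automates:

```python
# Duty-ratio averaging for an ideal buck converter in continuous conduction.
# Component values are hypothetical placeholders.
L, C, R, Vin, D = 100e-6, 100e-6, 10.0, 12.0, 0.4

# State x = [iL, vC]. For the ideal buck both switched configurations share
# the same A matrix; only the input matrix B toggles with the switch.
A = [[0.0, -1.0 / L],
     [1.0 / C, -1.0 / (C * R)]]
B_on, B_off = [1.0 / L, 0.0], [0.0, 0.0]

# Averaged input matrix: B = D*B_on + (1-D)*B_off
B = [D * bon + (1.0 - D) * boff for bon, boff in zip(B_on, B_off)]

# DC operating point of the averaged model: 0 = A x + B Vin  =>  x = -A^{-1} B Vin
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
Ainv = [[A[1][1] / det, -A[0][1] / det],
        [-A[1][0] / det, A[0][0] / det]]
iL, vC = [-(Ainv[i][0] * B[0] + Ainv[i][1] * B[1]) * Vin for i in range(2)]
print(round(vC, 3), round(iL, 3))  # vC = D*Vin = 4.8 V, iL = vC/R = 0.48 A
```

A symbolic package would carry out the same elimination with the component values left as symbols, yielding the averaged model as closed-form expressions rather than numbers.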




Microsoft Academic Search

In this paper a new auditorily motivated analysis method for room impulse responses is presented. The method applies the same kind of time and frequency resolution as human hearing. With the proposed method it is possible to study the decaying sound field of a room in more detail. It is applicable as well in the analysis of artificial reverberation and

Tapio Lokki; Matti Karjalainen


Game Practices and Educational Design: Applying an Ethnographic Analysis of Game Play to an Educational Design Problem  

Microsoft Academic Search

In this poster I use findings from an ethnographic analysis of young people's video gaming practices to approach an educational design problem. Three categories of practices are identified as a starting point for design principles informing a new virtual environment that simulates school practices. The intended purpose of the environment is to provide young people and their families with a

Tom Satwicz


Interactive Multicriteria Methods in Portfolio Decision Analysis  

Microsoft Academic Search

\\u000a Decision Analysis is a constructive, learning process. This is particularly true of Portfolio Decision Analysis (PDA) where\\u000a the number of elicitation judgements is typically very large and the alternatives under consideration are a combinatorial\\u000a set and so cannot be listed and examined explicitly. Consequently, PDA is to some extent an interactive process. In this chapter\\u000a we discuss what form that

Nikolaos Argyris; José Rui Figueira; Alec Morton


Standard method for toxic PCB congener analysis  

Microsoft Academic Search

The use of Toxic Equivalent Factors (TEFs) for dioxins and furans has recently expanded to include the co-planar PCBs which have similar toxicological properties. In the ad hoc application of PCB TEFs, researchers have utilized PCB concentration data obtained by a number of different methods. As the use of PCB TEFs expands into more formal applications, methods for these relatively




EPA Science Inventory

The report gives results of method evaluations for products of incomplete combustion (PICs): 36 proposed PICs were evaluated by previously developed gas chromatography/flame ionization detection (GC/FID) and gas chromatography/mass spectroscopy (GC/MS) methods. It also gives resu...



EPA Science Inventory

The research program surveyed and evaluated the methods and procedures used to identify and quantitate chemical constituents in human breath. Methods have been evaluated to determine their ease and rapidity, as well as cost, accuracy, and precision. During the evaluation, a secon...


A new method of heart sound signal analysis based on independent function element  

NASA Astrophysics Data System (ADS)

In this paper, a new method is presented for heart sound signal processing in the statistical domain. The multiple components obtained from a conventional linear transformation may be mutually uncorrelated, but usually do not possess the property of statistical independence. First, the definition of the independent function element and the method of obtaining it are discussed; the method of signal decomposition and reconstruction based on the independent function element not only inherits the advantages of linear transformation but also has the capability of representing the signal in the statistical domain. After that, the application of the independent function element to heart sound signal analysis is examined in detail. The validity and practicability of the method are demonstrated through two experiments.

Xie-feng, Cheng; Bin, Jiang; He, Yang; YuFeng, Guo; ShaoBai, Zhang



Meta-analysis methods for risk differences.  


The difference between two proportions, referred to as a risk difference, is a useful measure of effect size in studies where the response variable is dichotomous. Confidence interval methods based on a varying coefficient model are proposed for combining and comparing risk differences from multi-study between-subjects or within-subjects designs. The proposed methods are new alternatives to the popular constant coefficient and random coefficient methods. The proposed varying coefficient methods do not require the constant coefficient assumption of effect size homogeneity, nor do they require the random coefficient assumption that the risk differences from the selected studies represent a random sample from a normally distributed superpopulation of risk differences. The proposed varying coefficient methods are shown to have excellent finite-sample performance characteristics under realistic conditions. PMID:23962020

Bonett, Douglas G; Price, Robert M
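The risk difference itself, and the constant coefficient (fixed-effect) pooling the authors compare against, are easy to state concretely. A hedged sketch using textbook Wald intervals and inverse-variance weighting on invented study counts; this is the conventional baseline, not the varying coefficient method the paper proposes:

```python
import math

Z95 = 1.959964  # two-sided 95% normal quantile

def risk_diff(x1, n1, x2, n2):
    # Wald estimate and standard error for p1 - p2 (textbook illustration)
    p1, p2 = x1 / n1, x2 / n2
    rd = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return rd, se

# hypothetical studies: (events_treated, n_treated, events_control, n_control)
studies = [(30, 100, 20, 100), (45, 150, 30, 150), (12, 80, 10, 80)]

# constant coefficient (fixed-effect) pooling by inverse-variance weights
ests = [risk_diff(*s) for s in studies]
w = [1.0 / se ** 2 for _, se in ests]
pooled = sum(wi * rd for wi, (rd, _) in zip(w, ests)) / sum(w)
half = Z95 / math.sqrt(sum(w))
print(round(pooled, 3), round(pooled - half, 3), round(pooled + half, 3))
```

The constant coefficient model implicitly assumes all three studies share one true risk difference; the paper's point is precisely that this homogeneity assumption can be dropped.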



Effects of 2 educational methods on the knowledge, attitude, and practice of women high school teachers in prevention of cervical cancer.  


Because of the increased emphasis on prevention and early detection of cervical cancer, we studied the effects of 2 educational methods on the knowledge, attitude, and practice, as regards prevention of cervical cancer, of women high school teachers in Tabriz. This study was a semi-experimental research. Samples were 129 female teachers divided into 3 groups: experimental 1 (educated by pamphlets), experimental 2 (educated by a lecture and flash cards), and a control group (not manipulated). After a pretest in all 3 groups, the investigators applied the 2 educational methods to the experimental groups. Data regarding the knowledge and attitude of the 3 groups were gathered after 14 days, and data regarding practice were gathered after 2 months. Chi-square and 1-way ANOVA were used for data analysis. Before education, knowledge, attitude, and practice of the 3 groups were the same, but after education there were significant differences in mean scores of knowledge and attitude of the 2 experimental groups as compared with the control group and also between the 2 experimental groups (P < .001). Education by lecture and flash cards was more effective than by pamphlets. In regard to Pap smear practice, there was a significant difference between the 2 experimental groups as compared with the control group (P = .001), but there was no significant difference between the 2 experimental groups. Therefore, the educational methods were effective on knowledge, attitude, and practice of teachers regarding prevention of cervical cancer. Education by lecture and flash cards was more effective than pamphlets in increasing knowledge and inducing a positive attitude, but the 2 educational methods had the same effect on teachers' practice. PMID:15525863

Rezaei, Mahin Baradaran; Seydi, Simin; Alizadeh, Sakineh Mohammad




EPA Science Inventory

The analysis of potentially hazardous air, water and soil samples collected and shipped to service laboratories off-site is time consuming and expensive. This Chapter addresses the practical alternative of performing the requisite analytical services on-site. The most significant...


Linguistically Diverse Students and Special Education: A Mixed Methods Study of Teachers' Attitudes, Coursework, and Practice  

ERIC Educational Resources Information Center

While the number of linguistically diverse students (LDS) grows steadily in the U.S., schools, research and practice to support their education lag behind (Lucas & Grinberg, 2008). Research that describes the attitudes and practices of teachers who serve LDS and how those attitudes and practice intersect with language and special education is…

Greenfield, Renee A.




Microsoft Academic Search

The prevalence of complex acoustic structures in mammalian vocalisations can make it difficult to quantify frequency characteristics. We describe two methods developed for the frequency analysis of a complex swift fox Vulpes velox vocalisation, the barking sequence: (1) autocorrelation function analysis and (2) instantaneous frequency analysis. The autocorrelation function analysis results in an energy density spectrum of the signal's averaged
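
The first of the two techniques, autocorrelation function analysis, can be sketched on a synthetic tone (the sampling rate and frequency below are illustrative stand-ins, not swift fox call data): the dominant frequency falls out of the lag of the largest autocorrelation peak.

```python
import numpy as np

# Synthetic 50 Hz tone sampled at 1 kHz -- a stand-in for a recorded
# call; real barking sequences are far more complex.
fs = 1000.0
t = np.arange(1000) / fs
x = np.sin(2 * np.pi * 50.0 * t)

# Autocorrelation restricted to non-negative lags.
ac = np.correlate(x, x, mode="full")[len(x) - 1:]

# For a periodic signal, the largest peak at nonzero lag sits one
# period away, so the dominant frequency is fs / peak_lag.
peak_lag = int(np.argmax(ac[1:])) + 1
f_est = fs / peak_lag
```

For the noisy, multi-component calls the abstract describes, the autocorrelation is typically averaged or windowed before the peak is read off.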




In-Service Teacher Training in Japan and Turkey: A Comparative Analysis of Institutions and Practices  

ERIC Educational Resources Information Center

The purpose of this study is to compare policies and practices relating to teacher in-service training in Japan and Turkey. On the basis of the findings of the study, suggestions are made about in-service training activities in Turkey. The research was carried using qualitative research methods. In-service training activities in the two education…

Bayrakci, Mustafa



Interpreting the Meaning of Grades: A Descriptive Analysis of Middle School Teachers' Assessment and Grading Practices  

ERIC Educational Resources Information Center

This descriptive, non-experimental, quantitative study was designed to answer the broad question, "What do grades mean?" Core academic subject middle school teachers from one large, suburban school district in Virginia were administered an electronic survey that asked them to report on aspects of their grading practices and assessment methods for…

Grimes, Tameshia Vaden



A Qualitative Analysis of an Advanced Practice Nurse-Directed Transitional Care Model Intervention  

ERIC Educational Resources Information Center

Purpose: The purpose of this study was to describe barriers and facilitators to implementing a transitional care intervention for cognitively impaired older adults and their caregivers led by advanced practice nurses (APNs). Design and Methods: APNs implemented an evidence-based protocol to optimize transitions from hospital to home. An…

Bradway, Christine; Trotta, Rebecca; Bixby, M. Brian; McPartland, Ellen; Wollman, M. Catherine; Kapustka, Heidi; McCauley, Kathleen; Naylor, Mary D.



Objectives: To assess knowledge and practices associated with pesticide use in an agricultural community in Palestine, and to determine the prevalence of self-reported health symptoms related to pesticide exposure. Methods: A cross-sectional questionnaire study in which agricultural farm workers in the Nablus district, Palestine, were interviewed about their knowledge and practices of pesticide use.

EPA Pesticide Factsheets



Case Report: Using Social Network Analysis within a Department of Biomedical Informatics to Induce a Discussion of Academic Communities of Practice  

Microsoft Academic Search

In order to assess the mission and strategic direction in an academic department of biomedical informatics, we used social network analysis to identify patterns of common interest among the department's multidisciplinary faculty. Data representing faculty and their self-identified research methods and expertise were analyzed by applying a network modularity algorithm to detect community structure. Three distinct communities of practice emerged:
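
The modularity score that community-detection algorithms of this kind optimize can be computed in a few lines. The sketch below evaluates Newman-Girvan modularity Q for a candidate split of a toy graph; the graph and partition are illustrative, not the department's faculty network.

```python
from collections import defaultdict

def modularity(edges, communities):
    """Newman-Girvan modularity for an undirected graph:
    Q = (1/2m) * sum_ij (A_ij - k_i*k_j / 2m) * delta(c_i, c_j)."""
    m = len(edges)
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    comm = {}
    for label, members in enumerate(communities):
        for node in members:
            comm[node] = label
    q = 0.0
    # Edge term: each undirected intra-community edge contributes
    # A_ij = 1 twice over ordered pairs.
    for u, v in edges:
        if comm[u] == comm[v]:
            q += 2.0
    # Null-model term over all ordered same-community pairs.
    for members in communities:
        s = sum(deg[n] for n in members)
        q -= s * s / (2.0 * m)
    return q / (2.0 * m)

# Two tight groups of three, joined by a single bridge edge.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
q = modularity(edges, [{0, 1, 2}, {3, 4, 5}])
```

A modularity-maximizing algorithm searches over partitions for the one with the highest Q; the high-Q split here is the two natural groups.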

Jacqueline Merrill; George Hripcsak



Numerical analysis of the symmetric methods  

Microsoft Academic Search

For the initial value problem of second-order ordinary differential equations of the particular form y'' = f(x, y), the symmetric methods (Quinlan and Tremaine, 1990) and our methods (Xu and Zhang, 1994) have been compared in detail by integrating artificial earth satellite orbits in this paper. In the end, we point out clearly that the integral accuracy of numerical integration of the satellite
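
As a minimal illustration of why symmetric (time-reversible) methods suit long orbit integrations, the sketch below integrates a harmonic oscillator — a toy stand-in for the satellite dynamics, not the schemes compared in the paper — with the symmetric velocity-Verlet method and checks that the energy error stays bounded rather than drifting.

```python
def velocity_verlet(x, v, dt, steps):
    """Integrate x'' = -x with the symmetric velocity-Verlet scheme,
    a simple time-reversible method of the kind the abstract discusses."""
    a = -x
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt
        a_new = -x
        v += 0.5 * (a + a_new) * dt
        a = a_new
    return x, v

x0, v0 = 1.0, 0.0
x, v = velocity_verlet(x0, v0, dt=0.01, steps=10_000)

# For symmetric schemes the energy error stays bounded (here O(dt^2))
# over long integrations, which is what makes them attractive for
# satellite-orbit work.
energy_error = abs(0.5 * (v * v + x * x) - 0.5)
```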

Ji-Hong Xu; A-Li Zhang



Initiating and managing risk assessments within a risk analysis framework: FDA/CFSAN'S practical approach.  


Management of risk analysis involves the integration and coordination of activities associated with risk assessment, risk management, and risk communication. Risk analysis is used to guide regulatory decision making, including trade decisions at national and international levels. The U.S. Food and Drug Administration Center for Food Safety and Applied Nutrition (CFSAN) formed a working group to evaluate and improve the quality and consistency of major risk assessments conducted by the Center. Drawing on risk analysis experiences, CFSAN developed a practical framework for initiating and managing risk assessments, including addressing issues related to (i) commissioning a risk assessment, (ii) interactions between risk managers and risk assessors, and (iii) peer review. PMID:15453602

Buchanan, Robert L; Dennis, Sherri; Miliotis, Marianne



Obesity in social media: a mixed methods analysis.  


The escalating obesity rate in the USA has made obesity prevention a top public health priority. Recent interventions have tapped into the social media (SM) landscape. To leverage SM in obesity prevention, we must understand user-generated discourse surrounding the topic. This study was conducted to describe SM interactions about weight through a mixed methods analysis. Data were collected across 60 days through SM monitoring services, yielding 2.2 million posts. Data were cleaned and coded through Natural Language Processing (NLP) techniques, yielding popular themes and the most retweeted content. Qualitative analyses of selected posts add insight into the nature of the public dialogue and motivations for participation. Twitter represented the most common channel. Twitter and Facebook were dominated by derogatory and misogynist sentiment, pointing to weight stigmatization, whereas blogs and forums contained more nuanced comments. Other themes included humor, education, and positive sentiment countering weight-based stereotypes. This study documented weight-related attitudes and perceptions. This knowledge will inform public health/obesity prevention practice. PMID:25264470

Chou, Wen-Ying Sylvia; Prestin, Abby; Kunath, Stephen



Fatigue research for rotating mirror of ultra-high speed camera through numerical analysis and experimental methods  

Microsoft Academic Search

It is the major failure mode of high-cycle fatigue for rotating mirror. Test methods for fatigue are commonly used in researching the fatigue life of rotating mirror, but not practically. In this paper, numerical analysis and experimental were used for researching the fatigue life of rotating mirror. With the finite element analysis software ANSYS, a static strength about the rotating

Chun-Bo Li; Chun-Hui Yu; Chun-Ping Liu; Jin-Long Chai; Hong-Zhi Wang; Jing-Zhen Li; Hong-Bin Huang



Technology transfer through a network of standard methods and recommended practices - The case of petrochemicals  

NASA Astrophysics Data System (ADS)

Technology transfer may take place in parallel with cooperative action between companies participating in the same organizational scheme or using one another as subcontractor (outsourcing). In this case, cooperation should be realized by means of Standard Methods and Recommended Practices (SRPs) to achieve (i) quality of intermediate/final products according to specifications and (ii) industrial process control as required to guarantee such quality with minimum deviation (corresponding to maximum reliability) from preset mean values of representative quality parameters. This work deals with the design of the network of SRPs needed in each case for successful cooperation, implying also the corresponding technology transfer, effectuated through a methodological framework developed in the form of an algorithmic procedure with 20 activity stages and 8 decision nodes. The functionality of this methodology is demonstrated by presenting the path leading from (and relating) a standard test method for toluene, as petrochemical feedstock in toluene diisocyanate production, to the (6 generations distance upstream) performance evaluation of industrial process control systems (i.e., from ASTM D5606 to BS EN 61003-1:2004 in the SRPs network).

Batzias, Dimitris F.; Karvounis, Sotirios



Practical methods for using vegetation patterns to estimate avalanche frequency and magnitude  

NASA Astrophysics Data System (ADS)

Practitioners working in avalanche terrain may never witness an extreme event, but understanding extreme events is important for categorizing avalanches that occur within a given season. Historical records of avalanche incidents and direct observations are the most reliable evidence of avalanche activity, but patterns in vegetation can be used to further quantify and map the frequency and magnitude of past events. We surveyed published literature to synthesize approaches for using vegetation sampling to characterize avalanche terrain, and developed examples to identify the benefits and caveats of using different practical field methods to estimate avalanche frequency and magnitude. Powerful avalanches can deposit massive piles of snow, rocks, and woody debris in runout zones. Large avalanches (relative to the path) can cut fresh trimlines, widening their tracks by uprooting, stripping, and breaking trees. Discs and cores can be collected from downed trees to detect signals of past avalanche disturbance recorded in woody plant tissue. Signals of disturbance events recorded in tree rings can include direct impact scars from the moving snow and wind blast, development of reaction wood in response to tilting, and abrupt variation in the relative width of annual growth rings. The relative ages of trees in avalanche paths and the surrounding landscape can be an indicator of the area impacted by past avalanches. Repeat photography can also be useful to track changes in vegetation over time. For Colorado, and perhaps elsewhere, several vegetation ecology methods can be used in combination to accurately characterize local avalanche frequency and magnitude.

Simonson, S.; Fassnacht, S. R.



Image Analysis Using Multigrid Relaxation Methods  

Microsoft Academic Search

Image analysis problems, posed mathematically as variational principles or as partial differential equations, are amenable to numerical solution by relaxation algorithms that are local, iterative, and often parallel. Although they are well suited structurally for implementation on massively parallel, locally interconnected computational architectures, such distributed algorithms are seriously handicapped by an inherent inefficiency at propagating constraints between widely separated
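
The inefficiency described above can be seen in a toy 1D relaxation (an illustrative stand-in for the image-analysis problems, not the multigrid scheme itself). Jacobi iteration on Laplace's equation moves boundary information only one cell per sweep and damps smooth error very slowly, so even after many sweeps the solution is far from converged; the grid size and sweep count below are arbitrary.

```python
import numpy as np

# 1D Laplace problem u'' = 0 with u(0) = 0, u(1) = 1; interior points
# updated by local Jacobi relaxation.
n = 65
u = np.zeros(n)
u[-1] = 1.0

for sweep in range(100):
    u[1:-1] = 0.5 * (u[:-2] + u[2:])

# Local averaging damps long-wavelength error extremely slowly, so
# after 100 sweeps the iterate is still far from the exact straight
# line u(x) = x -- the inefficiency multigrid methods address.
exact = np.linspace(0.0, 1.0, n)
error = np.abs(u - exact).max()
```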

Demetri Terzopoulos



Analysis of Two Methods to Evaluate Antioxidants  

ERIC Educational Resources Information Center

This exercise is intended to introduce undergraduate biochemistry students to the analysis of antioxidants as a biotechnological tool. In addition, some statistical resources will also be used and discussed. Antioxidants play an important metabolic role, preventing oxidative stress-mediated cell and tissue injury. Knowing the antioxidant content…

Tomasina, Florencia; Carabio, Claudio; Celano, Laura; Thomson, Leonor




E-print Network

Centre, University of Kuopio, Finland H.A. Soriyan & K. C. Olufokunbi, Dept. of Computer Science Analysis and Development fairly extensively. In the second part we compare work development information technology projects are currently one of the most common sources of change in workplaces

Bertelsen, Olav W.



EPA Science Inventory

The report provides information on coal sampling and analysis (CSD) techniques and procedures and presents a statistical model for estimating SO2 emissions. (New Source Performance Standards for large coal-fired boilers and certain State Implementation Plans require operators to ...


Decomposition Methods for Fault Tree Analysis  

Microsoft Academic Search

Some kinds of fault tree analysis are described for which cut set enumeration is inadequate. Modularization leads to more efficient computer programs, and also identifies subsystems which are intuitively meaningful. The problem of finding all modules of a fault tree is formulated as an extension of the problem of finding all "cut-points" of an undirected graph. The major result is

Arnon Rosenthal



Environmental Impact Analysis: Philosophy and Methods.  

ERIC Educational Resources Information Center

Proceedings of the Conference on Environmental Impact Analysis held in Green Bay, Wisconsin, January 4-5, 1972, are compiled in this report. The conference served as a forum for exchange of information among State and Federal agencies and educators on experiences with the National Environmental Policy Act of 1970. Hopefully, results of the…

Ditton, Robert B.; Goodale, Thomas L.



Microsoft Academic Search

Classical reliability methods such as First- and Second-Order Reliability Methods (FORM and SORM) have been important breakthroughs toward feasible and reliable integration of probabilistic information and uncertainty analysis into advanced design methods and modern design codes. These methods have been successfully used in solving challenging reliability problems. Nevertheless, caution should be used in the applications of these methods since their

M. Barbato; J. P. Conte



Methods for sampling and inorganic analysis of coal  

USGS Publications Warehouse

Methods used by the U.S. Geological Survey for the sampling, comminution, and inorganic analysis of coal are summarized in this bulletin. Details, capabilities, and limitations of the methods are presented.

Edited by Golightly, D. W.; Simon, Frederick Otto.



Common cause analysis : a review and extension of existing methods  

E-print Network

The quantitative common cause analysis code, MOBB, is extended to include uncertainties arising from modelling uncertainties and data uncertainties. Two methods, Monte Carlo simulation and the Method-of-Moments are used ...
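
The two uncertainty-propagation techniques named above can be sketched on a toy problem: propagating two independent, uncertain failure rates through a product (as in a two-event cut set) by Monte Carlo simulation, and checking the result against the closed-form moment expression. The lognormal distributions and parameters are made up for illustration, not taken from the MOBB code.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two independent, uncertain failure rates (parameters are illustrative).
mu1, s1 = -2.0, 0.5
mu2, s2 = -3.0, 0.3
x = rng.lognormal(mu1, s1, 200_000)
y = rng.lognormal(mu2, s2, 200_000)

# Monte Carlo propagation of the product, e.g. a two-event cut set.
z = x * y
mc_mean = z.mean()

# Method of moments: for independent factors the means multiply, and a
# lognormal's mean is exp(mu + s^2 / 2).
mom_mean = np.exp(mu1 + s1**2 / 2) * np.exp(mu2 + s2**2 / 2)
```

With enough samples the Monte Carlo estimate converges on the moment result; in a real common cause analysis both would be applied to the full cut-set expression, not a single product.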

Heising, Carolyn D.



Promoting recovery-oriented practice in mental health services: a quasi-experimental mixed-methods study  

PubMed Central

Background Recovery has become an increasingly prominent concept in mental health policy internationally. However, there is a lack of guidance regarding organisational transformation towards a recovery orientation. This study evaluated the implementation of recovery-orientated practice through training across a system of mental health services. Methods The intervention comprised four full-day workshops and an in-team half-day session on supporting recovery. It was offered to 383 staff in 22 multidisciplinary community and rehabilitation teams providing mental health services across two contiguous regions. A quasi-experimental design was used for evaluation, comparing behavioural intent with staff from a third contiguous region. Behavioural intent was rated by coding points of action on the care plans of a random sample of 700 patients (400 intervention, 300 control), before and three months after the intervention. Action points were coded for (a) focus of action, using predetermined categories of care; and (b) responsibility for action. Qualitative inquiry was used to explore staff understanding of recovery, implementation in services and the wider system, and the perceived impact of the intervention. Semi-structured interviews were conducted with 16 intervention group team leaders post-training and an inductive thematic analysis undertaken. Results A total of 342 (89%) staff received the intervention. Care plans of patients in the intervention group showed significantly more changes, with evidence of change in the content of patients' care plans (OR 10.94, 95% CI 7.01-17.07) and in the attributed responsibility for the actions detailed (OR 2.95, 95% CI 1.68-5.18). Nine themes emerged from the qualitative analysis, split into two superordinate categories. 'Recovery, individual and practice' describes the perception and provision of recovery-orientated care by individuals and at a team level.
It includes themes on care provision, the role of hope, language of recovery, ownership and multidisciplinarity. ‘Systemic implementation’, describes organizational implementation and includes themes on hierarchy and role definition, training approaches, measures of recovery and resources. Conclusions Training can provide an important mechanism for instigating change in promoting recovery-orientated practice. However, the challenge of systemically implementing recovery approaches requires further consideration of the conceptual elements of recovery, its measurement, and maximising and demonstrating organizational commitment. PMID:23764121



Applying the 5-Step Method to Children and Affected Family Members: Opportunities and Challenges within Policy and Practice  

ERIC Educational Resources Information Center

The main aim of this article is to consider how the 5-Step Method could be developed to meet the needs of affected family members (AFMs) with children under the age of 18. This would be an entirely new development. This article examines opportunities and challenges within practice and policy and makes suggestions on how the Method could be taken…

Harwin, Judith



Enriching Careers and Lives: Introducing a Positive, Holistic, and Narrative Career Counseling Method that Bridges Theory and Practice  

ERIC Educational Resources Information Center

CareerCycles (CC) career counseling framework and method of practice integrates and builds on aspects of positive psychology. Through its holistic and narrative approach, the CC method seeks to collaboratively identify and understand clients' career and life stories. It focuses on their strengths, desires, preferences, assets, future…

Zikic, Jelena; Franklin, Mark



Method transfer, partial validation, and cross validation: recommendations for best practices and harmonization from the global bioanalysis consortium harmonization team.  


This paper presents the recommendations of the Global Bioanalytical Consortium Harmonization Team on method transfer, partial validation, and cross validation. These aspects of bioanalytical method validation, while important, have received little detailed attention in recent years. The team has attempted to define, separate, and describe these related activities, and present practical guidance in how to apply these techniques. PMID:25190270

Briggs, R J; Nicholson, R; Vazvaei, F; Busch, J; Mabuchi, M; Mahesh, K S; Brudny-Kloeppel, M; Weng, N; Galvinas, P A R; Duchene, P; Hu, Pei; Abbott, R W




E-print Network

CONVERGENCE ANALYSIS OF SAMPLING METHODS FOR PERTURBED LIPSCHITZ FUNCTIONS D. E. FINKEL AND C. T to a wide variety of deterministic sampling methods. For bound-constrained problems, we show that any method function. This enables us to apply the theory to the paradigm which motivates sampling methods

Kelley, C. T. "Tim"


Adaptive computational methods for aerothermal heating analysis  

NASA Technical Reports Server (NTRS)

The development of adaptive gridding techniques for finite-element analysis of fluid dynamics equations is described. The developmental work was done with the Euler equations with concentration on shock and inviscid flow field capturing. Ultimately this methodology is to be applied to a viscous analysis for the purpose of predicting accurate aerothermal loads on complex shapes subjected to high speed flow environments. The development of local error estimate strategies as a basis for refinement strategies is discussed, as well as the refinement strategies themselves. The application of the strategies to triangular elements and a finite-element flux-corrected-transport numerical scheme are presented. The implementation of these strategies in the GIM/PAGE code for 2-D and 3-D applications is documented and demonstrated.
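
The refine-where-the-indicator-is-large strategy can be cartooned in 1D. This is only a sketch of the general idea — bisect any interval whose local error estimate exceeds a tolerance — not the finite-element estimators or the GIM/PAGE implementation; the test function and tolerances are invented.

```python
import numpy as np

def refine(xs, f, tol=0.01, passes=5):
    """Bisect every interval whose midpoint interpolation error -- a
    crude local error estimate -- exceeds tol. A 1D cartoon of adaptive
    refinement, not the GIM/PAGE scheme."""
    for _ in range(passes):
        new = [xs[0]]
        for a, b in zip(xs[:-1], xs[1:]):
            mid = 0.5 * (a + b)
            # Estimated interpolation error on this interval.
            if abs(f(mid) - 0.5 * (f(a) + f(b))) > tol:
                new.append(mid)  # error too large: split the interval
            new.append(b)
        xs = np.array(new)
    return xs

f = lambda x: np.tanh(20 * x)  # sharp layer near x = 0 (stand-in for a shock)
xs = refine(np.linspace(-1, 1, 9), f)
widths = np.diff(xs)
```

Grid points cluster around the steep layer while the smooth regions keep the coarse spacing — the same behavior sought from shock capturing in the flow solver.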

Price, John M.; Oden, J. Tinsley



Analysis Resistant Cipher Method and Apparatus  

NASA Technical Reports Server (NTRS)

A system for encoding and decoding data words including an anti-analysis encoder unit for receiving an original plaintext and producing a recoded data, a data compression unit for receiving the recoded data and producing a compressed recoded data, and an encryption unit for receiving the compressed recoded data and producing an encrypted data. The recoded data has an increased non-correlatable data redundancy compared with the original plaintext in order to mask the statistical distribution of characters in the plaintext data. The system of the present invention further includes a decryption unit for receiving the encrypted data and producing a decrypted data, a data decompression unit for receiving the decrypted data and producing an uncompressed recoded data, and an anti-analysis decoder unit for receiving the uncompressed recoded data and producing a recovered plaintext that corresponds with the original plaintext.
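
The encode, compress, encrypt chain and its mirrored reverse path can be sketched with deliberately simple stand-ins for each unit: a random-mask recoding that flattens byte statistics, zlib compression, and a toy XOR keystream. None of this is the patented method, and the XOR stage is not secure; note also that in this toy the uniform recoding defeats compression (the stream grows), whereas the patent's recoder is designed to add redundancy that remains compressible.

```python
import os
import zlib
from itertools import cycle

def recode(data: bytes) -> bytes:
    """Anti-analysis stand-in: emit (r, b XOR r) per byte so every
    output byte is uniformly distributed, masking the plaintext's
    character statistics."""
    out = bytearray()
    for b in data:
        r = os.urandom(1)[0]
        out += bytes([r, b ^ r])
    return bytes(out)

def decode(data: bytes) -> bytes:
    # r XOR (b XOR r) recovers b for each pair.
    return bytes(data[i] ^ data[i + 1] for i in range(0, len(data), 2))

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy keystream 'encryption' (NOT secure; placeholder for the
    patent's encryption unit). XOR is its own inverse."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"demo-key"
plaintext = b"attack at dawn, attack at dawn"

# Forward path: recode -> compress -> encrypt.
ciphertext = xor_cipher(zlib.compress(recode(plaintext)), key)
# Reverse path: decrypt -> decompress -> decode.
recovered = decode(zlib.decompress(xor_cipher(ciphertext, key)))
```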

Oakley, Ernest C. (Inventor)



Learning in the Permaculture Community of Practice in England: An Analysis of the Relationship between Core Practices and Boundary Processes  

ERIC Educational Resources Information Center

Purpose: This article utilizes the Communities of Practice (CoP) framework to examine learning processes among a group of permaculture practitioners in England, specifically examining the balance between core practices and boundary processes. Design/methodology/approach: The empirical basis of the article derives from three participatory workshops…

Ingram, Julie; Maye, Damian; Kirwan, James; Curry, Nigel; Kubinakova, Katarina



Sensitivity analysis of hot channel calculation methods  

Microsoft Academic Search

In safety analysis, the fulfillment of acceptance criteria is usually evaluated by separate hot channel and/or hot assembly thermal hydraulic/fuel behavior calculations. The whole range of the relevant input parameters (e.g. power distributions, burnup, heat conduction data, inlet temperature, etc.) must be taken into account. Concerning these parameters, the most frequent conservative approach is to select the limiting values, partly

I. Panka; M. Telbisz



Digital Forensics Analysis of Spectral Estimation Methods  

Microsoft Academic Search

Steganography is the art and science of writing hidden messages in such a way that no one apart from the intended recipient knows of the existence of the message. Today it is widely used to secure information. Since digital forensics aims to detect, recover and examine digital evidence and steganography is a method for

Tolga Mataracioglu; Unal Tatar



Numerical error analysis of direct integration method  

Microsoft Academic Search

Numerical errors of PALLAS calculations due to spatial mesh sizes are examined for a typical deep penetration shielding problem of isotropic incident fission neutrons penetrating a 200-cm-thick water slab. The exponential approximation for the source spatial distribution to solve the transport equation based on the direct integration method is verified to be effective for radiation transport in an attenuating medium, while

K. Takeuchi; N. Sasamoto



Analysis methods of human cell migration  

Microsoft Academic Search

The autonomous migration of specialized cells is an essential characteristic in both physiological and pathological functions in the adult human organism. Leukocytes, fibroblasts, and stem cells, but also tumor cells, are thus the subject of intense investigation in a broad range of research fields. A wide spectrum of methods have therefore been established to analyze chemokinetic and chemotactic cell migration,

Frank Entschladen; Theodore L. Drell; Kerstin Lang; Kai Masur; Daniel Palm; Philipp Bastian; Bernd Niggemann; Kurt S. Zaenker



Use of asymptotic methods in vibration analysis  

Microsoft Academic Search

The derivation of dynamic differential equations, suitable for studying the vibrations of rotating, curved, slender structures was examined, and the Hamiltonian procedure was advocated for this purpose. Various reductions of the full system are displayed, which govern the vibrating troposkien when various order of magnitude restrictions are placed on important parameters. Possible advantages of the WKB asymptotic method for solving

H. Ashley



Methods of soil P analysis in archaeology  

Microsoft Academic Search

Phosphorus (P) is unique among the elements in being a sensitive and persistent indicator of human activity. It has long been of interest to archaeologists because of its potential to inform them about the presence of past human occupation and to offer clues regarding the type and intensity of human activity. A wide variety of methods have been developed in

Vance T. Holliday; William G. Gartner



The Ford Method: A Sensitivity Analysis Approach  

Microsoft Academic Search

In dynamic models, a system behavior is determined by the interaction of its feedback loops. The challenge for system dynamics modellers is to identify these loops, and also understand, over the runtime of a model, which loops dominate system behavior. The Ford method is a procedure that identifies changes in atomic behavior patterns in the presence,

Jinjing Huang; Enda Howley; Jim Duggan


A conic-section simulation analysis of two-dimensional fracture problems using the finite element method  

Microsoft Academic Search

A conic-section simulation analysis to determine the stress intensity factors for fracture mechanics problems of practical interest using the finite element method is presented. The method makes use of elliptic displacement functions which are satisfied by the introduction of an “equivalent ellipse” obtained through first simulating the actual crack surface displacements as a part of a parabola or a hyperbola.

C. L. Chow; K. J. Lau



The Pseudospectral Method and Discrete Spectral Analysis  

Microsoft Academic Search

One of the focal points of the research at the Centre for Nonlinear Studies is related to the numerical simulation of the emergence, propagation and interaction of solitary waves and solitons in nonlinear dispersive media. Based on the discrete Fourier transform, the pseudospectral method can be used for the numerical integration of the model equations, and the Fourier transform related

Andrus Salupere


Nurses’ self-efficacy and practices relating to weight management of adult patients: a path analysis  

PubMed Central

Background Health professionals play a key role in the prevention and treatment of excess weight and obesity, but many have expressed a lack of confidence in their ability to manage obese patients, and their delivery of weight-management care remains limited. The specific mechanism underlying inadequate practices in professional weight management remains unclear. The primary purpose of this study was to examine a self-efficacy theory-based model in understanding Registered Nurses' (RNs) professional performance relating to weight management. Methods A self-report questionnaire was developed based upon the hypothesized model and administered to a convenience sample of 588 RNs. Data were collected regarding socio-demographic variables, psychosocial variables (attitudes towards obese people, professional role identity, teamwork beliefs, perceived skills, perceived barriers and self-efficacy) and professional weight management practices. Structural equation modeling was conducted to identify correlations between the above variables and to test the goodness of fit of the proposed model. Results The survey response rate was 71.4% (n = 420). The respondents reported a moderate level of weight management practices. Self-efficacy directly and positively predicted the weight management practices of the RNs (β = 0.36, p < 0.001). The final model constructed in this study demonstrated a good fit to the data [χ²(14) = 13.90, p = 0.46; GFI = 0.99; AGFI = 0.98; NNFI = 1.00; CFI = 1.00; RMSEA = 0.00; AIC = 57.90], accounting for 38.4% and 43.2% of the variance in weight management practices and self-efficacy, respectively. Conclusions Self-efficacy theory appears to be useful in understanding the weight management practices of RNs. Interventions targeting the enhancement of self-efficacy may be effective in promoting RNs' professional performance in managing overweight and obese patients. PMID:24304903




SciTech Connect

The most common method of analysis for beryllium is inductively coupled plasma atomic emission spectrometry (ICP-AES). This method, along with inductively coupled plasma mass spectrometry (ICP-MS), is discussed in Chapter 6. However, other methods exist and have been used for different applications. These methods include spectroscopic, chromatographic, colorimetric, and electrochemical. This chapter provides an overview of beryllium analysis methods other than plasma spectrometry (inductively coupled plasma atomic emission spectrometry or mass spectrometry). The basic methods, detection limits and interferences are described. Specific applications from the literature are also presented.

Ekechukwu, A



A review of rapid methods for the analysis of mycotoxins.  


An overview is presented of the analysis of mycotoxins by rapid methods such as: enzyme linked immunosorbent assay (ELISA); flow-through membrane-based immunoassay; immunochromatographic assay; fluorometric assay with an immunoaffinity clean-up column or with a solid phase extraction clean-up column; and the fluorescence polarization method. These methods are commercially available, reliable, and rapid. This review focuses on the basic principle of each rapid method as well as the advantages and limitations of each. Additionally, we address other emerging technologies of potential application in the analysis of mycotoxins. PMID:16649076

Zheng, Michael Z; Richard, John L; Binder, Johann



Implementing a Virtual Community of Practice for Family Physician Training: A Mixed-Methods Case Study  

PubMed Central

Background General practitioner (GP) training in Australia can be professionally isolating, with trainees spread across large geographic areas, leading to problems with rural workforce retention. Virtual communities of practice (VCoPs) may provide a way of improving knowledge sharing and thus reducing professional isolation. Objective The goal of our study was to review the usefulness of a 7-step framework for implementing a VCoP for GP training and then to evaluate the usefulness of the resulting VCoP in facilitating knowledge sharing and reducing professional isolation. Methods The case was set in an Australian general practice training region involving 55 first-term trainees (GPT1s), from January to July 2012. ConnectGPR was a secure, online community site that included standard community options such as discussion forums, blogs, newsletter broadcasts, webchats, and photo sharing. A mixed-methods case study methodology was used. Results are presented and interpreted for each step of the VCoP 7-step framework and then in terms of the outcomes of knowledge sharing and overcoming isolation. Results Step 1, Facilitation: Regular, personal facilitation by a group of GP trainers with a co-ordinating facilitator was an important factor in the success of ConnectGPR. Step 2, Champion and Support: Leadership and stakeholder engagement were vital. Further benefits are possible if the site is recognized as contributing to training time. Step 3, Clear Goals: Clear goals of facilitating knowledge sharing and improving connectedness helped to keep the site discussions focused. Step 4, A Broad Church: The ConnectGPR community was too narrow, focusing only on first-term trainees (GPT1s). Ideally there should be more involvement of senior trainees, trainers, and specialists. Step 5, A Supportive Environment: Facilitators maintained community standards and encouraged participation.
Step 6, Measurement Benchmarking and Feedback: Site activity was primarily driven by centrally generated newsletter feedback. Viewing comments by other participants helped users benchmark their own knowledge, particularly around applying guidelines. Step 7, Technology and Community: All the community tools were useful, but chat was limited and users suggested webinars in future. A larger user base and more training may also be helpful. Time is a common barrier. Trust can be built online, which may have benefit for trainees that cannot attend face-to-face workshops. Knowledge sharing and isolation outcomes: 28/34 (82%) of the eligible GPT1s enrolled on ConnectGPR. Trainees shared knowledge through online chat, forums, and shared photos. In terms of knowledge needs, GPT1s rated their need for cardiovascular knowledge more highly than supervisors. Isolation was a common theme among interview respondents, and ConnectGPR users felt more supported in their general practice (13/14, 92.9%). Conclusions The 7-step framework for implementation of an online community was useful. Overcoming isolation and improving connectedness through an online knowledge sharing community shows promise in GP training. Time and technology are barriers that may be overcome by training, technology, and valuable content. In a VCoP, trust can be built online. This has implications for course delivery, particularly in regional areas. VCoPs may also have a specific role assisting overseas trained doctors to interpret their medical knowledge in a new context. PMID:24622292

Jones, Sandra C; Caton, Tim; Iverson, Don; Bennett, Sue; Robinson, Laura




E-print Network

ON THE RECURRENCE PLOT ANALYSIS METHOD BEHAVIOUR UNDER SCALING TRANSFORM. F.-M. Birleanu et al., University of Pitesti, Romania. ABSTRACT: In the last decade, the applications of the recurrence plot analysis […] the behaviour of the recurrence plot analysis method in the context of analyzing some finite duration signals
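The recurrence plot method referenced above thresholds pairwise distances between points of a signal or trajectory. A minimal illustrative sketch in pure Python (not the authors' implementation; the threshold `eps` and the test signal are assumptions):

```python
import math

def recurrence_matrix(x, eps):
    """Binary recurrence matrix: R[i][j] = 1 when |x[i] - x[j]| < eps."""
    n = len(x)
    return [[1 if abs(x[i] - x[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(R):
    """Fraction of recurrent points, one basic recurrence quantifier."""
    n = len(R)
    return sum(map(sum, R)) / (n * n)

# A periodic signal recurs along diagonal stripes spaced one period apart.
signal = [math.sin(2 * math.pi * i / 20) for i in range(100)]
R = recurrence_matrix(signal, eps=0.1)
```

Because `eps` is compared against raw amplitudes, scaling the signal without rescaling `eps` changes the matrix, which is exactly the kind of sensitivity a scaling-transform study examines.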

Paris-Sud XI, Université de


Application of integrated fluid-thermal-structural analysis methods  

Microsoft Academic Search

Hypersonic vehicles operate in a hostile aerothermal environment which has a significant impact on their aerothermostructural performance. Significant coupling occurs between the aerodynamic flow field, structural heat transfer, and structural response creating a multidisciplinary interaction. Interfacing state-of-the-art disciplinary analysis methods is not efficient, hence interdisciplinary analysis methods integrated into a single aerothermostructural analyzer are needed. The NASA Langley Research Center

A. R. Wieting; P. Dechaumphai; K. S. Bey; E. A. Thornton; K. Morgan



Research on the demodulation method based on the wavelet analysis  

Microsoft Academic Search

Based on the wavelet analysis method and Hilbert envelope demodulation, a method for the fault diagnosis of rolling bearings is proposed in this paper. The local Hilbert demodulation and local wavelet transform are introduced respectively. The wavelet transform is used to translate vibration signals into a time-scale representation. Then, an envelope signal can be obtained by envelope spectrum analysis of the wavelet coefficients
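The envelope-demodulation half of such an approach can be sketched in a few lines. This illustrative NumPy fragment computes the amplitude envelope via the FFT-based analytic signal; the fault rate, carrier frequency, and sampling values are assumptions, and the wavelet band-selection step is omitted:

```python
import numpy as np

def envelope(x):
    """Amplitude envelope via the analytic signal (FFT-based Hilbert transform)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0                # keep the DC bin
    h[1:(n + 1) // 2] = 2.0   # double the positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1.0       # keep the Nyquist bin
    return np.abs(np.fft.ifft(X * h))

# Bearing-fault vibration is often modeled as a resonance carrier modulated
# at the fault repetition rate; the envelope recovers the modulation.
fs = 1000
t = np.arange(1000) / fs
fault_rate, carrier = 5.0, 50.0
x = (1.0 + 0.5 * np.cos(2 * np.pi * fault_rate * t)) * np.cos(2 * np.pi * carrier * t)
env = envelope(x)   # close to 1 + 0.5*cos(2*pi*fault_rate*t)
```

A spectrum of `env` would then show a peak at the 5 Hz fault rate rather than at the 50 Hz carrier, which is the point of envelope demodulation for bearing diagnosis.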

Ling-Li Cui; Li-Xin Gao; Guo-Dong Wang



A Critical Appraisal of Current Practice in the Detection, Analysis, and Reporting of Cryoglobulins  

Microsoft Academic Search

To assess current practice in the detection, analysis, and reporting of cryoglobulins, a questionnaire was sent to 140 laboratories. Only 36% of laboratories used standard procedures (tube preheating, transport in a container, and sedimentation and/or centrifugation at 37 °C) to ensure that the temperature did not drop below 37 °C until after serum separation. Time periods allowed for cryoprecipitation at 4 °C varied from 12 h to 9 days, with 30% of laboratories allowing

Pieter Vermeersch; Koenraad Gijbels; Godelieve Marien; Rod Lunn; William Egner; Peter White; Xavier Bossuyt


An experimental study of practical computerized scatter correction methods for prototype digital breast tomosynthesis  

NASA Astrophysics Data System (ADS)

Digital breast tomosynthesis (DBT) is a technique developed to overcome the limitations of conventional digital mammography by reconstructing slices through the breast from projections acquired at different angles. In developing and optimizing DBT, x-ray scatter reduction remains a significant challenge due to projection geometry and radiation dose limitations. The most common approach to scatter reduction is the beam-stop-array (BSA) algorithm, although this method raises the concern of additional exposure to acquire the scatter distribution. The compressed breast is roughly symmetric, and the scatter profiles of projections acquired at axially opposite angles are approximately mirror images of each other. The purpose of this study was to apply the BSA algorithm while acquiring only two scans with a beam stop array, estimating the scatter distribution with minimal additional exposure. The results of scatter correction with angular interpolation were comparable to those of scatter correction using scatter distributions measured at every angle, and the exposure increase was less than 13%. This study demonstrated the feasibility of scatter correction by the BSA algorithm with minimal additional exposure, which indicates its practical applicability in clinical situations.
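The beam-stop idea can be sketched in 1D (illustrative only, not the authors' code): behind a beam stop the primary is blocked, so the detector reads scatter alone; those samples are interpolated across the detector and subtracted, and the mirrored estimate can stand in for the axially opposite projection. All numbers below are synthetic assumptions:

```python
import numpy as np

def correct_scatter(projection, stop_positions, detector_x):
    """Interpolate scatter-only readings taken behind beam stops over the
    whole detector and subtract the estimate from the projection."""
    scatter_samples = projection[stop_positions]
    scatter_map = np.interp(detector_x, detector_x[stop_positions], scatter_samples)
    return projection - scatter_map, scatter_map

# Synthetic 1D projection: smooth scatter background plus a primary signal.
x = np.arange(256, dtype=float)
scatter = 20.0 + 5.0 * np.sin(np.pi * x / 255.0)
primary = 100.0 * np.exp(-((x - 128.0) ** 2) / (2 * 40.0 ** 2))
stops = np.arange(8, 256, 32)        # assumed beam-stop locations
measured = primary + scatter
measured[stops] = scatter[stops]     # behind a stop, only scatter is recorded
corrected, est = correct_scatter(measured, stops, x)

# For the axially opposite projection, the mirrored estimate can be reused:
opposite_scatter_estimate = est[::-1]
```

The pixels under the stops must be filled in from neighbors in practice; here the sketch simply shows that subtracting the interpolated map recovers the primary signal elsewhere.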

Kim, Y.; Kim, H.; Park, H.; Choi, J.; Choi, Y.



Tournament Methods for WLAN: Analysis and Efficiency  

NASA Astrophysics Data System (ADS)

In the context of radio distributed networks, we present a generalized approach to Medium Access Control (MAC) with a fixed congestion window. Our protocol is quite simple to analyze and can be used in many different situations. We give mathematical evidence showing that its performance is asymptotically tight. We also place ourselves in the WiFi and WiMAX frameworks, and discuss experimental results showing a collision reduction of 14% to 21% compared to the best-known methods. We discuss channel capacity improvement and fairness considerations.

Galtier, Jérôme


Modern Statistical Methods for GLAST Event Analysis  

E-print Network

We describe a statistical reconstruction methodology for the GLAST LAT. The methodology incorporates in detail the statistics of the interactions of photons and charged particles with the tungsten layers in the LAT, and uses the scattering distributions to compute the full probability distribution over the energy and direction of the incident photons. It uses model selection methods to estimate the probabilities of the possible geometrical configurations of the particles produced in the detector, and numerical marginalization over the energy loss and scattering angles at each layer. Preliminary results show that it can improve on the tracker-only energy estimates for muons and electrons incident on the LAT.

Robin D. Morris; Johann Cohen-Tanugi



The politics of historical discourse analysis: a qualitative research method?  

Microsoft Academic Search

This article deals with the ways in which historical discourse analysis is at once different from and similar to research described as qualitative or quantitative. It discusses the consequences of applying the standards of such methods to historical discourse analysis. It is pointed out that although the merit of research using historical discourse analysis must not be judged by the

Ingólfur Ásgeir Jóhannesson



Participant Interaction in Asynchronous Learning Environments: Evaluating Interaction Analysis Methods  

ERIC Educational Resources Information Center

The purpose of this empirical study was to determine the extent to which three different objective analytical methods--sequence analysis, surface cohesion analysis, and lexical cohesion analysis--can most accurately identify specific characteristics of online interaction. Statistically significant differences were found in all points of…

Blanchette, Judith



Methods for Evidence-Based Practice: Quantitative Synthesis of Single-Subject Designs  

ERIC Educational Resources Information Center

Good quantitative evidence does not require large, aggregate group designs. The authors describe ground-breaking work in managing the conceptual and practical demands in developing meta-analytic strategies for single subject designs in an effort to add to evidence-based practice. (Contains 2 figures.)

Shadish, William R.; Rindskopf, David M.



Diversity and evolution of methods and practices for the molecular diagnosis of congenital toxoplasmosis in France: a 4-year survey.  


The prenatal diagnosis of congenital toxoplasmosis is currently based upon molecular biology using a sample of amniotic fluid. The vast majority of centres globally (and all centres in France) performing this diagnosis use 'in house' or laboratory-developed PCR assays. This may be the source of considerable inter-laboratory variation in the performances of the assays, hampering any valuable comparison of data among different centres. The present study was based upon questionnaires that were sent to 21-25 centres between 2002 and 2005 enquiring about methods and practices of the PCR-based prenatal diagnosis of congenital toxoplasmosis. An extreme diversity of PCR methods and practices was observed. Thus, in 2005, 35 PCR methods, differing in one of the main steps of the whole process, were reported as being in use for routine diagnosis, with nine centres using two or three methods. We provide comprehensive information on the extraction methods, DNA targets, primer pairs and detection methods used for this diagnosis, as well as their evolution, during the period of study. Interestingly, in this period (2002-2005), a rapid progression of the number of laboratories using real-time PCR technology, which increased from four to 19, was observed. We also studied general PCR practices concerning, for example, the number of reaction tubes used for each biological sample and the inclusion of controls. The return of information in a yearly report provided the opportunity for writing proposals aiming to improve laboratory practices for this diagnosis at the national level. The high diversity of methods and practices currently used emphasizes the need for external quality assessment of the performances of the molecular diagnostic methods. PMID:19886905

Sterkers, Y; Varlet-Marie, E; Marty, P; Bastien, P



Analysis of hemoglobin electrophoresis results and physicians investigative practices in Saudi Arabia  

PubMed Central

BACKGROUND AND OBJECTIVES: Riyadh and the central province fall in a zone of moderate prevalence of hemoglobinopathies in Saudi Arabia. However, it has been observed that physicians working in Saudi Arabia invariably advise hemoglobin electrophoresis (HE) for all cases of anemia. The present work was carried out to study the yield of HE in Riyadh and the investigative practices of the physicians advising HE. SETTINGS AND DESIGN: The study was carried out in the hospitals of King Saud University from 2009 to 2011 in order to assess the yield of HE in referred cases of clinical anemia. MATERIALS AND METHODS: A total of 1073 cases, divided into two groups of males and females, underwent complete blood count and red blood cell morphology. Cellulose acetate HE was performed and all positive results were reconfirmed by high performance liquid chromatography (HPLC). The results were analyzed for the type of hemoglobinopathy. SPSS version 15 (SPSS Inc., Chicago, IL, USA) was used for statistical analysis. RESULTS: Blood samples from 405 males and 668 females were included in the present study. 116 (28.5%) males and 167 (25%) females showed an abnormal pattern on HE. The incidence of beta thalassemia trait was higher in females, while sickle cell trait was predominantly seen in males. Red cell indices were reduced considerably in thalassemias, but were unaffected in sickle cell disorders, except those with a concurrent alpha trait. The total yield of HE was 26.6%, which was much less than expected. CONCLUSION: Physicians are advised to rule out iron deficiency and other common causes of anemia before investigating cases for hemoglobinopathies, which employs the time-consuming and expensive tests of HE and HPLC. PMID:24339548

Mehdi, Syed Riaz; Al Dahmash, Badr Abdullah



Implementation of infection control best practice in intensive care units throughout Europe: a mixed-method evaluation study  

PubMed Central

Background The implementation of evidence-based infection control practices is essential, yet challenging for healthcare institutions worldwide. Although it is acknowledged that implementation success varies with contextual factors, little is known regarding the most critical specific conditions within the complex cultural milieu of varying economic, political, and healthcare systems. Given the increasing reliance on unified global schemes to improve patient safety and healthcare effectiveness, research on this topic is needed and timely. The ‘InDepth’ work package of the European FP7 Prevention of Hospital Infections by Intervention and Training (PROHIBIT) consortium aims to assess barriers and facilitators to the successful implementation of catheter-related bloodstream infection (CRBSI) prevention in intensive care units (ICU) across several European countries. Methods We use a qualitative case study approach in the ICUs of six purposefully selected acute care hospitals among the 15 participants in the PROHIBIT CRBSI intervention study. As sensitizing schemes we apply the theory of diffusion of innovation, published implementation frameworks, sensemaking, and new institutionalism. We conduct interviews with hospital health providers/agents at different organizational levels, perform ethnographic observations, and collect rich artifacts and photographs during two rounds of on-site visits, once before and once one year into the intervention. Data analysis is based on grounded theory. Given the challenge of different languages and cultures, we enlist the help of local interpreters, allot two days for site visits, and perform triangulation across multiple data sources. Qualitative measures of implementation success will consider the longitudinal interaction between the initiative and the institutional context.
Quantitative outcomes on catheter-related bloodstream infections and performance indicators from another work package of the consortium will produce a final mixed-methods report. Conclusion A mixed-methods study of this scale with longitudinal follow-up is unique in the field of infection control. It highlights the ‘Why’ and ‘How’ of best practice implementation, revealing key factors that determine success of a uniform intervention in the context of several varying cultural, economic, political, and medical systems across Europe. These new insights will guide future implementation of more tailored and hence more successful infection control programs. Trial registration Trial number: PROHIBIT-241928 (FP7 reference number) PMID:23421909



Advanced stress analysis methods applicable to turbine engine structures  

NASA Technical Reports Server (NTRS)

Advanced stress analysis methods applicable to turbine engine structures are investigated. The construction of special elements containing traction-free circular boundaries is investigated. New versions of the mixed variational principle and of hybrid stress elements are formulated. A method is established for the suppression of kinematic deformation modes. SemiLoof plate and shell elements are constructed by the assumed-stress hybrid method. An elastic-plastic analysis is conducted by viscoplasticity theory using the mechanical subelement model.

Pian, T. H. H.



Structural Analysis Using Computer Based Methods  

NASA Technical Reports Server (NTRS)

The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. This data is also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under the given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allowing for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located, recommended design changes were made to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety, the designs were released for fabrication.

Dietz, Matthew R.



A simplified method for analysis of polyunsaturated fatty acids  

PubMed Central

Background Analysis of fatty acid composition of biological materials is a common task in lipid research. Conventionally, preparation of samples for fatty acid analysis by gas chromatography involves two separate procedures: lipid extraction and methylation. This conventional method is complicated, tedious and time consuming. Development of a rapid and simple method for lipid analysis is warranted. Results We simplified the conventional method by combining the extraction and methylation into a single step (omitting the procedure of prior extraction). Various biological samples including cultured cells, animal tissues and human specimens have been tested using the new method. Statistical analysis indicates that the recovery of long chain fatty acids from tissue samples by the simplified method is significantly higher than that by the traditional method, but there is no difference in relative fatty acid composition between the two methods. This simplified method can significantly save time and materials, and reduce the potential for sample loss and contamination. Conclusion The lipid extraction procedure conventionally employed prior to methylation in lipid analysis can be omitted without affecting the recovery of long chain (≥18 C) fatty acids or their composition. The simplified method is rapid, easy to use, and suitable for analysis of total long chain polyunsaturated fatty acid contents (e.g. n-6 and n-3 fatty acids) in various biological samples, especially when the number of samples to be analyzed is large and/or the specimen size is small. PMID:15790399

Kang, Jing X; Wang, Jingdong



The stages of implementation completion for evidence-based practice: protocol for a mixed methods study  

PubMed Central

Background This protocol describes the ‘development of outcome measures and suitable methodologies for dissemination and implementation approaches,’ a priority for implementation research. Although many evidence-based practices (EBPs) have been developed, large knowledge gaps remain regarding how to routinely move EBPs into usual care. The lack of understanding of ‘what it takes’ to install EBPs has costly public health consequences, including a lack of availability of the most beneficial services, wasted efforts and resources on failed implementation attempts, and the potential for engendering reluctance to try implementing new EBPs after failed attempts. The Stages of Implementation Completion (SIC) is an eight-stage tool of implementation process and milestones, with stages spanning three implementation phases (pre-implementation, implementation, sustainability). Items delineate the date that a site completes implementation activities, yielding an assessment of duration (time to complete a stage), proportion (of stage activities completed), and a general measure of how far a site moved in the implementation process. Methods/Design We propose to extend the SIC to EBPs operating in child service sectors (juvenile justice, schools, substance use, child welfare). Both successful and failed implementation attempts will be scrutinized using a mixed methods design. Stage costs will be measured and examined. Both retrospective data (from previous site implementation efforts) and prospective data (from newly adopting sites) will be analyzed. The influence of pre-implementation on implementation and sustainability outcomes will be examined (Aim 1). Mixed methods procedures will focus on increasing understanding of the process of implementation failure in an effort to determine if the SIC can provide early detection of sites that are unlikely to succeed (Aim 2). 
Study activities will include cost mapping of SIC stages and an examination of the relationship between implementation costs and implementation performance (Aim 3). Discussion This project fills a gap in the field of implementation science by addressing the measurement gap between the implementation process and the associated costs. The goal of this project is to provide tools that will help increase the uptake of EBPs, thereby increasing the availability of services to youth and decreasing wasted resources from failed implementation efforts. PMID:24708893



Vulnerability analysis methods for road networks  

NASA Astrophysics Data System (ADS)

Road networks rank among the most important lifelines of modern society. They can be damaged by either random or intentional events. Roads are also often affected by natural hazards, the impacts of which are both direct and indirect. Whereas direct impacts (e.g. roads damaged by a landslide or due to flooding) are localized in close proximity to the natural hazard occurrence, the indirect impacts can entail widespread service disabilities and considerable travel delays. The change in flows in the network may affect the population living far from the places originally impacted by the natural disaster. These effects are primarily possible due to the intrinsic nature of this system. The consequences and extent of the indirect costs also depend on the set of road links which were damaged, because the road links differ in terms of their importance. The more robust (interconnected) the road network is, the less time is usually needed to secure the serviceability of an area hit by a disaster. These kinds of networks also demonstrate a higher degree of resilience. Evaluating road network structures is therefore essential in any type of vulnerability and resilience analysis. There is a range of approaches used for evaluation of the vulnerability of a network and for identification of the weakest road links. Only a few of them, however, are capable of simulating the impacts of the simultaneous closure of numerous links, which often occurs during a disaster. The primary problem is that in the case of a disaster, which usually has a large regional extent, the road network may remain disconnected. The majority of the commonly used indices use direct computation of the shortest paths or time between origin-destination (OD) pairs and therefore cannot be applied when the network breaks up into two or more components.
Since extensive break-ups often occur in cases of major disasters, it is important to study the network vulnerability in these cases as well, so that appropriate steps can be taken in order to make it more resilient. Performing such an analysis of network break-ups requires consideration of the network as a whole, ideally identifying all the cases generated by simultaneous closure of multiple links and evaluating them using various criteria. The spatial distribution of settlements, important companies and the overall population in the nodes of the network are several factors, apart from the topology of the network, which could be taken into account when computing vulnerability indices and identifying the weakest links and/or weakest link combinations. However, even for small networks (i.e., hundreds of nodes and links), the problem of break-up identification becomes extremely difficult to resolve. Naive brute-force examination consequently fails, and more elaborate algorithms have to be applied. We address the problem of evaluating the vulnerability of road networks in our work by simulating the impacts of the simultaneous closure of multiple roads/links. We present ongoing work on a sophisticated algorithm focused on the identification of network break-ups and on evaluating them by various criteria.
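The break-up identification described above can be sketched by exhaustive enumeration on a toy network (illustrative only; the toy graph is an assumption, and real networks require the more elaborate algorithms the authors mention, since the number of closure combinations grows combinatorially):

```python
from itertools import combinations

def n_components(nodes, edges):
    """Count connected components via union-find with path halving."""
    parent = {v: v for v in nodes}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    for a, b in edges:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
    return len({find(v) for v in nodes})

def break_ups(nodes, edges, k):
    """All combinations of k simultaneously closed links that split the
    network into two or more components (the 'break-ups' of the abstract)."""
    return [closed for closed in combinations(edges, k)
            if n_components(nodes, [e for e in edges if e not in closed]) > 1]

# Tiny road network: a ring road 0-1-2-3-0 plus a spur road 3-4.
nodes = [0, 1, 2, 3, 4]
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (3, 4)]
```

On this toy graph, the only single-link break-up is the spur (3, 4), while any pair of closures disconnects the network, illustrating why single-link vulnerability indices understate the risk of simultaneous closures.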

Bíl, Michal; Vodák, Rostislav; Kube?ek, Jan; Rebok, Tomáš; Svoboda, Tomáš