Sample records for methods compared include

  1. System and method of designing models in a feedback loop

    DOEpatents

    Gosink, Luke C.; Pulsipher, Trenton C.; Sego, Landon H.

    2017-02-14

    A method and system for designing models is disclosed. The method includes selecting a plurality of models for modeling a common event of interest. The method further includes aggregating the results of the models and analyzing each model compared to the aggregate result to obtain comparative information. The method also includes providing the information back to the plurality of models to design more accurate models through a feedback loop.

  2. Comparing Two Methods for Reducing Variability in Voice Quality Measurements

    ERIC Educational Resources Information Center

    Kreiman, Jody; Gerratt, Bruce R.

    2011-01-01

    Purpose: Interrater disagreements in ratings of quality plague the study of voice. This study compared 2 methods for handling this variability. Method: Listeners provided multiple breathiness ratings for 2 sets of pathological voices, one including 20 male and 20 female voices unselected for quality and one including 20 breathy female voices.…

  3. Control methods and systems for indirect evaporative coolers

    DOEpatents

    Woods, Jason; Kozubal, Erik

    2015-09-22

    A control method for operating an indirect evaporative cooler to control temperature and humidity. The method includes operating an airflow control device to provide supply air at a flow rate to a liquid desiccant dehumidifier. The supply air flows through the dehumidifier and an indirect evaporative cooler prior to exiting an outlet into a space. The method includes operating a pump to provide liquid desiccant to the liquid desiccant dehumidifier and sensing a temperature of an airstream at the outlet of the indirect evaporative cooler. The method includes comparing the temperature of the airstream at the outlet to a setpoint temperature at the outlet and controlling the pump to set the flow rate of the liquid desiccant. The method includes sensing space temperature, comparing the space temperature with a setpoint temperature, and controlling the airflow control device to set the flow rate of the supply air based on the comparison.
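
    The abstract above describes two nested feedback comparisons (outlet temperature against its setpoint driving the desiccant pump, space temperature against its setpoint driving the airflow device). A minimal control-loop sketch of that idea follows; the function name, proportional gains, sign convention, and sample readings are illustrative assumptions, not taken from the patent.

      # Hypothetical illustration of the two feedback comparisons in the abstract.
      def control_step(outlet_temp, outlet_setpoint, space_temp, space_setpoint,
                       desiccant_flow, supply_airflow, k_pump=0.05, k_fan=0.10):
          # Compare outlet air temperature with its setpoint; adjust desiccant pump flow.
          # (Assumed convention: warmer than setpoint -> more desiccant flow.)
          desiccant_flow += k_pump * (outlet_temp - outlet_setpoint)
          # Compare space temperature with its setpoint; adjust supply airflow.
          supply_airflow += k_fan * (space_temp - space_setpoint)
          # Keep actuator commands non-negative.
          return max(desiccant_flow, 0.0), max(supply_airflow, 0.0)

      # Example call with made-up readings (degrees C) and flows (arbitrary units).
      print(control_step(18.5, 17.0, 24.0, 22.0, desiccant_flow=1.0, supply_airflow=2.0))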

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Stephen F; Moore, James A

    Systems and methods are described for carrier-frequency synchronization for improved AM and TV broadcast reception. A method includes synchronizing a carrier frequency of a broadcast signal with a remote reference frequency. An apparatus includes a reference signal receiver; a phase comparator coupled to the reference signal receiver; a voltage controlled oscillator coupled to the phase comparator; and a radio frequency output coupled to the voltage controlled oscillator.

  5. Temperature analysis with voltage-current time differential operation of electrochemical sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woo, Leta Yar-Li; Glass, Robert Scott; Fitzpatrick, Joseph Jay

    A method for temperature analysis of a gas stream. The method includes identifying a temperature parameter of an affected waveform signal. The method also includes calculating a change in the temperature parameter by comparing the affected waveform signal with an original waveform signal. The method also includes generating a value from the calculated change which corresponds to the temperature of the gas stream.
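
    A toy sketch of the comparison step described above, assuming (purely for illustration) that the "temperature parameter" is a peak amplitude and that a made-up linear calibration maps its change to gas temperature; neither assumption comes from the patent.

      import numpy as np

      def gas_temperature(original_waveform, affected_waveform, t0=25.0, slope=40.0):
          # Identify the temperature parameter (here: peak amplitude) in each signal.
          p_orig = np.max(np.abs(original_waveform))
          p_aff = np.max(np.abs(affected_waveform))
          # Calculate the change in the parameter by comparing the two waveforms.
          delta = p_aff - p_orig
          # Generate a temperature value from the change via a made-up linear calibration.
          return t0 + slope * delta

      t = np.linspace(0, 1, 500)
      print(gas_temperature(np.sin(2 * np.pi * 5 * t), 1.2 * np.sin(2 * np.pi * 5 * t)))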

  6. 75 FR 14170 - Medical Device Epidemiology Network: Developing Partnership Between the Center for Devices and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    ... methods for medical device comparative analyses, best practices and best design and analysis methods. II... the performance of medical devices (including comparative effectiveness studies). The centers...

  7. The importance of quality control in validating concentrations ...

    EPA Pesticide Factsheets

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of six analytical methods used to measure 174 emer

  8. Quantitative comparison of in situ soil CO2 flux measurement methods

    Treesearch

    Jennifer D. Knoepp; James M. Vose

    2002-01-01

    Development of reliable regional or global carbon budgets requires accurate measurement of soil CO2 flux. We conducted laboratory and field studies to determine the accuracy and comparability of methods commonly used to measure in situ soil CO2 fluxes. Methods compared included CO2...

  9. An assessment of unstructured grid technology for timely CFD analysis

    NASA Technical Reports Server (NTRS)

    Kinard, Tom A.; Schabowski, Deanne M.

    1995-01-01

An assessment of two unstructured methods is presented in this paper. A tetrahedral unstructured method, USM3D, developed at NASA Langley Research Center, is compared to a Cartesian unstructured method, SPLITFLOW, developed at Lockheed Fort Worth Company. USM3D is an upwind finite volume solver that accepts grids generated primarily from the Vgrid grid generator. SPLITFLOW combines an unstructured grid generator with an implicit flow solver in one package. Both methods are exercised on three test cases: a wing, a wing body, and a fully expanded nozzle. The results for the first two runs are included here and compared to the structured grid method TEAM and to available test data. For each test case, the setup procedure is described, including any difficulties that were encountered. Detailed descriptions of the solvers are not included in this paper.

  10. FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES

    DTIC Science & Technology

    2017-06-01

A thesis by Amanda Donnelly. …work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. My analysis dissects and compares three potential design methodologies, including net assessment, scenarios and…

  11. Method and system for determining the torque required to launch a vehicle having a hybrid drive-train

    DOEpatents

    Hughes, Douglas A.

    2006-04-04

    A method and system are provided for determining the torque required to launch a vehicle having a hybrid drive-train that includes at least two independently operable prime movers. The method includes the steps of determining the value of at least one control parameter indicative of a vehicle operating condition, determining the torque required to launch the vehicle from the at least one determined control parameter, comparing the torque available from the prime movers to the torque required to launch the vehicle, and controlling operation of the prime movers to launch the vehicle in response to the comparing step. The system of the present invention includes a control unit configured to perform the steps of the method outlined above.
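
    A minimal sketch of the compare-and-control logic in the abstract above, using a hypothetical lookup from a single control parameter (accelerator pedal position) to required launch torque; the table values and the motor-first torque split are illustrative assumptions only.

      import numpy as np

      # Hypothetical map: accelerator pedal position (0-1) -> required launch torque (N*m).
      PEDAL = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
      TORQUE_REQ = np.array([0.0, 80.0, 160.0, 240.0, 320.0])

      def launch_command(pedal_pos, engine_avail, motor_avail):
          required = float(np.interp(pedal_pos, PEDAL, TORQUE_REQ))
          available = engine_avail + motor_avail
          if required > available:
              required = available  # cannot command more than the prime movers can deliver
          motor_cmd = min(required, motor_avail)  # illustrative split: electric motor first
          engine_cmd = required - motor_cmd
          return engine_cmd, motor_cmd

      print(launch_command(0.6, engine_avail=150.0, motor_avail=100.0))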

  12. Supplementary search methods were more effective and offered better value than bibliographic database searching: A case study from public health and environmental enhancement.

    PubMed

    Cooper, Chris; Lovell, Rebecca; Husk, Kerryn; Booth, Andrew; Garside, Ruth

    2018-06-01

    We undertook a systematic review to evaluate the health benefits of environmental enhancement and conservation activities. We were concerned that a conventional process of study identification, focusing on exhaustive searches of bibliographic databases as the primary search method, would be ineffective, offering limited value. The focus of this study is comparing study identification methods. We compare (1) an approach led by searches of bibliographic databases with (2) an approach led by supplementary search methods. We retrospectively assessed the effectiveness and value of both approaches. Effectiveness was determined by comparing (1) the total number of studies identified and screened and (2) the number of includable studies uniquely identified by each approach. Value was determined by comparing included study quality and by using qualitative sensitivity analysis to explore the contribution of studies to the synthesis. The bibliographic databases approach identified 21 409 studies to screen and 2 included qualitative studies were uniquely identified. Study quality was moderate, and contribution to the synthesis was minimal. The supplementary search approach identified 453 studies to screen and 9 included studies were uniquely identified. Four quantitative studies were poor quality but made a substantive contribution to the synthesis; 5 studies were qualitative: 3 studies were good quality, one was moderate quality, and 1 study was excluded from the synthesis due to poor quality. All 4 included qualitative studies made significant contributions to the synthesis. This case study found value in aligning primary methods of study identification to maximise location of relevant evidence. Copyright © 2017 John Wiley & Sons, Ltd.

  13. Carrier-frequency synchronization system for improved amplitude modulation and television broadcast reception

    DOEpatents

    Smith, Stephen F.; Moore, James A.

    2003-05-13

    Systems and methods are described for carrier-frequency synchronization for improved AM and TV broadcast reception. A method includes synchronizing a carrier frequency of a broadcast signal with a remote reference frequency. An apparatus includes a reference signal receiver; a phase comparator coupled to the reference signal receiver; a voltage controlled oscillator coupled to the phase comparator; and a radio frequency output coupled to the voltage controlled oscillator.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Stephen F; Moore, James A

    Systems and methods are described for carrier phase synchronization for improved AM and TV broadcast reception. A method includes synchronizing the phase of a carrier frequency of a broadcast signal with the phase of a remote reference frequency. An apparatus includes a receiver to detect the phase of a reference signal; a phase comparator coupled to the reference signal-phase receiver; a voltage controlled oscillator coupled to the phase comparator; and a phase-controlled radio frequency output coupled to the voltage controlled oscillator.

  15. Semiquantitative determination of mesophilic, aerobic microorganisms in cocoa products using the Soleris NF-TVC method.

    PubMed

    Montei, Carolyn; McDougal, Susan; Mozola, Mark; Rice, Jennifer

    2014-01-01

    The Soleris Non-fermenting Total Viable Count method was previously validated for a wide variety of food products, including cocoa powder. A matrix extension study was conducted to validate the method for use with cocoa butter and cocoa liquor. Test samples included naturally contaminated cocoa liquor and cocoa butter inoculated with natural microbial flora derived from cocoa liquor. A probability of detection statistical model was used to compare Soleris results at multiple test thresholds (dilutions) with aerobic plate counts determined using the AOAC Official Method 966.23 dilution plating method. Results of the two methods were not statistically different at any dilution level in any of the three trials conducted. The Soleris method offers the advantage of results within 24 h, compared to the 48 h required by standard dilution plating methods.

  16. Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite

    ERIC Educational Resources Information Center

    Peterlin, Primoz

    2010-01-01

Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared by analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence, and a histogram. The merits of each method are compared. (Contains 7…

  17. Recruiting Adolescent Research Participants: In-Person Compared to Social Media Approaches.

    PubMed

    Moreno, Megan A; Waite, Alan; Pumper, Megan; Colburn, Trina; Holm, Matt; Mendoza, Jason

    2017-01-01

    Recruiting adolescent participants for research is challenging. The purpose of this study was to compare traditional in-person recruitment methods to social media recruitment. We recruited adolescents aged 14-18 years for a pilot physical activity intervention study, including a wearable physical activity tracking device and a Facebook group. Participants were recruited (a) in person from a local high school and an adolescent medicine clinic and (b) through social media, including Facebook targeted ads, sponsored tweets on Twitter, and a blog post. Data collected included total exposure (i.e., reach), engagement (i.e., interaction), and effectiveness. Effectiveness included screening and enrollment for each recruitment method, as well as time and resources spent on each recruitment method. In-person recruitment reached a total of 297 potential participants of which 37 enrolled in the study. Social media recruitment reached a total of 34,272 potential participants of which 8 enrolled in the study. Social media recruitment methods utilized an average of 1.6 hours of staff time and cost an average of $40.99 per participant enrolled, while in-person recruitment methods utilized an average of 0.75 hours of staff time and cost an average of $19.09 per participant enrolled. Social media recruitment reached more potential participants, but the cost per participant enrolled was higher compared to traditional methods. Studies need to consider benefits and downsides of traditional and social media recruitment methods based on study goals and population.

  18. Seismic data fusion anomaly detection

    NASA Astrophysics Data System (ADS)

    Harrity, Kyle; Blasch, Erik; Alford, Mark; Ezekiel, Soundararajan; Ferris, David

    2014-06-01

    Detecting anomalies in non-stationary signals has valuable applications in many fields including medicine and meteorology. These include uses such as identifying possible heart conditions from an Electrocardiography (ECG) signals or predicting earthquakes via seismographic data. Over the many choices of anomaly detection algorithms, it is important to compare possible methods. In this paper, we examine and compare two approaches to anomaly detection and see how data fusion methods may improve performance. The first approach involves using an artificial neural network (ANN) to detect anomalies in a wavelet de-noised signal. The other method uses a perspective neural network (PNN) to analyze an arbitrary number of "perspectives" or transformations of the observed signal for anomalies. Possible perspectives may include wavelet de-noising, Fourier transform, peak-filtering, etc.. In order to evaluate these techniques via signal fusion metrics, we must apply signal preprocessing techniques such as de-noising methods to the original signal and then use a neural network to find anomalies in the generated signal. From this secondary result it is possible to use data fusion techniques that can be evaluated via existing data fusion metrics for single and multiple perspectives. The result will show which anomaly detection method, according to the metrics, is better suited overall for anomaly detection applications. The method used in this study could be applied to compare other signal processing algorithms.

  19. Carrier phase synchronization system for improved amplitude modulation and television broadcast reception

    DOEpatents

    Smith, Stephen F [Loudon, TN; Moore, James A [Powell, TN

    2011-02-01

    Systems and methods are described for carrier phase synchronization for improved AM and TV broadcast reception. A method includes synchronizing the phase of a carrier frequency of a broadcast signal with the phase of a remote reference frequency. An apparatus includes a receiver to detect the phase of a reference signal; a phase comparator coupled to the reference signal-phase receiver; a voltage controlled oscillator coupled to the phase comparator; and a phase-controlled radio frequency output coupled to the voltage controlled oscillator.

  20. A Comparison of Trajectory Optimization Methods for the Impulsive Minimum Fuel Rendezvous Problem

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Mailhe, Laurie M.; Guzman, Jose J.

    2002-01-01

    In this paper we present a comparison of optimization approaches to the minimum fuel rendezvous problem. Both indirect and direct methods are compared for a variety of test cases. The indirect approach is based on primer vector theory. The direct approaches are implemented numerically and include Sequential Quadratic Programming (SQP), Quasi-Newton, Simplex, Genetic Algorithms, and Simulated Annealing. Each method is applied to a variety of test cases including, circular to circular coplanar orbits, LEO to GEO, and orbit phasing in highly elliptic orbits. We also compare different constrained optimization routines on complex orbit rendezvous problems with complicated, highly nonlinear constraints.

  1. Comparison of Marine Spatial Planning Methods in Madagascar Demonstrates Value of Alternative Approaches

    PubMed Central

    Allnutt, Thomas F.; McClanahan, Timothy R.; Andréfouët, Serge; Baker, Merrill; Lagabrielle, Erwann; McClennen, Caleb; Rakotomanjaka, Andry J. M.; Tianarisoa, Tantely F.; Watson, Reg; Kremen, Claire

    2012-01-01

    The Government of Madagascar plans to increase marine protected area coverage by over one million hectares. To assist this process, we compare four methods for marine spatial planning of Madagascar's west coast. Input data for each method was drawn from the same variables: fishing pressure, exposure to climate change, and biodiversity (habitats, species distributions, biological richness, and biodiversity value). The first method compares visual color classifications of primary variables, the second uses binary combinations of these variables to produce a categorical classification of management actions, the third is a target-based optimization using Marxan, and the fourth is conservation ranking with Zonation. We present results from each method, and compare the latter three approaches for spatial coverage, biodiversity representation, fishing cost and persistence probability. All results included large areas in the north, central, and southern parts of western Madagascar. Achieving 30% representation targets with Marxan required twice the fish catch loss than the categorical method. The categorical classification and Zonation do not consider targets for conservation features. However, when we reduced Marxan targets to 16.3%, matching the representation level of the “strict protection” class of the categorical result, the methods show similar catch losses. The management category portfolio has complete coverage, and presents several management recommendations including strict protection. Zonation produces rapid conservation rankings across large, diverse datasets. Marxan is useful for identifying strict protected areas that meet representation targets, and minimize exposure probabilities for conservation features at low economic cost. We show that methods based on Zonation and a simple combination of variables can produce results comparable to Marxan for species representation and catch losses, demonstrating the value of comparing alternative approaches during initial stages of the planning process. Choosing an appropriate approach ultimately depends on scientific and political factors including representation targets, likelihood of adoption, and persistence goals. PMID:22359534

  2. Comparison of marine spatial planning methods in Madagascar demonstrates value of alternative approaches.

    PubMed

    Allnutt, Thomas F; McClanahan, Timothy R; Andréfouët, Serge; Baker, Merrill; Lagabrielle, Erwann; McClennen, Caleb; Rakotomanjaka, Andry J M; Tianarisoa, Tantely F; Watson, Reg; Kremen, Claire

    2012-01-01

    The Government of Madagascar plans to increase marine protected area coverage by over one million hectares. To assist this process, we compare four methods for marine spatial planning of Madagascar's west coast. Input data for each method was drawn from the same variables: fishing pressure, exposure to climate change, and biodiversity (habitats, species distributions, biological richness, and biodiversity value). The first method compares visual color classifications of primary variables, the second uses binary combinations of these variables to produce a categorical classification of management actions, the third is a target-based optimization using Marxan, and the fourth is conservation ranking with Zonation. We present results from each method, and compare the latter three approaches for spatial coverage, biodiversity representation, fishing cost and persistence probability. All results included large areas in the north, central, and southern parts of western Madagascar. Achieving 30% representation targets with Marxan required twice the fish catch loss than the categorical method. The categorical classification and Zonation do not consider targets for conservation features. However, when we reduced Marxan targets to 16.3%, matching the representation level of the "strict protection" class of the categorical result, the methods show similar catch losses. The management category portfolio has complete coverage, and presents several management recommendations including strict protection. Zonation produces rapid conservation rankings across large, diverse datasets. Marxan is useful for identifying strict protected areas that meet representation targets, and minimize exposure probabilities for conservation features at low economic cost. We show that methods based on Zonation and a simple combination of variables can produce results comparable to Marxan for species representation and catch losses, demonstrating the value of comparing alternative approaches during initial stages of the planning process. Choosing an appropriate approach ultimately depends on scientific and political factors including representation targets, likelihood of adoption, and persistence goals.

  3. Three methods for integration of environmental risk into the benefit-risk assessment of veterinary medicinal products.

    PubMed

    Chapman, Jennifer L; Porsch, Lucas; Vidaurre, Rodrigo; Backhaus, Thomas; Sinclair, Chris; Jones, Glyn; Boxall, Alistair B A

    2017-12-15

    Veterinary medicinal products (VMPs) require, as part of the European Union (EU) authorization process, consideration of both risks and benefits. Uses of VMPs have multiple risks (e.g., risks to the animal being treated, to the person administering the VMP) including risks to the environment. Environmental risks are not directly comparable to therapeutic benefits; there is no standardized approach to compare both environmental risks and therapeutic benefits. We have developed three methods for communicating and comparing therapeutic benefits and environmental risks for the benefit-risk assessment that supports the EU authorization process. Two of these methods support independent product evaluation (i.e., a summative classification and a visual scoring matrix classification); the other supports a comparative evaluation between alternative products (i.e., a comparative classification). The methods and the challenges to implementing a benefit-risk assessment including environmental risk are presented herein; how these concepts would work in current policy is discussed. Adaptability to scientific and policy development is considered. This work is an initial step in the development of a standardized methodology for integrated decision-making for VMPs. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. A Comparative Analysis of Method Books for Class Jazz Instruction

    ERIC Educational Resources Information Center

    Watson, Kevin E.

    2017-01-01

    The purpose of this study was to analyze and compare instructional topics and teaching approaches included in selected class method books for jazz pedagogy through content analysis methodology. Frequency counts for the number of pages devoted to each defined instructional content category were compiled and percentages of pages allotted to each…

  5. Statistical methods for detecting and comparing periodic data and their application to the nycthemeral rhythm of bodily harm: A population based study

    PubMed Central

    2010-01-01

Background: Animals, including humans, exhibit a variety of biological rhythms. This article describes a method for the detection and simultaneous comparison of multiple nycthemeral rhythms. Methods: A statistical method for detecting periodic patterns in time-related data via harmonic regression is described. The method is particularly capable of detecting nycthemeral rhythms in medical data. Additionally a method for simultaneously comparing two or more periodic patterns is described, which derives from the analysis of variance (ANOVA). This method statistically confirms or rejects equality of periodic patterns. Mathematical descriptions of the detecting method and the comparing method are displayed. Results: Nycthemeral rhythms of incidents of bodily harm in Middle Franconia are analyzed in order to demonstrate both methods. Every day of the week showed a significant nycthemeral rhythm of bodily harm. These seven patterns of the week were compared to each other revealing only two different nycthemeral rhythms, one for Friday and Saturday and one for the other weekdays. PMID:21059197
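
    A minimal harmonic-regression sketch in the spirit of the detection method described above: fit a 24-hour cosine/sine pair by ordinary least squares and test whether the harmonic terms improve on a constant-only model. The synthetic data and the F-test shown here are illustrative; the paper's exact model and tests may differ.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      hours = np.tile(np.arange(24), 30)                       # 30 synthetic days of hourly counts
      counts = 5 + 2 * np.sin(2 * np.pi * (hours - 20) / 24) + rng.normal(0, 1, hours.size)

      # Design matrix: intercept + first harmonic of a 24-h period.
      X = np.column_stack([np.ones_like(hours, dtype=float),
                           np.cos(2 * np.pi * hours / 24),
                           np.sin(2 * np.pi * hours / 24)])
      beta = np.linalg.lstsq(X, counts, rcond=None)[0]
      rss_full = float(np.sum((counts - X @ beta) ** 2))
      rss_null = float(np.sum((counts - counts.mean()) ** 2))

      # F-test: do the harmonic terms explain significant variation?
      df1, df2 = 2, counts.size - 3
      F = ((rss_null - rss_full) / df1) / (rss_full / df2)
      print("F =", round(F, 1), "p =", 1 - stats.f.cdf(F, df1, df2))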

  6. A Comparison of Two Methods for Boolean Query Relevancy Feedback.

    ERIC Educational Resources Information Center

    Salton, G.; And Others

    1984-01-01

    Evaluates and compares two recently proposed automatic methods for relevance feedback of Boolean queries (Dillon method, which uses probabilistic approach as basis, and disjunctive normal form method). Conclusions are drawn concerning the use of effective feedback methods in a Boolean query environment. Nineteen references are included. (EJS)

  7. Missing portion sizes in FFQ--alternatives to use of standard portions.

    PubMed

    Køster-Rasmussen, Rasmus; Siersma, Volkert; Halldorsson, Thorhallur I; de Fine Olivarius, Niels; Henriksen, Jan E; Heitmann, Berit L

    2015-08-01

Standard portions or substitution of missing portion sizes with medians may generate bias when quantifying the dietary intake from FFQ. The present study compared four different methods to include portion sizes in FFQ. We evaluated three stochastic methods for imputation of portion sizes based on information about anthropometry, sex, physical activity and age. Energy intakes computed with standard portion sizes, defined as sex-specific medians (median), or with portion sizes estimated with multinomial logistic regression (MLR), 'comparable categories' (Coca) or k-nearest neighbours (KNN) were compared with a reference based on self-reported portion sizes (quantified by a photographic food atlas embedded in the FFQ). The Danish Health Examination Survey 2007-2008. The study included 3728 adults with complete portion size data. Compared with the reference, the root-mean-square errors of the mean daily total energy intake (in kJ) computed with portion sizes estimated by the four methods were (men; women): median (1118; 1061), MLR (1060; 1051), Coca (1230; 1146), KNN (1281; 1181). The equivalent biases (mean error) were (in kJ): median (579; 469), MLR (248; 178), Coca (234; 188), KNN (-340; 218). The methods MLR and Coca provided the best agreement with the reference. The stochastic methods allowed for estimation of meaningful portion sizes by conditioning on information about physiology and they were suitable for multiple imputation. We propose to use MLR or Coca to substitute missing portion size values or when portion sizes need to be included in FFQ without portion size data.
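
    A hedged sketch of the MLR-style imputation described above: predict a categorical portion size from sex, age, anthropometry, and activity with multinomial logistic regression, then impute a missing value either as the most likely class or as a draw from the predicted probabilities. The feature set and synthetic data are placeholders, not the study's variables.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      n = 500
      X = np.column_stack([rng.integers(0, 2, n),      # sex (0/1)
                           rng.normal(45, 12, n),      # age (years)
                           rng.normal(25, 4, n),       # BMI
                           rng.normal(2.0, 0.5, n)])   # activity score
      portion = rng.integers(0, 3, n)                  # small / medium / large (synthetic labels)

      model = LogisticRegression(max_iter=1000).fit(X, portion)

      # Impute a missing portion size: either the most likely class or a draw from the
      # predicted probabilities (a draw keeps the imputation stochastic).
      x_missing = np.array([[1, 52.0, 27.5, 1.8]])
      probs = model.predict_proba(x_missing)[0]
      print("predicted class:", int(model.predict(x_missing)[0]),
            "sampled class:", int(rng.choice(3, p=probs)))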

  8. Method and apparatus for ceramic analysis

    DOEpatents

    Jankowiak, Ryszard J.; Schilling, Chris; Small, Gerald J.; Tomasik, Piotr

    2003-04-01

    The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method provides analyzing density of a ceramic comprising exciting a component on a surface/subsurface of the ceramic by exposing the material to excitation energy. The method may further include the step of obtaining a measurement of an emitted energy from the component. The method may additionally include comparing the measurement of the emitted energy from the component with a predetermined reference measurement so as to obtain a density for said ceramic.

  9. Speaker verification system using acoustic data and non-acoustic data

    DOEpatents

    Gable, Todd J [Walnut Creek, CA; Ng, Lawrence C [Danville, CA; Holzrichter, John F [Berkeley, CA; Burnett, Greg C [Livermore, CA

    2006-03-21

A method and system for speech characterization. One embodiment includes a method for speaker verification which includes collecting data from a speaker, wherein the data comprises acoustic data and non-acoustic data. The data is used to generate a template that includes a first set of "template" parameters. The method further includes receiving a real-time identity claim from a claimant, and using acoustic data and non-acoustic data from the identity claim to generate a second set of parameters. The method further includes comparing the first set of parameters to the second set of parameters to determine whether the claimant is the speaker. The first set of parameters and the second set of parameters include at least one purely non-acoustic parameter, including a non-acoustic glottal shape parameter derived from averaging multiple glottal cycle waveforms.

  10. Compare Complication of Classic versus Patent Hemostasis in Transradial Coronary Angiography

    PubMed Central

    Roghani, Farshad; Tajik, Mohammad Nasim; Khosravi, Alireza

    2017-01-01

Background: Coronary artery disease (CAD) is a multifactorial disease in which thrombotic occlusion and calcification usually occur. New strategies have been developed for the diagnosis and treatment of CAD, such as transradial catheterization. Hemostasis can be performed with two approaches, traditional and patent. Our aim is to find the approach with the lowest complication rate. Materials and Methods: In a comparative study, 120 patients were recruited and divided randomly into two subgroups: a traditional group (60 patients; 24 females, 36 males; mean age: 64.35 ± 10.56 years) and a patent group (60 patients; 28 females, 32 males; mean age: 60.15 ± 8.92 years). All demographic data, including age, gender, body mass index, and CAD-related risk factors (smoking, diabetes, hypertension), technical data, including the number of catheters, procedure duration, and hemostatic compression time, and clinical outcomes (radial artery occlusion [RAO], hematoma, bleeding) were collected. Data were analyzed with SPSS version 16. Results: The incidence of RAO was significantly lower in the patent group compared with the traditional group (P = 0.041). Furthermore, the incidence of RAO was higher for early occlusion compared with late occlusion (P = 0.041). Moreover, several factors were significantly associated with occlusion in the traditional group (gender [P = 0.038], age [P = 0.031], diabetes mellitus [P = 0.043], hemostatic compression time [P = 0.036]) as well as in the patent group (age [P = 0.009], hypertension [P = 0.035]). Conclusion: RAO, especially early occlusion, was significantly lower with the patent method compared with the classic method; patent hemostasis is the safest method and a good alternative to the classical method. PMID:29387670

  11. Methods for comparing 3D surface attributes

    NASA Astrophysics Data System (ADS)

    Pang, Alex; Freeman, Adam

    1996-03-01

    A common task in data analysis is to compare two or more sets of data, statistics, presentations, etc. A predominant method in use is side-by-side visual comparison of images. While straightforward, it burdens the user with the task of discerning the differences between the two images. The user if further taxed when the images are of 3D scenes. This paper presents several methods for analyzing the extent, magnitude, and manner in which surfaces in 3D differ in their attributes. The surface geometry are assumed to be identical and only the surface attributes (color, texture, etc.) are variable. As a case in point, we examine the differences obtained when a 3D scene is rendered progressively using radiosity with different form factor calculation methods. The comparison methods include extensions of simple methods such as mapping difference information to color or transparency, and more recent methods including the use of surface texture, perturbation, and adaptive placements of error glyphs.
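
    A small sketch of the simplest comparison method mentioned above, mapping per-vertex attribute differences to color; the synthetic attributes and the choice of colormap are illustrative.

      import numpy as np
      import matplotlib.cm as cm

      rng = np.random.default_rng(2)
      attr_a = rng.random(1000)                      # per-vertex attribute (e.g., radiosity) from method A
      attr_b = attr_a + rng.normal(0, 0.05, 1000)    # same surface geometry, attribute from method B

      diff = np.abs(attr_a - attr_b)
      norm = diff / diff.max()                       # normalize differences to [0, 1]
      colors = cm.viridis(norm)                      # map difference magnitude to an RGBA color per vertex
      print("max difference:", diff.max(), "-> color:", np.round(colors[diff.argmax()], 3))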

  12. Noncontact spirometry with a webcam

    NASA Astrophysics Data System (ADS)

    Liu, Chenbin; Yang, Yuting; Tsow, Francis; Shao, Dangdang; Tao, Nongjian

    2017-05-01

    We present an imaging-based method for noncontact spirometry. The method tracks the subtle respiratory-induced shoulder movement of a subject, builds a calibration curve, and determines the flow-volume spirometry curve and vital respiratory parameters, including forced expiratory volume in the first second, forced vital capacity, and peak expiratory flow rate. We validate the accuracy of the method by comparing the data with those simultaneously recorded with a gold standard reference method and examine the reliability of the noncontact spirometry with a pilot study including 16 subjects. This work demonstrates that the noncontact method can provide accurate and reliable spirometry tests with a webcam. Compared to the traditional spirometers, the present noncontact spirometry does not require using a spirometer, breathing into a mouthpiece, or wearing a nose clip, thus making spirometry test more easily accessible for the growing population of asthma and chronic obstructive pulmonary diseases.

  13. Noncontact spirometry with a webcam.

    PubMed

    Liu, Chenbin; Yang, Yuting; Tsow, Francis; Shao, Dangdang; Tao, Nongjian

    2017-05-01

    We present an imaging-based method for noncontact spirometry. The method tracks the subtle respiratory-induced shoulder movement of a subject, builds a calibration curve, and determines the flow-volume spirometry curve and vital respiratory parameters, including forced expiratory volume in the first second, forced vital capacity, and peak expiratory flow rate. We validate the accuracy of the method by comparing the data with those simultaneously recorded with a gold standard reference method and examine the reliability of the noncontact spirometry with a pilot study including 16 subjects. This work demonstrates that the noncontact method can provide accurate and reliable spirometry tests with a webcam. Compared to the traditional spirometers, the present noncontact spirometry does not require using a spirometer, breathing into a mouthpiece, or wearing a nose clip, thus making spirometry test more easily accessible for the growing population of asthma and chronic obstructive pulmonary diseases.

  14. VIDAS Listeria species Xpress (LSX).

    PubMed

    Johnson, Ronald; Mills, John

    2013-01-01

    The AOAC GovVal study compared the VIDAS Listeria species Xpress (LSX) to the Health Products and Food Branch MFHPB-30 reference method for detection of Listeria on stainless steel. The LSX method utilizes a novel and proprietary enrichment media, Listeria Xpress broth, enabling detection of Listeria species in environmental samples with the automated VIDAS in a minimum of 26 h. The LSX method also includes the use of the chromogenic media, chromID Ottaviani Agosti Agar (OAA) and chromID Lmono for confirmation of LSX presumptive results. In previous AOAC validation studies comparing VIDAS LSX to the U.S. Food and Drug Administration's Bacteriological Analytical Manual (FDA-BAM) and the U.S. Department of Agriculture-Food Safety and Inspection Service (USDA-FSIS) reference methods, the LSX method was approved as AOAC Official Method 2010.02 for the detection of Listeria species in dairy products, vegetables, seafood, raw meats and poultry, and processed meats and poultry, and as AOAC Performance Tested Method 100501 in a variety of foods and on environmental surfaces. The GovVal comparative study included 20 replicate test portions each at two contamination levels for stainless steel where fractionally positive results (5-15 positive results/20 replicate portions tested) were obtained by at least one method at one level. Five uncontaminated controls were included. In the stainless steel artificially contaminated surface study, there were 25 confirmed positives by the VIDAS LSX assay and 22 confirmed positives by the standard culture methods. Chi-square analysis indicated no statistical differences between the VIDAS LSX method and the MFHPB-30 standard methods at the 5% level of significance. Confirmation of presumptive LSX results with the chromogenic OAA and Lmono media was shown to be equivalent to the appropriate reference method agars. The data in this study demonstrate that the VIDAS LSX method is an acceptable alternative method to the MFHPB-30 standard culture method for the detection of Listeria species on stainless steel.

  15. System and method for evaluating a wire conductor

    DOEpatents

    Panozzo, Edward; Parish, Harold

    2013-10-22

    A method of evaluating an electrically conductive wire segment having an insulated intermediate portion and non-insulated ends includes passing the insulated portion of the wire segment through an electrically conductive brush. According to the method, an electrical potential is established on the brush by a power source. The method also includes determining a value of electrical current that is conducted through the wire segment by the brush when the potential is established on the brush. The method additionally includes comparing the value of electrical current conducted through the wire segment with a predetermined current value to thereby evaluate the wire segment. A system for evaluating an electrically conductive wire segment is also disclosed.

  16. Effectiveness and efficacy of minimally invasive lung volume reduction surgery for emphysema

    PubMed Central

    Pertl, Daniela; Eisenmann, Alexander; Holzer, Ulrike; Renner, Anna-Theresa; Valipour, A.

    2014-01-01

    Lung emphysema is a chronic, progressive and irreversible destruction of the lung tissue. Besides non-medical therapies and the well established medical treatment there are surgical and minimally invasive methods for lung volume reduction (LVR) to treat severe emphysema. This report deals with the effectiveness and cost-effectiveness of minimally invasive methods compared to other treatments for LVR in patients with lung emphysema. Furthermore, legal and ethical aspects are discussed. No clear benefit of minimally invasive methods compared to surgical methods can be demonstrated based on the identified and included evidence. In order to assess the different methods for LVR regarding their relative effectiveness and safety in patients with lung emphysema direct comparative studies are necessary. PMID:25295123

  17. Effectiveness and efficacy of minimally invasive lung volume reduction surgery for emphysema.

    PubMed

    Pertl, Daniela; Eisenmann, Alexander; Holzer, Ulrike; Renner, Anna-Theresa; Valipour, A

    2014-01-01

    Lung emphysema is a chronic, progressive and irreversible destruction of the lung tissue. Besides non-medical therapies and the well established medical treatment there are surgical and minimally invasive methods for lung volume reduction (LVR) to treat severe emphysema. This report deals with the effectiveness and cost-effectiveness of minimally invasive methods compared to other treatments for LVR in patients with lung emphysema. Furthermore, legal and ethical aspects are discussed. No clear benefit of minimally invasive methods compared to surgical methods can be demonstrated based on the identified and included evidence. In order to assess the different methods for LVR regarding their relative effectiveness and safety in patients with lung emphysema direct comparative studies are necessary.

  18. An Evaluation of the Efficiency of Different Hygienisation Methods

    NASA Astrophysics Data System (ADS)

    Zrubková, M.

    2017-10-01

    The aim of this study is to evaluate the efficiency of hygienisation by pasteurisation, temperature-phased anaerobic digestion and sludge liming. A summary of the legislation concerning sludge treatment, disposal and recycling is included. The hygienisation methods are compared not only in terms of hygienisation efficiency but a comparison of other criteria is also included.

  19. System for testing properties of a network

    DOEpatents

    Rawle, Michael; Bartholomew, David B.; Soares, Marshall A.

    2009-06-16

A method for identifying properties of a downhole electromagnetic network in a downhole tool string, including the step of providing an electromagnetic path intermediate a first location and a second location on the electromagnetic network. The method further includes the step of providing a receiver at the second location. The receiver includes a known reference. The analog signal includes a set amplitude, a set range of frequencies, and a set rate of change between the frequencies. The method further includes the steps of sending the analog signal, and passively modifying the signal. The analog signal is sent from the first location through the electromagnetic path, and the signal is modified by the properties of the electromagnetic path. The method further includes the step of receiving a modified signal at the second location and comparing the known reference to the modified signal.

  20. Compare Complication of Classic versus Patent Hemostasis in Transradial Coronary Angiography.

    PubMed

    Roghani, Farshad; Tajik, Mohammad Nasim; Khosravi, Alireza

    2017-01-01

Coronary artery disease (CAD) is a multifactorial disease in which thrombotic occlusion and calcification usually occur. New strategies have been developed for the diagnosis and treatment of CAD, such as transradial catheterization. Hemostasis can be performed with two approaches, traditional and patent. Our aim is to find the approach with the lowest complication rate. In a comparative study, 120 patients were recruited and divided randomly into two subgroups: a traditional group (60 patients; 24 females, 36 males; mean age: 64.35 ± 10.56 years) and a patent group (60 patients; 28 females, 32 males; mean age: 60.15 ± 8.92 years). All demographic data, including age, gender, body mass index, and CAD-related risk factors (smoking, diabetes, hypertension), technical data, including the number of catheters, procedure duration, and hemostatic compression time, and clinical outcomes (radial artery occlusion [RAO], hematoma, bleeding) were collected. Data were analyzed with SPSS version 16. The incidence of RAO was significantly lower in the patent group compared with the traditional group (P = 0.041). Furthermore, the incidence of RAO was higher for early occlusion compared with late occlusion (P = 0.041). Moreover, several factors were significantly associated with occlusion in the traditional group (gender [P = 0.038], age [P = 0.031], diabetes mellitus [P = 0.043], hemostatic compression time [P = 0.036]) as well as in the patent group (age [P = 0.009], hypertension [P = 0.035]). RAO, especially early occlusion, was significantly lower with the patent method compared with the classic method; patent hemostasis is the safest method and a good alternative to the classical method.

  1. Construction of Response Surface with Higher Order Continuity and Its Application to Reliability Engineering

    NASA Technical Reports Server (NTRS)

    Krishnamurthy, T.; Romero, V. J.

    2002-01-01

The usefulness of piecewise polynomials with C1 and C2 derivative continuity for response surface construction is examined. A Moving Least Squares (MLS) method is developed and compared with four other interpolation methods, including kriging. First, the selected methods are applied and compared with one another on a two-design-variable problem with a known theoretical response function. Next, the methods are tested on a four-design-variable problem from a reliability-based design application. In general, the piecewise polynomials with higher-order derivative continuity produce less error in the response prediction. The MLS method was found to be superior for response surface construction among the methods evaluated.
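
    A compact moving-least-squares sketch for a response surface in two design variables: at each query point, a locally weighted least-squares fit is solved with Gaussian weights. The linear basis, Gaussian weight function, and bandwidth are illustrative choices, not necessarily those of the paper.

      import numpy as np

      def mls_predict(x_query, X, y, h=0.3):
          # Gaussian weights centered at the query point (h is a hypothetical bandwidth).
          w = np.exp(-np.sum((X - x_query) ** 2, axis=1) / (2 * h ** 2))
          # Local linear basis [1, x1, x2]; a quadratic basis works the same way.
          B = np.column_stack([np.ones(len(X)), X])
          W = np.diag(w)
          coef = np.linalg.solve(B.T @ W @ B, B.T @ W @ y)
          return np.array([1.0, *x_query]) @ coef

      rng = np.random.default_rng(3)
      X = rng.random((200, 2))
      y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.01, 200)
      print(mls_predict(np.array([0.4, 0.6]), X, y))   # compare with sin(1.2) + 0.36 ~ 1.29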

  2. Analysis of digital communication signals and extraction of parameters

    NASA Astrophysics Data System (ADS)

    Al-Jowder, Anwar

    1994-12-01

    The signal classification performance of four types of electronics support measure (ESM) communications detection systems is compared from the standpoint of the unintended receiver (interceptor). Typical digital communication signals considered include binary phase shift keying (BPSK), quadrature phase shift keying (QPSK), frequency shift keying (FSK), and on-off keying (OOK). The analysis emphasizes the use of available signal processing software. Detection methods compared include broadband energy detection, FFT-based narrowband energy detection, and two correlation methods which employ the fast Fourier transform (FFT). The correlation methods utilize modified time-frequency distributions, where one of these is based on the Wigner-Ville distribution (WVD). Gaussian white noise is added to the signal to simulate various signal-to-noise ratios (SNR's).

  3. Modified carbohydrate-chitosan compounds, methods of making the same and methods of using the same

    DOEpatents

    Venditti, Richard A; Pawlak, Joel J; Salam, Abdus; El-Tahlawy, Khaled Fathy

    2015-03-10

    Compositions of matter are provided that include chitosan and a modified carbohydrate. The modified carbohydrate includes a carbohydrate component and a cross linking agent. The modified carbohydrate has increased carboxyl content as compared to an unmodified counterpart carbohydrate. A carboxyl group of the modified carbohydrate is covalently bonded with an amino group of chitosan. The compositions of matter provided herein may include cross linked starch citrate-chitosan and cross linked hemicellulose citrate-chitosan, including foams thereof. These compositions yield excellent absorbency and metal chelation properties. Methods of making cross linked modified carbohydrate-chitosan compounds are also provided.

  4. Hypothesis Testing Using Factor Score Regression: A Comparison of Four Methods

    ERIC Educational Resources Information Center

    Devlieger, Ines; Mayer, Axel; Rosseel, Yves

    2016-01-01

    In this article, an overview is given of four methods to perform factor score regression (FSR), namely regression FSR, Bartlett FSR, the bias avoiding method of Skrondal and Laake, and the bias correcting method of Croon. The bias correcting method is extended to include a reliable standard error. The four methods are compared with each other and…

  5. The combination of the error correction methods of GAFCHROMIC EBT3 film

    PubMed Central

    Li, Yinghui; Chen, Lixin; Zhu, Jinhan; Liu, Xiaowei

    2017-01-01

Purpose: The aim of this study was to combine a set of methods for use in radiochromic film dosimetry, including calibration, correction for lateral effects and a proposed triple-channel analysis. These methods can be applied to GAFCHROMIC EBT3 film dosimetry for radiation field analysis and verification of IMRT plans. Methods: A single-film exposure was used to achieve dose calibration, and the accuracy was verified based on comparisons with the square-field calibration method. Before performing the dose analysis, the lateral effects on pixel values were corrected. The position dependence of the lateral effect was fitted by a parabolic function, and the curvature factors of different dose levels were obtained using a quadratic formula. After lateral effect correction, a triple-channel analysis was used to reduce disturbances and convert scanned images from films into dose maps. The dose profiles of open fields were measured using EBT3 films and compared with the data obtained using an ionization chamber. Eighteen IMRT plans with different field sizes were measured and verified with EBT3 films, applying our methods, and compared to TPS dose maps to check correct implementation of the film dosimetry proposed here. Results: The uncertainty of lateral effects can be reduced to ±1 cGy. Compared with the results of Micke A et al., the residual disturbances of the proposed triple-channel method at 48, 176 and 415 cGy are 5.3%, 20.9% and 31.4% smaller, respectively. Compared with the ionization chamber results, the differences in the off-axis ratio and percentage depth dose are within 1% and 2%, respectively. For the application of IMRT verification, there was no difference between the two triple-channel methods. Compared with correction by the triple-channel method alone, the IMRT results of the combined method (including lateral effect correction and the present triple-channel method) show a 2% improvement for large IMRT fields with 3%/3 mm criteria. PMID:28750023

  6. SEMI-VOLATILE SECONDARY AEROSOLS IN URBAN ATMOSPHERES: MEETING A MEASURED CHALLENGE

    EPA Science Inventory

    This presentation compares the results from various particle measurement methods as they relate to semi-volatile secondary aerosols in urban atmospheres. The methods include the PM2.5 Federal Reference Method; Particle Concentrator - BYU Organic Sampling System (PC-BOSS); the Re...

  7. Comparing Multiple Evapotranspiration-calculating Methods, Including Eddy Covariance and Surface Renewal, Using Empirical Measurements from Alfalfa Fields in the Sacramento-San Joaquin River Delta

    NASA Astrophysics Data System (ADS)

    Clay, J.; Kent, E. R.; Leinfelder-Miles, M.; Lambert, J. J.; Little, C.; Paw U, K. T.; Snyder, R. L.

    2016-12-01

Eddy covariance and surface renewal measurements were used to estimate evapotranspiration (ET) over a variety of crop fields in the Sacramento-San Joaquin River Delta during the 2016 growing season. However, the comparison and evaluation of multiple measurement systems and methods for determining ET focused on a single alfalfa site. The eddy covariance systems included two systems for direct measurement of latent heat flux: one using a separate sonic anemometer and an open path infrared gas analyzer and another using a combined system (Campbell Scientific IRGASON). For these methods, eddy covariance was used with measurements from the Campbell Scientific CSAT3, the LI-COR 7500a, the Campbell Scientific IRGASON, and an additional R.M. Young sonic anemometer. In addition to those direct measures, the surface renewal approach included several energy balance residual methods in which net radiation, ground heat flux, and sensible heat flux (H) were measured. H was measured using several systems and different methods, including using multiple fast-response thermocouple measurements and using the temperatures measured by the sonic anemometers. The energy available for ET was then calculated as the residual of the surface energy balance equation. Differences in ET values were analyzed between the eddy covariance and surface renewal methods, using the IRGASON-derived values of ET as the standard for accuracy.
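
    A small sketch of the energy-balance-residual step underlying the surface renewal approach described above: latent heat flux is taken as the residual LE = Rn - G - H and then converted to an ET rate. The constants and sample fluxes are illustrative.

      LAMBDA_V = 2.45e6      # latent heat of vaporization, J per kg (approx., near 20 C)
      RHO_W = 1000.0         # density of water, kg per m^3

      def et_from_energy_balance(rn, g, h):
          """Return ET in mm per hour from net radiation, ground, and sensible heat flux (W/m^2)."""
          le = rn - g - h                      # latent heat flux as the energy-balance residual
          et_m_per_s = le / (LAMBDA_V * RHO_W)
          return et_m_per_s * 1000.0 * 3600.0  # m/s -> mm/h

      # Example: Rn = 500, G = 50, H = 150 W/m^2 -> LE = 300 W/m^2, roughly 0.44 mm/h
      print(round(et_from_energy_balance(500.0, 50.0, 150.0), 2))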

  8. Comparative study of signalling methods for high-speed backplane transceiver

    NASA Astrophysics Data System (ADS)

    Wu, Kejun

    2017-11-01

A combined analysis of transient simulation and statistical methods is proposed for a comparative study of signalling methods applied to high-speed backplane transceivers. This method enables fast and accurate signal-to-noise ratio and symbol error rate estimation of a serial link based on a four-dimensional design space, including channel characteristics, noise scenarios, equalisation schemes, and signalling methods. The proposed combined analysis method chooses an efficient sampling size for performance evaluation. A comparative study of non-return-to-zero (NRZ), PAM-4, and four-phase shifted sinusoid symbol (PSS-4) signalling using parameterised behaviour-level simulation shows that PAM-4 and PSS-4 have substantial advantages over conventional NRZ in most cases. A comparison between PAM-4 and PSS-4 shows that PAM-4 suffers significant bit error rate degradation when the noise level increases.

  9. Comparison of Influenza Virus Particle Purification Using Magnetic Sulfated Cellulose Particles with an Established Centrifugation Method for Analytics.

    PubMed

    Serve, Anja; Pieler, Michael Martin; Benndorf, Dirk; Rapp, Erdmann; Wolff, Michael Werner; Reichl, Udo

    2015-11-03

    A method for the purification of influenza virus particles using novel magnetic sulfated cellulose particles is presented and compared to an established centrifugation method for analytics. Therefore, purified influenza A virus particles from adherent and suspension MDCK host cell lines were characterized on the protein level with mass spectrometry to compare the viral and residual host cell proteins. Both methods allowed one to identify all 10 influenza A virus proteins, including low-abundance proteins like the matrix protein 2 and nonstructural protein 1, with a similar impurity level of host cell proteins. Compared to the centrifugation method, use of the novel magnetic sulfated cellulose particles reduced the influenza A virus particle purification time from 3.5 h to 30 min before mass spectrometry analysis.

  10. Wafer characteristics via reflectometry

    DOEpatents

    Sopori, Bhushan L.

    2010-10-19

Various exemplary methods (800, 900, 1000, 1100) are directed to determining wafer thickness and/or wafer surface characteristics. An exemplary method (900) includes measuring reflectance of a wafer and comparing the measured reflectance to a calculated reflectance or a reflectance stored in a database. Another exemplary method (800) includes positioning a wafer on a reflecting support to extend a reflectance range. An exemplary device (200) has an input (210), analysis modules (222-228) and optionally a database (230). Various exemplary reflectometer chambers (1300, 1400) include radiation sources positioned at a first altitudinal angle (1308, 1408) and at a second altitudinal angle (1312, 1412). An exemplary method includes selecting radiation sources positioned at various altitudinal angles. An exemplary element (1650, 1850) includes a first aperture (1654, 1854) and a second aperture (1658, 1858) that can transmit reflected radiation to a fiber and an imager, respectively.

  11. Comparative analysis of dynamic pricing strategies for managed lanes.

    DOT National Transportation Integrated Search

    2015-06-01

    The objective of this research is to investigate and compare the performances of different : dynamic pricing strategies for managed lanes facilities. These pricing strategies include real-time : traffic responsive methods, as well as refund options a...

  12. Improving the performances of autofocus based on adaptive retina-like sampling model

    NASA Astrophysics Data System (ADS)

    Hao, Qun; Xiao, Yuqing; Cao, Jie; Cheng, Yang; Sun, Ce

    2018-03-01

    An adaptive retina-like sampling model (ARSM) is proposed to balance autofocusing accuracy and efficiency. Based on the model, we carry out comparative experiments between the proposed method and the traditional method in terms of accuracy, the full width at half maximum (FWHM) and time consumption. Results show that the performances of our method are better than those of the traditional method. Meanwhile, typical autofocus functions, including sum-modified-Laplacian (SML), Laplacian (LAP), Midfrequency-DCT (MDCT) and Absolute Tenengrad (ATEN), are also compared experimentally. The smallest FWHM is obtained by the use of LAP, which is more suitable for evaluating accuracy than the other autofocus functions. The MDCT autofocus function is the most suitable for evaluating real-time performance.
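    A minimal sketch of the sum-modified-Laplacian (SML) focus measure named above, in its commonly cited form with a step size and threshold; the exact parameterisation used in the paper is not reproduced here.

        import numpy as np

        def sum_modified_laplacian(img, step=1, threshold=0.0):
            # Modified Laplacian: absolute second differences taken separately in x and y
            I = np.asarray(img, dtype=float)
            ml = np.zeros_like(I)
            ml[step:-step, :] += np.abs(2 * I[step:-step, :] - I[:-2 * step, :] - I[2 * step:, :])
            ml[:, step:-step] += np.abs(2 * I[:, step:-step] - I[:, :-2 * step] - I[:, 2 * step:])
            # Sum only responses above the threshold (the "sum" in SML)
            return ml[ml >= threshold].sum()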

  13. A Comparison of Programed Instruction with Conventional Methods for Teaching Two Units of Eighth Grade Science.

    ERIC Educational Resources Information Center

    Eshleman, Winston Hull

    Compared were programed materials and conventional methods for teaching two units of eighth grade science. Programed materials used were linear programed books requiring constructed responses. The conventional methods included textbook study, written exercises, lectures, discussions, demonstrations, experiments, chalkboard drawings, films,…

  14. COMPARISON OF TWO DIFFERENT SOLID PHASE EXTRACTION/LARGE VOLUME INJECTION PROCEDURES FOR METHOD 8270

    EPA Science Inventory

    Two solid phase extraction (SPE) methods and one traditional continuous liquid-liquid extraction method are compared for the analysis of Method 8270 SVOCs. Productivity parameters include data quality, sample volume, analysis time and solvent waste.

    One SPE system, unique in the U.S., uses aut...

  15. The importance of quality control in validating concentrations of contaminants of emerging concern in source and treated drinking water samples.

    PubMed

    Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W

    2017-02-01

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare the performance of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are discussed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.

  16. Effects of Earth's curvature in full-wave modeling of VLF propagation

    NASA Astrophysics Data System (ADS)

    Qiu, L.; Lehtinen, N. G.; Inan, U. S.; Stanford VLF Group

    2011-12-01

    We show how to include curvature in the full-wave finite element approach for calculating ELF/VLF wave propagation in a horizontally stratified earth-ionosphere waveguide. A general curvilinear stratified system is considered, and the numerical solutions of the full-wave method in the curvilinear system are compared with the analytic solutions for cylindrical and spherical waveguides filled with an isotropic medium. We calculate the attenuation and height gain for modes in the Earth-ionosphere waveguide, taking into account the anisotropy of the ionospheric plasma, for different assumptions about the Earth's curvature, and quantify the corrections due to the curvature. The results are compared with those of previous models, such as LWPC, as well as with ground and satellite observations, and show improved accuracy compared with the full-wave method without the curvature effect.

  17. Methods for pretreating biomass

    DOEpatents

    Balan, Venkatesh; Dale, Bruce E; Chundawat, Shishir; Sousa, Leonardo

    2017-05-09

    A method for pretreating biomass is provided, which includes, in a reactor, allowing gaseous ammonia to condense on the biomass and react with water present in the biomass to produce pretreated biomass, wherein reactivity of polysaccharides in the biomass is increased during subsequent biological conversion as compared to the reactivity of polysaccharides in biomass which has not been pretreated. A method for pretreating biomass with a liquid ammonia and recovering the liquid ammonia is also provided. Related systems which include a biochemical or biofuel production facility are also disclosed.

  18. Comparison of Health Examination Survey Methods in Brazil, Chile, Colombia, Mexico, England, Scotland, and the United States.

    PubMed

    Mindell, Jennifer S; Moody, Alison; Vecino-Ortiz, Andres I; Alfaro, Tania; Frenz, Patricia; Scholes, Shaun; Gonzalez, Silvia A; Margozzini, Paula; de Oliveira, Cesar; Sanchez Romero, Luz Maria; Alvarado, Andres; Cabrera, Sebastián; Sarmiento, Olga L; Triana, Camilo A; Barquera, Simón

    2017-09-15

    Comparability of population surveys across countries is key to appraising trends in population health. Achieving this requires deep understanding of the methods used in these surveys to examine the extent to which the measurements are comparable. In this study, we obtained detailed protocols of 8 nationally representative surveys from 2007-2013 from Brazil, Chile, Colombia, Mexico, the United Kingdom (England and Scotland), and the United States, countries that differ in economic and inequity indicators. Data were collected on sampling frame, sample selection procedures, recruitment, data collection methods, content of interview and examination modules, and measurement protocols. We also assessed their adherence to the World Health Organization's "STEPwise Approach to Surveillance" framework for population health surveys. The surveys, which included half a million participants, were highly comparable on sampling methodology, survey questions, and anthropometric measurements. Heterogeneity was found for physical activity questionnaires and biological sample collection. The common age range included by the surveys was adults aged 18-64 years. The methods used in these surveys were similar enough to enable comparative analyses of the data across the 7 countries. This comparability is crucial in assessing and comparing national and subgroup population health, and in assisting the transfer of research and policy knowledge across countries. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  19. Recent developments in structural proteomics for protein structure determination.

    PubMed

    Liu, Hsuan-Liang; Hsu, Jyh-Ping

    2005-05-01

    The major challenges in structural proteomics include identifying all the proteins on a genome-wide scale, determining their structure-function relationships, and outlining the precise three-dimensional structures of the proteins. Protein structures are typically determined by experimental approaches such as X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. However, the three-dimensional structural knowledge provided by these techniques is still limited. Thus, computational methods such as comparative and de novo approaches and molecular dynamics simulations are intensively used as alternative tools to predict the three-dimensional structures and dynamic behavior of proteins. This review summarizes recent developments in structural proteomics for protein structure determination, including instrumental methods such as X-ray crystallography and NMR spectroscopy, and computational methods such as comparative and de novo structure prediction and molecular dynamics simulations.

  20. Limitations and Tolerances in Optical Devices

    NASA Astrophysics Data System (ADS)

    Jackman, Neil Allan

    The performance of optical systems is limited by the imperfections of their components. Many of the devices in optical systems, including optical fiber amplifiers, multimode transmission lines and multilayered media such as mirrors, windows and filters, are modeled by coupled line equations. This investigation includes: (i) a study of the limitations imposed on a wavelength multiplexed unidirectional ring by the non-uniformities of the gain spectra of Erbium-doped optical fiber amplifiers. We find numerical solutions for non-linear coupled power differential equations and use these solutions to compare the signal-to-noise ratios and signal levels at different nodes. (ii) An analytical study of the tolerances of imperfect multimode media which support forward traveling modes. The complex mode amplitudes are related by linear coupled differential equations. We use analytical methods to derive extended equations for the expected mode powers and give heuristic limits for their regions of validity. These results compare favorably to exact solutions found for a special case. (iii) A study of the tolerances of multilayered media in the presence of optical thickness imperfections. We use analytical methods, including Kronecker products, to calculate the reflection and transmission statistics of the media. Monte Carlo simulations compare well to our analytical method.

  1. A Comparison of Computational Aeroacoustic Prediction Methods for Transonic Rotor Noise

    NASA Technical Reports Server (NTRS)

    Brentner, Kenneth S.; Lyrintzis, Anastasios; Koutsavdis, Evangelos K.

    1996-01-01

    This paper compares two methods for predicting transonic rotor noise for helicopters in hover and forward flight. Both methods rely on a computational fluid dynamics (CFD) solution as input to predict the acoustic near and far fields. For this work, the same full-potential rotor code has been used to compute the CFD solution for both acoustic methods. The first method employs the acoustic analogy as embodied in the Ffowcs Williams-Hawkings (FW-H) equation, including the quadrupole term. The second method uses a rotating Kirchhoff formulation. Computed results from both methods are compared with one another and with experimental data for both hover and advancing rotor cases. The results are quite good for all cases tested. The sensitivity of both methods to CFD grid resolution and to the choice of the integration surface/volume is investigated. The computational requirements of both methods are comparable; in both cases these requirements are much less than the requirements for the CFD solution.

  2. Criteria for Comparing Children's Web Search Tools.

    ERIC Educational Resources Information Center

    Kuntz, Jerry

    1999-01-01

    Presents criteria for evaluating and comparing Web search tools designed for children. Highlights include database size; accountability; categorization; search access methods; help files; spell check; URL searching; links to alternative search services; advertising; privacy policy; and layout and design. (LRW)

  3. Assessment of gene order computing methods for Alzheimer's disease

    PubMed Central

    2013-01-01

    Background Computational genomics of Alzheimer disease (AD), the most common form of senile dementia, is a nascent field in AD research. The field includes AD gene clustering by computing gene order, which generates higher quality gene clustering patterns than most other clustering methods. However, few gene order computing methods are available, such as the Genetic Algorithm (GA) and Ant Colony Optimization (ACO). Further, their performance in gene order computation using AD microarray data is not known. We thus set forth to evaluate the performances of current gene order computing methods with different distance formulas, and to identify additional features associated with gene order computation. Methods Using different distance formulas (Pearson distance, Euclidean distance, and the squared Euclidean distance) and other conditions, gene orders were calculated by the ACO and GA (including standard GA and improved GA) methods, respectively. The qualities of the gene orders were compared, and new features from the calculated gene orders were identified. Results Compared to the GA methods tested in this study, ACO fits the AD microarray data the best when calculating gene order. In addition, the following features were revealed: different distance formulas generated a different quality of gene order, and the commonly used Pearson distance was not the best distance formula when used with both GA and ACO methods for AD microarray data. Conclusion Compared with Pearson distance and Euclidean distance, the squared Euclidean distance generated the best quality gene order computed by GA and ACO methods. PMID:23369541
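    For illustration only (not code from the article), the three distance formulas compared above can be written as follows for two gene expression profiles x and y:

        import numpy as np

        def pearson_distance(x, y):
            # 1 minus the Pearson correlation between two expression profiles
            return 1.0 - np.corrcoef(x, y)[0, 1]

        def euclidean_distance(x, y):
            return float(np.linalg.norm(np.asarray(x) - np.asarray(y)))

        def squared_euclidean_distance(x, y):
            return float(np.sum((np.asarray(x) - np.asarray(y)) ** 2))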

  4. Comparative study on diagonal equivalent methods of masonry infill panel

    NASA Astrophysics Data System (ADS)

    Amalia, Aniendhita Rizki; Iranata, Data

    2017-06-01

    Infrastructure construction in earthquake-prone areas needs a good design process, including modeling a structure correctly to reduce the damage caused by an earthquake. Earthquakes cause a great deal of damage, e.g. collapsed buildings, which are dangerous. Incorrect modeling in the design process affects the structure's ability to respond to load, i.e. an earthquake load, and must be addressed in order to reduce damage and fatalities. A correct model considers every aspect that affects the strength of a building, including the stiffness resisting lateral loads caused by an earthquake. Most structural analyses still use the open frame method, which does not consider the effect of the stiffness of the masonry panel on the stiffness and strength of the whole structure. The effect of the masonry panel is usually not included in the design process, but the presence of this panel greatly affects the behavior of the building in responding to an earthquake. In the worst case, it can even cause the building to collapse, as has been reported after great earthquakes worldwide. Modeling a structure with the masonry panel taken into consideration can be performed by designing the panel as a compression brace or as a shell element. For designing the masonry panel as a compression brace, fourteen methods are popular among structural designers, formulated by Saneinejad-Hobbs, Holmes, Stafford-Smith, Mainstone, Mainstone-Weeks, Bazan-Meli, Liauw Kwan, Paulay and Priestley, FEMA 356, Durani Luo, Hendry, Al-Chaar, Papia and Chen-Iranata. Every method has its own equation and parameters, therefore the model of every method was compared to the results of an experimental test to see which one gives closer values. Moreover, those methods also need to be compared to the open frame to see if they give values within the limits. The experimental test used for comparing all methods was taken from Mehrabi's research (Fig. 1), a prototype of a frame in a structure at 0.5 scale and a height-to-width ratio of 1 to 1.5. The load used in the experiment was based on the Uniform Building Code (UBC) 1991. Every method compared was first calculated to obtain the equivalent diagonal strut width. The second step was modelling each method in structural analysis software as a frame with a diagonal in a linear mode. The linear mode was chosen based on the structural analysis commonly used by structural designers. The frame was loaded and, for every model, its load and deformation values were identified. The load-deformation values of every method were compared to those of the experimental test specimen by Mehrabi and the open frame. From the comparative study performed, Holmes' and Bazan-Meli's equations gave results closest to the experimental test specimen by Mehrabi. Other equations that gave close values within the limit (compared to the open frame) are Saneinejad-Hobbs, Stafford-Smith, Bazan-Meli, Liauw Kwan, Paulay and Priestley, FEMA 356, Durani Luo, Hendry, Papia and Chen-Iranata.
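    As a hedged illustration (not the paper's own equations), two of the simpler and most commonly cited equivalent diagonal strut width rules, Holmes' one-third rule and Mainstone's stiffness-based formula, can be sketched as follows; the symbol names and the use of a single panel height h are simplifying assumptions.

        import math

        def holmes_strut_width(d):
            # Holmes' rule of thumb: strut width is one third of the panel diagonal length d
            return d / 3.0

        def mainstone_strut_width(d, Em, t, theta, Ec, Ic, h):
            # Commonly cited form: w = 0.175 * (lambda_h * h)**-0.4 * d, with
            # lambda_h = [Em * t * sin(2*theta) / (4 * Ec * Ic * h)]**0.25
            # Em, t: infill modulus and thickness; theta: diagonal angle (radians);
            # Ec, Ic: column modulus and moment of inertia; h: panel height
            lam_h = (Em * t * math.sin(2.0 * theta) / (4.0 * Ec * Ic * h)) ** 0.25
            return 0.175 * (lam_h * h) ** -0.4 * d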

  5. Nitric oxide donors for cervical ripening and induction of labour.

    PubMed

    Kelly, Anthony J; Munson, Christopher; Minden, Lucy

    2011-06-15

    Sometimes it is necessary to bring on labour artificially because of safety concerns for the mother or baby. This review is one of a series of reviews of methods of labour induction using a standardised protocol. Induction of labour occurs in approximately 20% of pregnancies in the UK. The ideal agent for induction of labour would induce cervical ripening without causing uterine contractions. Currently, most commonly used cervical ripening or induction agents result in uterine activity or contractions, or both. Cervical ripening without uterine contractility could occur safely in an outpatient setting and it may be expected that this would result in greater maternal satisfaction and lower costs. To determine the effects of nitric oxide (NO) donors for third trimester cervical ripening or induction of labour, in comparison with placebo or no treatment or other treatments from a predefined hierarchy. We searched the Cochrane Pregnancy and Childbirth Group's Trials Register (31 December 2010) and the reference lists of trial reports and reviews. Clinical trials comparing NO donors for cervical ripening or labour induction to other methods listed above it on a predefined list of methods of labour induction. The trials include some form of random allocation to either group and report one or more of the prestated outcomes. NO donors (isosorbide mononitrate, nitroglycerin and sodium nitroprusside) are compared to other methods listed above them on a predefined list of methods of labour induction. This review is part of a series of reviews focusing on methods of induction of labour. Three review authors independently assessed trials for inclusion, assessed risk of bias and extracted data. We considered 19 trials; we included 10 trials (including a total of 1889 women), excluded eight trials, and one trial report is awaiting classification. Included studies compared NO donors to placebo, vaginal prostaglandin E2, intracervical PGE2 and vaginal misoprostol. All included studies were of a generally high standard with a low risk of bias. There are very limited data available to compare nitric oxide donors to any other induction agent. There is no evidence of any difference between any of the prespecified outcomes when comparing NO donors to other induction agents, with the exception of an increase in maternal side effects. NO donors do not appear currently to be a useful tool in the process of induction of labour. More studies are required to examine how NO donors may work alongside established induction of labour protocols, especially those based in outpatient settings.

  6. Heap/stack guard pages using a wakeup unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gooding, Thomas M; Satterfield, David L; Steinmacher-Burow, Burkhard

    A method and system for providing a memory access check on a processor including the steps of detecting accesses to a memory device including level-1 cache using a wakeup unit. The method includes invalidating level-1 cache ranges corresponding to a guard page, and configuring a plurality of wakeup address compare (WAC) registers to allow access to selected WAC registers. The method selects one of the plurality of WAC registers, and sets up a WAC register related to the guard page. The method configures the wakeup unit to interrupt on access of the selected WAC register. The method detects access of the memory device using the wakeup unit when a guard page is violated. The method generates an interrupt to the core using the wakeup unit, and determines the source of the interrupt. The method detects the activated WAC registers assigned to the violated guard page, and initiates a response.

  7. Genomic-Enabled Prediction Kernel Models with Random Intercepts for Multi-environment Trials.

    PubMed

    Cuevas, Jaime; Granato, Italo; Fritsche-Neto, Roberto; Montesinos-Lopez, Osval A; Burgueño, Juan; Bandeira E Sousa, Massaine; Crossa, José

    2018-03-28

    In this study, we compared the prediction accuracy of the main genotypic effect model (MM) without G×E interactions, the multi-environment single variance G×E deviation model (MDs), and the multi-environment environment-specific variance G×E deviation model (MDe) where the random genetic effects of the lines are modeled with the markers (or pedigree). With the objective of further modeling the genetic residual of the lines, we incorporated the random intercepts of the lines (l) and generated another three models. Each of these 6 models was fitted with a linear kernel method (Genomic Best Linear Unbiased Predictor, GB) and a Gaussian Kernel (GK) method. We compared these 12 model-method combinations with another two multi-environment G×E interactions models with unstructured variance-covariances (MUC) using GB and GK kernels (4 model-method). Thus, we compared the genomic-enabled prediction accuracy of a total of 16 model-method combinations on two maize data sets with positive phenotypic correlations among environments, and on two wheat data sets with complex G×E that includes some negative and close to zero phenotypic correlations among environments. The two models (MDs and MDe with the random intercept of the lines and the GK method) were computationally efficient and gave high prediction accuracy in the two maize data sets. Regarding the more complex G×E wheat data sets, the model-method combinations with G×E (MDs and MDe, including the random intercepts of the lines, with the GK method) gave important savings in computing time as compared with the G×E interaction multi-environment models with unstructured variance-covariances, but with lower genomic prediction accuracy. Copyright © 2018 Cuevas et al.

  8. Genomic-Enabled Prediction Kernel Models with Random Intercepts for Multi-environment Trials

    PubMed Central

    Cuevas, Jaime; Granato, Italo; Fritsche-Neto, Roberto; Montesinos-Lopez, Osval A.; Burgueño, Juan; Bandeira e Sousa, Massaine; Crossa, José

    2018-01-01

    In this study, we compared the prediction accuracy of the main genotypic effect model (MM) without G×E interactions, the multi-environment single variance G×E deviation model (MDs), and the multi-environment environment-specific variance G×E deviation model (MDe) where the random genetic effects of the lines are modeled with the markers (or pedigree). With the objective of further modeling the genetic residual of the lines, we incorporated the random intercepts of the lines (l) and generated another three models. Each of these 6 models was fitted with a linear kernel method (Genomic Best Linear Unbiased Predictor, GB) and a Gaussian Kernel (GK) method. We compared these 12 model-method combinations with another two multi-environment G×E interactions models with unstructured variance-covariances (MUC) using GB and GK kernels (4 model-method). Thus, we compared the genomic-enabled prediction accuracy of a total of 16 model-method combinations on two maize data sets with positive phenotypic correlations among environments, and on two wheat data sets with complex G×E that includes some negative and close to zero phenotypic correlations among environments. The two models (MDs and MDe with the random intercept of the lines and the GK method) were computationally efficient and gave high prediction accuracy in the two maize data sets. Regarding the more complex G×E wheat data sets, the model-method combinations with G×E (MDs and MDe, including the random intercepts of the lines, with the GK method) gave important savings in computing time as compared with the G×E interaction multi-environment models with unstructured variance-covariances, but with lower genomic prediction accuracy. PMID:29476023
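    For illustration only (a sketch, not code from the study), the two kernels compared above can be built from a centered marker matrix X; the median-distance scaling of the Gaussian kernel is a common choice and an assumption here.

        import numpy as np
        from scipy.spatial.distance import pdist, squareform

        def linear_kernel(X):
            # GBLUP-style genomic relationship from a centered marker matrix X (n x p)
            return X @ X.T / X.shape[1]

        def gaussian_kernel(X, bandwidth=1.0):
            # K_ij = exp(-bandwidth * d_ij^2 / median(d^2)), a common scaling choice
            d2 = squareform(pdist(X, metric="sqeuclidean"))
            return np.exp(-bandwidth * d2 / np.median(d2[d2 > 0]))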

  9. The Weberian Legacy of Thom Greenfield.

    ERIC Educational Resources Information Center

    Samier, Eugenie

    1996-01-01

    Traces through Thomas Greenfield's work his use of Max Weber's interpretive social analysis, including Weber's view of the individual unit of analysis, value typologies, comparative history methods, and analytical ideal typologies. Compares Greenfield's and Weber's metaphysical assumptions, ontological perspectives, and epistemological frameworks.…

  10. Grading Medical Students in a Psychiatry Clerkship: Correlation with the NBME Subject Examination Scores and Its Implications

    ERIC Educational Resources Information Center

    Ramchandani, Dilip

    2011-01-01

    Background/Objective: The author analyzed and compared various methods for the assessment of medical students; these methods included clinical assessment and the standardized National Board of Medical Examiners (NBME) subject examination. Method: Students were evaluated on their 6-week clerkship in psychiatry by both their clinical…

  11. Problem-Based Learning Method: Secondary Education 10th Grade Chemistry Course Mixtures Topic

    ERIC Educational Resources Information Center

    Üce, Musa; Ates, Ismail

    2016-01-01

    In this research, the aim was to determine student achievement by comparing the problem-based learning method with the teacher-centered traditional method in teaching the mixtures topic of the 10th grade chemistry course. A pretest-posttest control group research design was implemented. The research sample includes two classes (a total of 48 students) of an Anatolian High School…

  12. Analysis of the Interaction of Student Characteristics with Method in Micro-Teaching.

    ERIC Educational Resources Information Center

    Chavers, Katherine; And Others

    A study examined the comparative effects on microteaching performance of (1) eight different methods of teacher training and (2) the interaction of method with student characteristics. Subjects, 71 enrollees in an educational psychology course, were randomly assigned to eight treatment groups (including one control group). Treatments consisted of…

  13. Evaluation of dysphagia in early stroke patients by bedside, endoscopic, and electrophysiological methods.

    PubMed

    Umay, Ebru Karaca; Unlu, Ece; Saylam, Guleser Kılıc; Cakci, Aytul; Korkmaz, Hakan

    2013-09-01

    We aimed in this study to evaluate dysphagia in early stroke patients using a bedside screening test and flexible fiberoptic endoscopic evaluation of swallowing (FFEES) and electrophysiological evaluation (EE) methods and to compare the effectiveness of these methods. Twenty-four patients who were hospitalized in our clinic within the first 3 months after stroke were included in this study. Patients were evaluated using a bedside screening test [including bedside dysphagia score (BDS), neurological examination dysphagia score (NEDS), and total dysphagia score (TDS)] and FFEES and EE methods. Patients were divided into normal-swallowing and dysphagia groups according to the results of the evaluation methods. Patients with dysphagia as determined by any of these methods were compared to the patients with normal swallowing based on the results of the other two methods. Based on the results of our study, a high BDS was positively correlated with dysphagia identified by FFEES and EE methods. Moreover, the FFEES and EE methods were positively correlated. There was no significant correlation between NEDS and TDS levels and either EE or FFEES method. Bedside screening tests should be used mainly as an initial screening test; then FFEES and EE methods should be combined in patients who show risks. This diagnostic algorithm may provide a practical and fast solution for selected stroke patients.

  14. An Analysis of the Effects of Instructional Methods Upon Selected Outcomes of Instruction in an Interdisciplinary Science Unit.

    ERIC Educational Resources Information Center

    Thomas, Barbara Schalk

    Studied was the effect of instructional method on educational outcomes in an interdisciplinary science unit taught to 143 eighth grade students of earth science. Compared were the didactic and guided discovery methods of teaching. Also analyzed were the interactions of methods with student characteristics including sex, intelligence, creativity,…

  15. Method and apparatus for signal processing in a sensor system for use in spectroscopy

    DOEpatents

    O'Connor, Paul [Bellport, NY; DeGeronimo, Gianluigi [Nesconset, NY; Grosholz, Joseph [Natrona Heights, PA

    2008-05-27

    A method for processing pulses arriving randomly in time on at least one channel using multiple peak detectors includes asynchronously selecting a non-busy peak detector (PD) in response to a pulse-generated trigger signal, connecting the channel to the selected PD in response to the trigger signal, and detecting a pulse peak amplitude. Amplitude and time of arrival data are output in first-in first-out (FIFO) sequence. An apparatus includes trigger comparators to generate the trigger signal for the pulse-receiving channel, PDs, a switch for connecting the channel to the selected PD, and logic circuitry which maintains the write pointer. Also included, time-to-amplitude converters (TACs) convert time of arrival to analog voltage and an analog multiplexer provides FIFO output. A multi-element sensor system for spectroscopy includes detector elements, channels, trigger comparators, PDs, a switch, and a logic circuit with asynchronous write pointer. The system includes TACs, a multiplexer and analog-to-digital converter.

  16. A scoping review of rapid review methods.

    PubMed

    Tricco, Andrea C; Antony, Jesmin; Zarin, Wasifa; Strifler, Lisa; Ghassemi, Marco; Ivory, John; Perrier, Laure; Hutton, Brian; Moher, David; Straus, Sharon E

    2015-09-16

    Rapid reviews are a form of knowledge synthesis in which components of the systematic review process are simplified or omitted to produce information in a timely manner. Although numerous centers are conducting rapid reviews internationally, few studies have examined the methodological characteristics of rapid reviews. We aimed to examine articles, books, and reports that evaluated, compared, used or described rapid reviews or methods through a scoping review. MEDLINE, EMBASE, the Cochrane Library, internet websites of rapid review producers, and reference lists were searched to identify articles for inclusion. Two reviewers independently screened literature search results and abstracted data from included studies. Descriptive analysis was conducted. We included 100 articles plus one companion report that were published between 1997 and 2013. The studies were categorized as 84 application papers, seven development papers, six impact papers, and four comparison papers (one was included in two categories). The rapid reviews were conducted between 1 and 12 months, predominantly in Europe (58 %) and North America (20 %). The included studies failed to report 6 % to 73 % of the specific systematic review steps examined. Fifty unique rapid review methods were identified; 16 methods occurred more than once. Streamlined methods that were used in the 82 rapid reviews included limiting the literature search to published literature (24 %) or one database (2 %), limiting inclusion criteria by date (68 %) or language (49 %), having one person screen and another verify or screen excluded studies (6 %), having one person abstract data and another verify (23 %), not conducting risk of bias/quality appraisal (7 %) or having only one reviewer conduct the quality appraisal (7 %), and presenting results as a narrative summary (78 %). Four case studies were identified that compared the results of rapid reviews to systematic reviews. Three studies found that the conclusions between rapid reviews and systematic reviews were congruent. Numerous rapid review approaches were identified and few were used consistently in the literature. Poor quality of reporting was observed. A prospective study comparing the results from rapid reviews to those obtained through systematic reviews is warranted.

  17. Including mixed methods research in systematic reviews: examples from qualitative syntheses in TB and malaria control.

    PubMed

    Atkins, Salla; Launiala, Annika; Kagaha, Alexander; Smith, Helen

    2012-04-30

    Health policy makers now have access to a greater number and variety of systematic reviews to inform different stages in the policy making process, including reviews of qualitative research. The inclusion of mixed methods studies in systematic reviews is increasing, but these studies pose particular challenges to methods of review. This article examines the quality of the reporting of mixed methods and qualitative-only studies. We used two completed systematic reviews to generate a sample of qualitative studies and mixed method studies in order to make an assessment of how the quality of reporting and rigor of qualitative-only studies compares with that of mixed-methods studies. Overall, the reporting of qualitative studies in our sample was consistently better when compared with the reporting of mixed methods studies. We found that mixed methods studies are less likely to provide a description of the research conduct or qualitative data analysis procedures and less likely to be judged credible or provide rich data and thick description compared with standalone qualitative studies. Our time-related analysis shows that for both types of study, papers published since 2003 are more likely to report on the study context, describe analysis procedures, and be judged credible and provide rich data. However, the reporting of other aspects of research conduct (i.e. descriptions of the research question, the sampling strategy, and data collection methods) in mixed methods studies does not appear to have improved over time. Mixed methods research makes an important contribution to health research in general, and could make a more substantial contribution to systematic reviews. Through our careful analysis of the quality of reporting of mixed methods and qualitative-only research, we have identified areas that deserve more attention in the conduct and reporting of mixed methods research.

  18. Comparative evaluation of power factor improvement techniques for squirrel cage induction motors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spee, R.; Wallace, A.K.

    1992-04-01

    This paper describes the results obtained from a series of tests of relatively simple methods of improving the power factor of squirrel-cage induction motors. The methods, which are evaluated under controlled laboratory conditions for a 10-hp, high-efficiency motor, include terminal voltage reduction; terminal static capacitors; and a "floating" winding with static capacitors. The test results are compared with equivalent circuit model predictions that are then used to identify optimum conditions for each of the power factor improvement techniques compared with the basic induction motor. Finally, the relative economic value, and the implications of component failures, of the three methods are discussed.
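    As an illustrative aside (not from the report), the static-capacitor option is usually sized with the standard relation Qc = P(tan phi1 - tan phi2); the 7.5 kW rating and the power factor figures below are hypothetical.

        import math

        def correction_capacitor_kvar(p_kw, pf_initial, pf_target):
            # Reactive power of shunt capacitors needed to raise the power factor
            phi1 = math.acos(pf_initial)
            phi2 = math.acos(pf_target)
            return p_kw * (math.tan(phi1) - math.tan(phi2))

        # e.g. a 10-hp (~7.5 kW) motor corrected from 0.75 to 0.95 power factor
        qc = correction_capacitor_kvar(7.5, 0.75, 0.95)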

  19. Comparing interrater reliability between eye examination and eye self-examination

    PubMed Central

    de Lima, Maria Alzete; Pagliuca, Lorita Marlena Freitag; do Nascimento, Jennara Cândido; Caetano, Joselany Áfio

    2017-01-01

    Objective: to compare interrater reliability concerning two eye assessment methods. Method: quasi-experimental study conducted with 324 college students, including eye self-examination and eye assessment performed by the researchers at a public university. The kappa coefficient was used to verify agreement. Results: interrater reliability coefficients ranged from 0.85 to 0.95, with statistical significance at 0.05. The exams checking near acuity and peripheral vision presented a reasonable kappa >0.2. The remaining coefficients were higher, ranging from very to totally reliable. Conclusion: comparatively, the results of both methods were similar. The virtual manual on eye self-examination can be used to screen for eye conditions. PMID:29069269
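    The agreement statistic used above is Cohen's kappa; a minimal sketch of its computation for two raters (illustrative only, not the study's code):

        import numpy as np

        def cohens_kappa(ratings_a, ratings_b):
            a, b = np.asarray(ratings_a), np.asarray(ratings_b)
            categories = np.union1d(a, b)
            observed = np.mean(a == b)  # observed agreement
            # chance agreement under independence of the two raters
            expected = sum(np.mean(a == c) * np.mean(b == c) for c in categories)
            return (observed - expected) / (1.0 - expected)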

  20. Pharmacoeconomics

    PubMed Central

    Hughes, Dyfrig A

    2012-01-01

    Pharmacoeconomics is an essential component of health technology assessment and the appraisal of medicines for use by UK National Health Service (NHS) patients. As a comparatively young discipline, its methods continue to evolve. Priority research areas for development include methods for synthesizing indirect comparisons when head-to-head trials have not been performed, synthesizing qualitative evidence (for example, stakeholder views), addressing the limitations of the EQ-5D tool for assessing quality of life, including benefits not captured in quality-adjusted life years (QALYs), ways of assessing valuation methods (for determining utility scores), extrapolation of costs and benefits beyond those observed in trials, early estimation of cost-effectiveness (including mechanism-based economic evaluation), methods for incorporating the impact of non-adherence and the role of behavioural economics in influencing patients and prescribers. PMID:22360714

  1. Comparison of four different reduction methods for anterior dislocation of the shoulder.

    PubMed

    Guler, Olcay; Ekinci, Safak; Akyildiz, Faruk; Tirmik, Uzeyir; Cakmak, Selami; Ugras, Akin; Piskin, Ahmet; Mahirogullari, Mahir

    2015-05-28

    Shoulder dislocations account for almost 50% of all major joint dislocations and are mainly anterior. The aim is a comparative retrospective study of different reduction maneuvers without anesthesia to reduce the dislocated shoulder. Patients were treated with different reduction maneuvers, including various forms of traction and external rotation, in the emergency departments of four training hospitals between 2009 and 2012. Each of the four hospitals had different treatment protocols for reduction and applying one of four maneuvers: Spaso, Chair, Kocher, and Matsen methods. Thirty-nine patients were treated by the Spaso method, 47 by the Chair reduction method, 40 by the Kocher method, and 27 patients by Matsen's traction-countertraction method. All patients' demographic data were recorded. Dislocation number, reduction time, time interval between dislocation and reduction, and associated complications, pre- and post-reduction period, were recorded prospectively. No anesthetic method was used for the reduction. All of the methods used included traction and some external rotation. The Chair method had the shortest reduction time. All surgeons involved in the study agreed that the Kocher and Matsen methods needed more force for the reduction. Patients could contract their muscles because of the pain in these two methods. The Spaso method includes flexion of the shoulder and blocks muscle contraction somewhat. The Chair method was found to be the easiest because the patients could not contract their muscles while sitting on a chair with the affected arm at their side. We suggest that the Chair method is an effective and fast reduction maneuver that may be an alternative for the treatment of anterior shoulder dislocations. Further prospective studies with larger sample size are needed to compare safety of different reduction techniques.

  2. A Focus on Problems of National Interest in the College General Chemistry Laboratory: The Effects of the Problem-Oriented Method Compared with Those of the Traditional Approach.

    ERIC Educational Resources Information Center

    Neman, Robert Lynn

    This study was designed to assess the effects of the problem-oriented method compared to those of the traditional approach in general chemistry at the college level. The problem-oriented course included topics such as air and water pollution, drug addiction and analysis, tetraethyl-lead additives, insecticides in the environment, and recycling of…

  3. Problems With Risk Reclassification Methods for Evaluating Prediction Models

    PubMed Central

    Pepe, Margaret S.

    2011-01-01

    For comparing the performance of a baseline risk prediction model with one that includes an additional predictor, a risk reclassification analysis strategy has been proposed. The first step is to cross-classify risks calculated according to the 2 models for all study subjects. Summary measures including the percentage of reclassification and the percentage of correct reclassification are calculated, along with 2 reclassification calibration statistics. The author shows that interpretations of the proposed summary measures and P values are problematic. The author's recommendation is to display the reclassification table, because it shows interesting information, but to use alternative methods for summarizing and comparing model performance. The Net Reclassification Index has been suggested as one alternative method. The author argues for reporting components of the Net Reclassification Index because they are more clinically relevant than is the single numerical summary measure. PMID:21555714
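    As an illustration of the components the author recommends reporting, here is a sketch of the category-free form of the NRI (the categorical version instead counts movement across predefined risk strata); this is illustrative code, not the article's analysis.

        import numpy as np

        def nri_components(risk_old, risk_new, event):
            # risk_old/risk_new: predicted risks from the baseline and extended models
            # event: 1 if the subject experienced the event, 0 otherwise
            risk_old, risk_new, event = map(np.asarray, (risk_old, risk_new, event))
            up, down = risk_new > risk_old, risk_new < risk_old
            ev, ne = event == 1, event == 0
            nri_events = up[ev].mean() - down[ev].mean()
            nri_nonevents = down[ne].mean() - up[ne].mean()
            return nri_events, nri_nonevents, nri_events + nri_nonevents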

  4. Extraction of intracellular protein from Glaciozyma antarctica for proteomics analysis

    NASA Astrophysics Data System (ADS)

    Faizura, S. Nor; Farahayu, K.; Faizal, A. B. Mohd; Asmahani, A. A. S.; Amir, R.; Nazalan, N.; Diba, A. B. Farah; Muhammad, M. Nor; Munir, A. M. Abdul

    2013-11-01

    Two preparation methods for crude extracts of the psychrophilic yeast Glaciozyma antarctica were compared in order to obtain a good recovery of intracellular proteins. Extraction by a mechanical procedure using sonication was found to be more effective for obtaining a good yield compared to the alkaline treatment method. The procedure is simple, rapid, and produces a better yield. A total of 52 proteins were identified by combining both extraction methods. Most of the proteins identified in this study are involved in metabolic processes including the glycolysis pathway, the pentose phosphate pathway, pyruvate decarboxylation and also the urea cycle. Several chaperones were identified, including probable cpr1-cyclophilin (peptidylprolyl isomerase), macrolide-binding protein fkbp12 and heat shock proteins, which were postulated to accelerate proper protein folding. Characteristics of the fundamental cellular processes inferred from the expressed proteome highlight the evolutionary and functional complexity existing in this domain of life.

  5. Quiescent period respiratory gating for PET∕CT

    PubMed Central

    Liu, Chi; Alessio, Adam; Pierce, Larry; Thielemans, Kris; Wollenweber, Scott; Ganin, Alexander; Kinahan, Paul

    2010-01-01

    Purpose: To minimize respiratory motion artifacts, this work proposes quiescent period gating (QPG) methods that extract PET data from the end-expiration quiescent period and form a single PET frame with reduced motion and improved signal-to-noise properties. Methods: Two QPG methods are proposed and evaluated. Histogram-based quiescent period gating (H-QPG) extracts a fraction of PET data determined by a window of the respiratory displacement signal histogram. Cycle-based quiescent period gating (C-QPG) extracts data with a respiratory displacement signal below a specified threshold of the maximum amplitude of each individual respiratory cycle. Performances of both QPG methods were compared to ungated and five-bin phase-gated images across 21 FDG-PET∕CT patient data sets containing 31 thorax and abdomen lesions as well as with computer simulations driven by 1295 different patient respiratory traces. Image quality was evaluated in terms of the lesion SUVmax and the fraction of counts included in each gate as a surrogate for image noise. Results: For all the gating methods, image noise artifactually increases SUVmax when the fraction of counts included in each gate is less than 50%. While simulation data show that H-QPG is superior to C-QPG, the H-QPG and C-QPG methods lead to similar quantification-noise tradeoffs in patient data. Compared to ungated images, both QPG methods yield significantly higher lesion SUVmax. Compared to five-bin phase gating, the QPG methods yield significantly larger fraction of counts with similar SUVmax improvement. Both QPG methods result in increased lesion SUVmax for patients whose lesions have longer quiescent periods. Conclusions: Compared to ungated and phase-gated images, the QPG methods lead to images with less motion blurring and an improved compromise between SUVmax and fraction of counts. The QPG methods for respiratory motion compensation could effectively improve tumor quantification with minimal noise increase. PMID:20964223
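    A hedged sketch of the cycle-based gating idea (C-QPG) described above; the zero-at-end-expiration convention and the 30% default threshold are assumptions for illustration, not the study's settings.

        import numpy as np

        def cycle_based_qpg_mask(displacement, cycle_starts, threshold_fraction=0.3):
            # Select samples whose respiratory displacement is below a fraction of the
            # peak amplitude of their own cycle (displacement assumed zero at end-expiration)
            displacement = np.asarray(displacement, dtype=float)
            mask = np.zeros(displacement.shape, dtype=bool)
            bounds = list(cycle_starts) + [len(displacement)]
            for start, end in zip(bounds[:-1], bounds[1:]):
                cycle = displacement[start:end]
                if cycle.size:
                    mask[start:end] = cycle < threshold_fraction * cycle.max()
            return mask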

  6. System and method for controlling a combustor assembly

    DOEpatents

    York, William David; Ziminsky, Willy Steve; Johnson, Thomas Edward; Stevenson, Christian Xavier

    2013-03-05

    A system and method for controlling a combustor assembly are disclosed. The system includes a combustor assembly. The combustor assembly includes a combustor and a fuel nozzle assembly. The combustor includes a casing. The fuel nozzle assembly is positioned at least partially within the casing and includes a fuel nozzle. The fuel nozzle assembly further defines a head end. The system further includes a viewing device configured for capturing an image of at least a portion of the head end, and a processor communicatively coupled to the viewing device, the processor configured to compare the image to a standard image for the head end.

  7. The Teaching of Anthropology: A Comparative Study.

    ERIC Educational Resources Information Center

    Lombard, Jacques

    1984-01-01

    College-level anthropology teaching in various countries, including Belgium, France, Germany, the Netherlands, Portugal, South Africa, the United Kingdom, and Yugoslavia, is compared. Terminology is examined and historical background is provided. Also discussed are educational crises, the organization of teaching, and teaching methods. (RM)

  8. Methods for Equating Mental Tests.

    DTIC Science & Technology

    1984-11-01

    1983) compared conventional and IRT methods for equating the Test of English as a Foreign Language (TOEFL) after chaining. Three conventional and...three IRT equating methods were examined in this study; two sections of TOEFL were each (separately) equated. The IRT methods included the following: (a...group. A separate base form was established for each of the six equating methods. Instead of equating the base-form TOEFL to itself, the last (eighth

  9. A Rotor Tip Vortex Tracing Algorithm for Image Post-Processing

    NASA Technical Reports Server (NTRS)

    Overmeyer, Austin D.

    2015-01-01

    A neurite tracing algorithm, originally developed for medical image processing, was used to trace the location of the rotor tip vortex in density gradient flow visualization images. The tracing algorithm was applied to several representative test images to form case studies. The accuracy of the tracing algorithm was compared to two current methods including a manual point and click method and a cross-correlation template method. It is shown that the neurite tracing algorithm can reduce the post-processing time to trace the vortex by a factor of 10 to 15 without compromising the accuracy of the tip vortex location compared to other methods presented in literature.

  10. Estimating two indirect logging costs caused by accelerated erosion.

    Treesearch

    Glen O. Klock

    1976-01-01

    In forest areas where high soil erosion potential exists, a comparative yarding cost estimate, including the indirect costs determined by methods proposed here, shows that the total cost of using "advanced" logging methods may be less than that of "traditional" systems.

  11. Comparison of observed and predicted abutment scour at selected bridges in Maine.

    DOT National Transportation Integrated Search

    2008-01-01

    Maximum abutment-scour depths predicted with five different methods were compared to : maximum abutment-scour depths observed at 100 abutments at 50 bridge sites in Maine with a : median bridge age of 66 years. Prediction methods included the Froehli...

  12. Methods and apparatus for reducing peak wind turbine loads

    DOEpatents

    Moroz, Emilian Mieczyslaw

    2007-02-13

    A method for reducing peak loads of wind turbines in a changing wind environment includes measuring or estimating an instantaneous wind speed and direction at the wind turbine and determining a yaw error of the wind turbine relative to the measured instantaneous wind direction. The method further includes comparing the yaw error to a yaw error trigger that has different values at different wind speeds and shutting down the wind turbine when the yaw error exceeds the yaw error trigger corresponding to the measured or estimated instantaneous wind speed.
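    A minimal sketch of the control logic described in this patent abstract; the trigger schedule values are hypothetical and only illustrate a yaw error limit that tightens as wind speed rises.

        def should_shut_down(yaw_error_deg, wind_speed_ms, trigger_table):
            # trigger_table: (wind speed threshold, allowed yaw error) pairs sorted by wind speed
            allowed = trigger_table[0][1]
            for speed, limit in trigger_table:
                if wind_speed_ms >= speed:
                    allowed = limit
            return abs(yaw_error_deg) > allowed

        # hypothetical schedule: tighter yaw error limits at higher wind speeds
        triggers = [(0.0, 45.0), (10.0, 30.0), (20.0, 15.0)]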

  13. Prediction of jump phenomena in rotationally-coupled maneuvers of aircraft, including nonlinear aerodynamic effects

    NASA Technical Reports Server (NTRS)

    Young, J. W.; Schy, A. A.; Johnson, K. G.

    1977-01-01

    An analytical method has been developed for predicting critical control inputs for which nonlinear rotational coupling may cause sudden jumps in aircraft response. The analysis includes the effect of aerodynamics which are nonlinear in angle of attack. The method involves the simultaneous solution of two polynomials in roll rate, whose coefficients are functions of angle of attack and the control inputs. Results obtained using this procedure are compared with calculated time histories to verify the validity of the method for predicting jump-like instabilities.

  14. Wind Tunnel Force Balance Calibration Study - Interim Results

    NASA Technical Reports Server (NTRS)

    Rhew, Ray D.

    2012-01-01

    Wind tunnel force balance calibration is performed utilizing a variety of different methods and does not have a directly traceable standard such as the standards used for most calibration practices (weights and voltmeters). These different calibration methods and practices include, but are not limited to, the loading schedule, the load application hardware, manual and automatic systems, re-leveling and non-re-leveling. A study of the balance calibration techniques used by NASA was undertaken to develop metrics for reviewing and comparing results using sample calibrations. The study also includes balances of different designs, single and multi-piece. The calibration systems include the manual and the automatic systems provided by NASA and its vendors. The results to date will be presented along with the techniques for comparing the results. In addition, future planned calibrations and investigations based on the results will be provided.

  15. Roll, Spin, Wash, or Filter? Processing of Lipoaspirate for Autologous Fat Grafting: An Updated, Evidence-Based Review of the Literature.

    PubMed

    Cleveland, Emily C; Albano, Nicholas J; Hazen, Alexes

    2015-10-01

    The use of autologous adipose tissue harvested through liposuction techniques for soft-tissue augmentation has become commonplace among cosmetic and reconstructive surgeons alike. Despite its longstanding use in the plastic surgery community, substantial controversy remains regarding the optimal method of processing harvested lipoaspirate before grafting. This evidence-based review builds on prior examinations of the literature to evaluate both established and novel methods for lipoaspirate processing. A comprehensive, systematic review of the literature was conducted using Ovid MEDLINE in January of 2015 to identify all relevant publications subsequent to the most recent review on this topic. Randomized controlled trials, clinical trials, and comparative studies comparing at least two of the following techniques were included: decanting, cotton gauze (Telfa) rolling, centrifugation, washing, filtration, and stromal vascular fraction isolation. Nine articles comparing various methods of processing human fat for autologous grafting were selected based on inclusion and exclusion criteria. Five compared established processing techniques (i.e., decanting, cotton gauze rolling, centrifugation, and washing) and four publications evaluated newer proprietary technologies, including washing, filtration, and/or methods to isolate stromal vascular fraction. The authors failed to find compelling evidence to advocate a single technique as the superior method for processing lipoaspirate in preparation for autologous fat grafting. A paucity of high-quality data continues to limit the clinician's ability to determine the optimal method for purifying harvested adipose tissue. Novel automated technologies hold promise, particularly for large-volume fat grafting; however, extensive additional research is required to understand their true utility and efficiency in clinical settings.

  16. Assessment of higher order structure comparability in therapeutic proteins using nuclear magnetic resonance spectroscopy.

    PubMed

    Amezcua, Carlos A; Szabo, Christina M

    2013-06-01

    In this work, we applied nuclear magnetic resonance (NMR) spectroscopy to rapidly assess higher order structure (HOS) comparability in protein samples. Using a variation of the NMR fingerprinting approach described by Panjwani et al. [2010. J Pharm Sci 99(8):3334-3342], three nonglycosylated proteins spanning a molecular weight range of 6.5-67 kDa were analyzed. A simple statistical method termed easy comparability of HOS by NMR (ECHOS-NMR) was developed. In this method, HOS similarity between two samples is measured via the correlation coefficient derived from linear regression analysis of binned NMR spectra. Applications of this method include HOS comparability assessment during new product development, manufacturing process changes, supplier changes, next-generation products, and the development of biosimilars to name just a few. We foresee ECHOS-NMR becoming a routine technique applied to comparability exercises used to complement data from other analytical techniques. Copyright © 2013 Wiley Periodicals, Inc.
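    An illustrative sketch of the similarity measure described above (the bin width and the summation within bins are assumptions, not the published settings):

        import numpy as np

        def echos_similarity(spectrum_a, spectrum_b, bin_size=64):
            # Correlation coefficient between two binned 1D NMR spectra
            def bin_spectrum(s):
                s = np.asarray(s, dtype=float)
                n = (len(s) // bin_size) * bin_size
                return s[:n].reshape(-1, bin_size).sum(axis=1)
            return np.corrcoef(bin_spectrum(spectrum_a), bin_spectrum(spectrum_b))[0, 1]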

  17. Biases and power for groups comparison on subjective health measurements.

    PubMed

    Hamel, Jean-François; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Roquelaure, Yves; Sébille, Véronique

    2012-01-01

    Subjective health measurements are increasingly used in clinical research, particularly for comparisons of patient groups. Two main types of analytical strategies can be used for such data: so-called classical test theory (CTT), relying on observed scores, and models from Item Response Theory (IRT), relying on a response model that relates the item responses to a latent parameter, often called the latent trait. Whether IRT or CTT is the more appropriate method to compare two independent groups of patients on a patient-reported outcomes measurement remains unknown and was investigated using simulations. For CTT-based analyses, groups were compared using a t-test on the scores. For IRT-based analyses, several methods were compared, according to whether the Rasch model was considered with random effects or with fixed effects, and whether the group effect was included as a covariate or not. Individual latent trait values were estimated using either a deterministic method or stochastic approaches. Latent traits were then compared with a t-test. Finally, a two-step method was performed to compare the latent trait distributions, and a Wald test was performed to test the group effect in the Rasch model including group covariates. The only unbiased IRT-based method was the Wald test of the group covariate, performed on the random-effects Rasch model. This model displayed the highest observed power, which was similar to the power of the t-test on scores. These results need to be extended to the case frequently encountered in practice where data are missing and possibly informative.
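    For illustration only (a sketch, not the simulation code of the study), the CTT strategy reduces to a t-test on sum scores, while the IRT strategies rest on the Rasch response model shown below:

        import numpy as np
        from scipy import stats

        def ctt_group_comparison(responses_group1, responses_group2):
            # CTT: compare observed sum scores (items coded 0/1, one row per subject)
            return stats.ttest_ind(responses_group1.sum(axis=1), responses_group2.sum(axis=1))

        def rasch_probability(theta, difficulty):
            # Rasch model: probability of a positive item response given latent trait theta
            return 1.0 / (1.0 + np.exp(-(theta - difficulty)))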

  18. Comprehensive comparative analysis of 5'-end RNA-sequencing methods.

    PubMed

    Adiconis, Xian; Haber, Adam L; Simmons, Sean K; Levy Moonshine, Ami; Ji, Zhe; Busby, Michele A; Shi, Xi; Jacques, Justin; Lancaster, Madeline A; Pan, Jen Q; Regev, Aviv; Levin, Joshua Z

    2018-06-04

    Specialized RNA-seq methods are required to identify the 5' ends of transcripts, which are critical for studies of gene regulation, but these methods have not been systematically benchmarked. We directly compared six such methods, including the performance of five methods on a single human cellular RNA sample and a new spike-in RNA assay that helps circumvent challenges resulting from uncertainties in annotation and RNA processing. We found that the 'cap analysis of gene expression' (CAGE) method performed best for mRNA and that most of its unannotated peaks were supported by evidence from other genomic methods. We applied CAGE to eight brain-related samples and determined sample-specific transcription start site (TSS) usage, as well as a transcriptome-wide shift in TSS usage between fetal and adult brain.

  19. The use of rapid review methods in health technology assessments: 3 case studies.

    PubMed

    Kaltenthaler, Eva; Cooper, Katy; Pandor, Abdullah; Martyn-St James, Marrissa; Chatters, Robin; Wong, Ruth

    2016-08-26

    Rapid reviews are of increasing importance within health technology assessment due to time and resource constraints. There are many rapid review methods available although there is little guidance as to the most suitable methods. We present three case studies employing differing methods to suit the evidence base for each review and outline some issues to consider when selecting an appropriate method. Three recently completed systematic review short reports produced for the UK National Institute for Health Research were examined. Different approaches to rapid review methods were used in the three reports which were undertaken to inform the commissioning of services within the NHS and to inform future trial design. We describe the methods used, the reasoning behind the choice of methods and explore the strengths and weaknesses of each method. Rapid review methods were chosen to meet the needs of the review and each review had distinctly different challenges such as heterogeneity in terms of populations, interventions, comparators and outcome measures (PICO) and/or large numbers of relevant trials. All reviews included at least 10 randomised controlled trials (RCTs), each with numerous included outcomes. For the first case study (sexual health interventions), very diverse studies in terms of PICO were included. P-values and summary information only were presented due to substantial heterogeneity between studies and outcomes measured. For the second case study (premature ejaculation treatments), there were over 100 RCTs but also several existing systematic reviews. Data for meta-analyses were extracted directly from existing systematic reviews with new RCT data added where available. For the final case study (cannabis cessation therapies), studies included a wide range of interventions and considerable variation in study populations and outcomes. A brief summary of the key findings for each study was presented and narrative synthesis used to summarise results for each pair of interventions compared. Rapid review methods need to be chosen to meet both the nature of the evidence base of a review and the challenges presented by the included studies. Appropriate methods should be chosen after an assessment of the evidence base.

  20. Multi-parametric centrality method for graph network models

    NASA Astrophysics Data System (ADS)

    Ivanov, Sergei Evgenievich; Gorlushkina, Natalia Nikolaevna; Ivanova, Lubov Nikolaevna

    2018-04-01

    Graph network models are investigated to determine the centrality, weights and significance of vertices. Centrality analysis typically applies a method based on a single property of the graph vertices. In graph theory, the commonly used centrality measures are degree, closeness, betweenness, radiality, eccentricity, PageRank, status, Katz and eigenvector centrality. We propose a new multi-parametric centrality method that simultaneously incorporates several basic properties of a network member. A mathematical model of the multi-parametric centrality method is developed, and its results are compared with those of the individual centrality methods. To evaluate the multi-parametric centrality method, a graph model with hundreds of vertices is analyzed. The comparative analysis demonstrates the accuracy of the presented method, which accounts simultaneously for several basic properties of the vertices.
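
    A hedged sketch of the general idea, assuming min-max normalisation and equal weights (the paper's actual model may differ): compute several classical centrality measures per vertex and average their normalised values into a single multi-parametric score.

    ```python
    import networkx as nx
    import numpy as np

    def multi_parametric_centrality(G, weights=None):
        """Combine several per-vertex centralities into one score (illustrative only)."""
        measures = {
            "degree": nx.degree_centrality(G),
            "closeness": nx.closeness_centrality(G),
            "betweenness": nx.betweenness_centrality(G),
            "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
            "pagerank": nx.pagerank(G),
        }
        weights = weights or {name: 1.0 for name in measures}
        combined = {}
        for v in G.nodes:
            total = 0.0
            for name, values in measures.items():
                vals = np.array(list(values.values()))
                lo, hi = vals.min(), vals.max()
                norm = (values[v] - lo) / (hi - lo) if hi > lo else 0.0
                total += weights[name] * norm
            combined[v] = total / sum(weights.values())
        return combined

    G = nx.karate_club_graph()
    scores = multi_parametric_centrality(G)
    print(sorted(scores, key=scores.get, reverse=True)[:5])   # five most central vertices
    ```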

  1. Does time-lapse imaging have favorable results for embryo incubation and selection compared with conventional methods in clinical in vitro fertilization? A meta-analysis and systematic review of randomized controlled trials.

    PubMed

    Chen, Minghao; Wei, Shiyou; Hu, Junyan; Yuan, Jing; Liu, Fenghua

    2017-01-01

    The present study aimed to undertake a review of available evidence assessing whether time-lapse imaging (TLI) has favorable outcomes for embryo incubation and selection compared with conventional methods in clinical in vitro fertilization (IVF). PubMed, EMBASE, the Cochrane Library and ClinicalTrials.gov were searched up to February 2017 for randomized controlled trials (RCTs) comparing TLI with conventional methods. Studies that randomized women and studies that randomized oocytes were both included. For studies that randomized women, the primary outcomes were live birth and ongoing pregnancy, and the secondary outcomes were clinical pregnancy and miscarriage; for studies that randomized oocytes, the primary outcome was blastocyst rate and the secondary outcome was good-quality embryo on Day 2/3. Subgroup analysis was conducted based on differences in incubation and embryo selection between groups. Ten RCTs were included, four that randomized oocytes and six that randomized women. For the oocyte-based review, the pooled analysis observed no significant difference between the TLI group and the control group for blastocyst rate [relative risk (RR) 1.08, 95% CI 0.94-1.25, I2 = 0%, two studies, including 1154 embryos]. The quality of evidence was moderate for all outcomes in the oocyte-based review. For the woman-based review, only one study provided the live birth rate (RR 1.23, 95% CI 1.06-1.44, I2 not applicable, one study, including 842 women), and the pooled result showed no significant difference in ongoing pregnancy rate (RR 1.04, 95% CI 0.80-1.36, I2 = 59%, four studies, including 1403 women) between the two groups. The quality of the evidence was low or very low for all outcomes in the woman-based review. Currently there is insufficient evidence to support that TLI is superior to conventional methods for human embryo incubation and selection. In consideration of the limitations and flaws of the included studies, more well-designed RCTs are still needed to comprehensively evaluate the effectiveness of clinical TLI use.
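
    For readers unfamiliar with how relative risks such as those quoted above are pooled, the sketch below shows inverse-variance pooling of log relative risks with a 95% confidence interval. The event counts are invented and do not come from the included trials.

    ```python
    import math

    def pooled_rr(studies):
        """studies: list of (events_tli, n_tli, events_ctrl, n_ctrl) tuples."""
        weights, log_rrs = [], []
        for a, n1, c, n2 in studies:
            log_rr = math.log((a / n1) / (c / n2))
            var = 1 / a - 1 / n1 + 1 / c - 1 / n2      # approximate variance of log RR
            weights.append(1 / var)
            log_rrs.append(log_rr)
        pooled = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
        se = math.sqrt(1 / sum(weights))
        return (math.exp(pooled),
                math.exp(pooled - 1.96 * se),
                math.exp(pooled + 1.96 * se))

    rr, lo, hi = pooled_rr([(120, 300, 110, 310), (95, 250, 90, 240)])   # made-up counts
    print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```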

  2. A scoping review of malaria forecasting: past work and future directions

    PubMed Central

    Zinszer, Kate; Verma, Aman D; Charland, Katia; Brewer, Timothy F; Brownstein, John S; Sun, Zhuoyu; Buckeridge, David L

    2012-01-01

    Objectives There is a growing body of literature on malaria forecasting methods and the objective of our review is to identify and assess methods, including predictors, used to forecast malaria. Design Scoping review. Two independent reviewers searched information sources, assessed studies for inclusion and extracted data from each study. Information sources Search strategies were developed and the following databases were searched: CAB Abstracts, EMBASE, Global Health, MEDLINE, ProQuest Dissertations & Theses and Web of Science. Key journals and websites were also manually searched. Eligibility criteria for included studies We included studies that forecasted incidence, prevalence or epidemics of malaria over time. A description of the forecasting model and an assessment of the forecast accuracy of the model were requirements for inclusion. Studies were restricted to human populations and to autochthonous transmission settings. Results We identified 29 different studies that met our inclusion criteria for this review. The forecasting approaches included statistical modelling, mathematical modelling and machine learning methods. Climate-related predictors were used consistently in forecasting models, with the most common predictors being rainfall, relative humidity, temperature and the normalised difference vegetation index. Model evaluation was typically based on a reserved portion of data and accuracy was measured in a variety of ways including mean-squared error and correlation coefficients. We could not compare the forecast accuracy of models from the different studies as the evaluation measures differed across the studies. Conclusions Applying different forecasting methods to the same data, exploring the predictive ability of non-environmental variables, including transmission reducing interventions and using common forecast accuracy measures will allow malaria researchers to compare and improve models and methods, which should improve the quality of malaria forecasting. PMID:23180505
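
    The evaluation pattern described above (fit on most of the series, reserve the tail, score with mean-squared error and a correlation coefficient) can be illustrated with a toy example. The seasonal-naive "model" and simulated case counts below are placeholders, not any study's data or method.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    months = np.arange(120)
    cases = 200 + 80 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 20, months.size)

    train, test = cases[:108], cases[108:]
    forecast = train[-12:]                       # seasonal-naive: repeat the last observed year

    mse = np.mean((test - forecast) ** 2)        # mean-squared error on the reserved portion
    corr = np.corrcoef(test, forecast)[0, 1]     # correlation between forecast and observed
    print(f"MSE = {mse:.1f}, r = {corr:.2f}")
    ```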

  3. Die sokratische Lehrstrategie und ihre Relevanz fur die heutige Didaktik (The Socratic Method and Its Relevance for Modern Teaching).

    ERIC Educational Resources Information Center

    Kanakis, Ioannis

    1997-01-01

    Examines the Socratic method through a comparative analysis of early Platonic dialogs with theories of critical rationalism and cognitive theories based on achievement motivation. Presents details of the Socratic strategy of teaching and learning, including critical reflection, conversation, and intellectual honesty; asserts that these methods are…

  4. Factor Retention in Exploratory Factor Analysis: A Comparison of Alternative Methods.

    ERIC Educational Resources Information Center

    Mumford, Karen R.; Ferron, John M.; Hines, Constance V.; Hogarty, Kristine Y.; Kromrey, Jeffery D.

    This study compared the effectiveness of 10 methods of determining the number of factors to retain in exploratory common factor analysis. The 10 methods included the Kaiser rule and a modified Kaiser criterion, 3 variations of parallel analysis, 4 regression-based variations of the scree procedure, and the minimum average partial procedure. The…
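
    As an illustration of one of the retention criteria named above, the sketch below implements a basic form of Horn's parallel analysis: retain factors whose observed correlation-matrix eigenvalues exceed the mean eigenvalues obtained from random data of the same dimensions. The simulated data set and the use of a mean (rather than percentile) threshold are assumptions.

    ```python
    import numpy as np

    def parallel_analysis(data, n_sims=100, seed=0):
        rng = np.random.default_rng(seed)
        n, p = data.shape
        observed = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
        random_eigs = np.zeros((n_sims, p))
        for i in range(n_sims):
            sim = rng.standard_normal((n, p))
            random_eigs[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False)))[::-1]
        threshold = random_eigs.mean(axis=0)
        return int(np.sum(observed > threshold)), observed, threshold

    # simulated data with two underlying factors
    rng = np.random.default_rng(42)
    latent = rng.standard_normal((300, 2))
    loadings = rng.standard_normal((2, 8))
    data = latent @ loadings + 0.5 * rng.standard_normal((300, 8))

    k, obs, thr = parallel_analysis(data)
    print(f"Factors to retain: {k}")
    ```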

  5. Does time-lapse imaging have favorable results for embryo incubation and selection compared with conventional methods in clinical in vitro fertilization? A meta-analysis and systematic review of randomized controlled trials

    PubMed Central

    Yuan, Jing; Liu, Fenghua

    2017-01-01

    Objective The present study aimed to undertake a review of available evidence assessing whether time-lapse imaging (TLI) has favorable outcomes for embryo incubation and selection compared with conventional methods in clinical in vitro fertilization (IVF). Methods PubMed, EMBASE, the Cochrane Library and ClinicalTrials.gov were searched up to February 2017 for randomized controlled trials (RCTs) comparing TLI with conventional methods. Studies that randomized women and studies that randomized oocytes were both included. For studies that randomized women, the primary outcomes were live birth and ongoing pregnancy, and the secondary outcomes were clinical pregnancy and miscarriage; for studies that randomized oocytes, the primary outcome was blastocyst rate and the secondary outcome was good-quality embryo on Day 2/3. Subgroup analysis was conducted based on differences in incubation and embryo selection between groups. Results Ten RCTs were included, four that randomized oocytes and six that randomized women. For the oocyte-based review, the pooled analysis observed no significant difference between the TLI group and the control group for blastocyst rate [relative risk (RR) 1.08, 95% CI 0.94–1.25, I2 = 0%, two studies, including 1154 embryos]. The quality of evidence was moderate for all outcomes in the oocyte-based review. For the woman-based review, only one study provided the live birth rate (RR 1.23, 95% CI 1.06–1.44, I2 not applicable, one study, including 842 women), and the pooled result showed no significant difference in ongoing pregnancy rate (RR 1.04, 95% CI 0.80–1.36, I2 = 59%, four studies, including 1403 women) between the two groups. The quality of the evidence was low or very low for all outcomes in the woman-based review. Conclusions Currently there is insufficient evidence to support that TLI is superior to conventional methods for human embryo incubation and selection. In consideration of the limitations and flaws of the included studies, more well-designed RCTs are still needed to comprehensively evaluate the effectiveness of clinical TLI use. PMID:28570713

  6. Differentiated Instruction in the Classroom

    ERIC Educational Resources Information Center

    Kelly, Gretchen

    2013-01-01

    Low achievement on standardized tests may be attributed to many factors, including teaching methods. Differentiated instruction has been identified as a teaching method using different learning modalities that appeal to varied student interests with individualized instruction. The purpose of this quantitative study was to compare whole-group…

  7. A Statistical Method for Syntactic Dialectometry

    ERIC Educational Resources Information Center

    Sanders, Nathan C.

    2010-01-01

    This dissertation establishes the utility and reliability of a statistical distance measure for syntactic dialectometry, expanding dialectometry's methods to include syntax as well as phonology and the lexicon. It establishes the measure's reliability by comparing its results to those of dialectology and phonological dialectometry on Swedish…

  8. Comparison of the performance of different DFT methods in the calculations of the molecular structure and vibration spectra of serotonin (5-hydroxytryptamine, 5-HT)

    NASA Astrophysics Data System (ADS)

    Yang, Yue; Gao, Hongwei

    2012-04-01

    Serotonin (5-hydroxytryptamine, 5-HT) is a monoamine neurotransmitter which plays an important role in treating acute or clinical stress. The comparative performance of different density functional theory (DFT) methods with various basis sets in predicting the molecular structure and vibration spectra of serotonin is reported. The calculation results of different methods, including mPW1PW91, HCTH, SVWN, PBEPBE, B3PW91 and B3LYP, with various basis sets, including LANL2DZ, SDD, LANL2MB, 6-31G, 6-311++G and 6-311+G*, were compared with the experimental data. Notably, the SVWN/6-311++G and SVWN/6-311+G* levels afford the best quality in predicting the structure of serotonin. The results also indicate that the PBEPBE/LANL2DZ level shows better performance in predicting the vibration spectra of serotonin than the other DFT methods.

  9. Monitoring system and methods for a distributed and recoverable digital control system

    NASA Technical Reports Server (NTRS)

    Stange, Kent (Inventor); Hess, Richard (Inventor); Kelley, Gerald B (Inventor); Rogers, Randy (Inventor)

    2010-01-01

    A monitoring system and methods are provided for a distributed and recoverable digital control system. The monitoring system generally comprises two independent monitoring planes within the control system. The first monitoring plane is internal to the computing units in the control system, and the second monitoring plane is external to the computing units. The internal first monitoring plane includes two in-line monitors. The first internal monitor is a self-checking, lock-step-processing monitor with integrated rapid recovery capability. The second internal monitor includes one or more reasonableness monitors, which compare actual effector position with commanded effector position. The external second monitoring plane includes two monitors. The first external monitor includes a pre-recovery computing monitor, and the second external monitor includes a post-recovery computing monitor. Various methods for implementing the monitoring functions are also disclosed.

  10. Gaussian Quadrature is an efficient method for the back-transformation in estimating the usual intake distribution when assessing dietary exposure.

    PubMed

    Dekkers, A L M; Slob, W

    2012-10-01

    In dietary exposure assessment, statistical methods exist for estimating the usual intake distribution from daily intake data. These methods transform the dietary intake data to normal observations, eliminate the within-person variance, and then back-transform the data to the original scale. We propose Gaussian Quadrature (GQ), a numerical integration method, as an efficient way of back-transformation. We compare GQ with six published methods. One method uses a log-transformation, while the other methods, including GQ, use a Box-Cox transformation. This study shows that, for various parameter choices, the methods with a Box-Cox transformation estimate the theoretical usual intake distributions quite well, although one method, a Taylor approximation, is less accurate. Two applications--on folate intake and fruit consumption--confirmed these results. In one extreme case, some methods, including GQ, could not be applied for low percentiles. We solved this problem by modifying GQ. One method is based on the assumption that the daily intakes are log-normally distributed. Even if this condition is not fulfilled, the log-transformation performs well as long as the within-individual variance is small compared to the mean. We conclude that the modified GQ is an efficient, fast and accurate method for estimating the usual intake distribution. Copyright © 2012 Elsevier Ltd. All rights reserved.
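
    A minimal sketch of the back-transformation step, assuming a simple log transformation: if log daily intake equals a person-level mean plus normal within-person error, a person's usual intake on the original scale is E[exp(m + e)], which Gauss-Hermite quadrature evaluates with a handful of nodes. The means, within-person standard deviation and number of nodes below are illustrative, not the paper's values.

    ```python
    import numpy as np
    from numpy.polynomial.hermite import hermgauss

    def usual_intake(mean_log, sd_within, inverse=np.exp, n_nodes=9):
        """E[inverse(mean_log + e)] with e ~ N(0, sd_within**2), via Gauss-Hermite quadrature."""
        nodes, weights = hermgauss(n_nodes)
        values = inverse(mean_log + np.sqrt(2.0) * sd_within * nodes)
        return (weights * values).sum() / np.sqrt(np.pi)

    # assumed person-level means on the log scale and a common within-person SD
    person_means = np.random.default_rng(3).normal(5.0, 0.4, size=1000)
    usual = np.array([usual_intake(m, sd_within=0.6) for m in person_means])
    print(np.percentile(usual, [5, 50, 95]))   # percentiles of the usual intake distribution
    ```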

  11. Force measuring valve assemblies, systems including such valve assemblies and related methods

    DOEpatents

    DeWall, Kevin George [Pocatello, ID; Garcia, Humberto Enrique [Idaho Falls, ID; McKellar, Michael George [Idaho Falls, ID

    2012-04-17

    Methods of evaluating a fluid condition may include stroking a valve member and measuring a force acting on the valve member during the stroke. Methods of evaluating a fluid condition may include measuring a force acting on a valve member in the presence of fluid flow over a period of time and evaluating at least one of the frequency of changes in the measured force over the period of time and the magnitude of the changes in the measured force over the period of time to identify the presence of an anomaly in a fluid flow and, optionally, its estimated location. Methods of evaluating a valve condition may include directing a fluid flow through a valve while stroking a valve member, measuring a force acting on the valve member during the stroke, and comparing the measured force to a reference force. Valve assemblies and related systems are also disclosed.

  12. System for controlling a hybrid energy system

    DOEpatents

    Hoff, Brian D.; Akasam, Sivaprasad

    2013-01-29

    A method includes identifying a first operating sequence of a repeated operation of at least one non-traction load. The method also includes determining first and second parameters respectively indicative of a requested energy and output energy of the at least one non-traction load and comparing the determined first and second parameters at a plurality of time increments of the first operating sequence. The method also includes determining a third parameter of the hybrid energy system indicative of energy regenerated from the at least one non-traction load and monitoring the third parameter at the plurality of time increments of the first operating sequence. The method also includes determining at least one of an energy deficiency or an energy surplus associated with the non-traction load of the hybrid energy system and selectively adjusting energy stored within the storage device during at least a portion of a second operating sequence.

  13. Comparison and evaluation of fusion methods used for GF-2 satellite image in coastal mangrove area

    NASA Astrophysics Data System (ADS)

    Ling, Chengxing; Ju, Hongbo; Liu, Hua; Zhang, Huaiqing; Sun, Hua

    2018-04-01

    The GF-2 satellite has the highest spatial resolution of any remote sensing satellite in the history of China's satellite development. In this study, three traditional fusion methods, Brovey, Gram-Schmidt and Color Normalized (CN), were compared with the newer fusion method NNDiffuse, using qualitative assessment and quantitative fusion quality indices including information entropy, variance, mean gradient, deviation index and spectral correlation coefficient. The analysis results show that the NNDiffuse method was optimal in both the qualitative and quantitative analyses, and it was more effective for subsequent remote sensing information extraction and for forest and wetland resource monitoring applications.

  14. Multi-laboratory evaluations of the performance of Catellicoccus marimammalium PCR assays developed to target gull fecal sources

    USGS Publications Warehouse

    Sinigalliano, Christopher D.; Ervin, Jared S.; Van De Werfhorst, Laurie C.; Badgley, Brian D.; Ballestée, Elisenda; Bartkowiaka, Jakob; Boehm, Alexandria B.; Byappanahalli, Muruleedhara N.; Goodwin, Kelly D.; Gourmelon, Michèle; Griffith, John; Holden, Patricia A.; Jay, Jenny; Layton, Blythe; Lee, Cheonghoon; Lee, Jiyoung; Meijer, Wim G.; Noble, Rachel; Raith, Meredith; Ryu, Hodon; Sadowsky, Michael J.; Schriewer, Alexander; Wang, Dan; Wanless, David; Whitman, Richard; Wuertz, Stefan; Santo Domingo, Jorge W.

    2013-01-01

    Here we report results from a multi-laboratory (n = 11) evaluation of four different PCR methods targeting the 16S rRNA gene of Catellicoccus marimammalium originally developed to detect gull fecal contamination in coastal environments. The methods included a conventional end-point PCR method, a SYBR® Green qPCR method, and two TaqMan® qPCR methods. Different techniques for data normalization and analysis were tested. Data analysis methods had a pronounced impact on assay sensitivity and specificity calculations. Across-laboratory standardization of metrics including the lower limit of quantification (LLOQ), target detected but not quantifiable (DNQ), and target not detected (ND) significantly improved results compared to results submitted by individual laboratories prior to definition standardization. The unit of measure used for data normalization also had a pronounced effect on measured assay performance. Data normalization to DNA mass improved quantitative method performance as compared to enterococcus normalization. The MST methods tested here were originally designed for gulls but were found in this study to also detect feces from other birds, particularly feces composited from pigeons. Sequencing efforts showed that some pigeon feces from California contained sequences similar to C. marimammalium found in gull feces. These data suggest that the prevalence, geographic scope, and ecology of C. marimammalium in host birds other than gulls require further investigation. This study represents an important first step in the multi-laboratory assessment of these methods and highlights the need to broaden and standardize additional evaluations, including environmentally relevant target concentrations in ambient waters from diverse geographic regions.

  15. RNA sequencing analysis reveals quiescent microglia isolation methods from postnatal mouse brains and limitations of BV2 cells.

    PubMed

    He, Yingbo; Yao, Xiang; Taylor, Natalie; Bai, Yuchen; Lovenberg, Timothy; Bhattacharya, Anindya

    2018-05-22

    Microglia play key roles in neuron-glia interaction, neuroinflammation, neural repair, and neurotoxicity. Currently, various microglial in vitro models, including primary microglia derived from distinct isolation methods and immortalized microglial cell lines, are extensively used. However, the diversity of these existing models raises difficulty in parallel comparison across studies, since microglia are sensitive to environmental changes and different models are therefore likely to show widely varied responses to the same stimuli. To better understand the involvement of microglia in pathophysiological situations, it is critical to establish a reliable microglial model system. With postnatal mouse brains, we isolated microglia using three general methods, including shaking, mild trypsinization, and CD11b magnetic-activated cell sorting (MACS), and applied RNA sequencing to compare the transcriptomes of the isolated cells. Additionally, we generated a genome-wide dataset by RNA sequencing of the immortalized BV2 microglial cell line to compare with primary microglia. Furthermore, based on the outcomes of the transcriptional analysis, we compared cellular functions between primary microglia and BV2 cells, including immune responses to LPS by quantitative RT-PCR and Luminex Multiplex Assay, TGFβ signaling probed by Western blot, and direct migration by chemotaxis assay. We found that although the yield and purity of microglia were comparable among the three isolation methods, mild trypsinization drove microglia into a relatively active state, evidenced by a high proportion of amoeboid microglia, enhanced expression of microglial activation genes, and suppression of microglial quiescent genes. In contrast, CD11b MACS was the most reliable and consistent method, and microglia isolated by this method maintained a relatively resting state. Transcriptional and functional analyses revealed that, compared with primary microglia, BV2 cells retained most immune functions, such as responses to LPS, but showed limited TGFβ signaling and chemotaxis upon the chemoattractant C5a. Collectively, we determined the optimal isolation methods for quiescent microglia and characterized the limitations of BV2 cells as an alternative to primary microglia. Considering the transcriptional and functional differences, caution should be taken when extrapolating data from various microglial models. In addition, our RNA sequencing database serves as a valuable resource to provide novel insights for the appropriate application of microglia as in vitro models.

  16. Tags, wireless communication systems, tag communication methods, and wireless communications methods

    DOEpatents

    Scott, Jeff W.; Pratt, Richard M. [Richland, WA]

    2006-09-12

    Tags, wireless communication systems, tag communication methods, and wireless communications methods are described. In one aspect, a tag includes a plurality of antennas configured to receive a plurality of first wireless communication signals comprising data from a reader, a plurality of rectifying circuits coupled with respective individual ones of the antennas and configured to provide rectified signals corresponding to the first wireless communication signals, wherein the rectified signals are combined to produce a composite signal, an adaptive reference circuit configured to vary a reference signal responsive to the composite signal, a comparator coupled with the adaptive reference circuit and the rectifying circuits and configured to compare the composite signal with respect to the reference signal and to output the data responsive to the comparison, and processing circuitry configured to receive the data from the comparator and to process the data.

  17. Prosthetic component segmentation with blur compensation: a fast method for 3D fluoroscopy.

    PubMed

    Tarroni, Giacomo; Tersi, Luca; Corsi, Cristiana; Stagni, Rita

    2012-06-01

    A new method for prosthetic component segmentation from fluoroscopic images is presented. The hybrid approach we propose combines diffusion filtering, region growing and level-set techniques without exploiting any a priori knowledge of the analyzed geometry. The method was evaluated on a synthetic dataset including 270 images of knee and hip prostheses merged with real fluoroscopic data simulating different conditions of blurring and illumination gradient. The performance of the method was assessed by comparing estimated contours to references using different metrics. Results showed that the segmentation procedure is fast, accurate, independent of the operator as well as of the specific geometrical characteristics of the prosthetic component, and able to compensate for the amount of blurring and illumination gradient. Importantly, the method allows a strong reduction of the required user interaction time when compared to traditional segmentation techniques. Its effectiveness and robustness in different image conditions, together with simplicity and fast implementation, make this prosthetic component segmentation procedure promising and suitable for multiple clinical applications, including assessment of in vivo joint kinematics in a variety of cases.

  18. Comparative analysis of autofocus functions in digital in-line phase-shifting holography.

    PubMed

    Fonseca, Elsa S R; Fiadeiro, Paulo T; Pereira, Manuela; Pinheiro, António

    2016-09-20

    Numerical reconstruction of digital holograms relies on a precise knowledge of the original object position. However, there are a number of relevant applications where this parameter is not known in advance and an efficient autofocusing method is required. This paper addresses the problem of finding optimal focusing methods for use in reconstruction of digital holograms of macroscopic amplitude and phase objects, using digital in-line phase-shifting holography in transmission mode. Fifteen autofocus measures, including spatial-, spectral-, and sparsity-based methods, were evaluated for both synthetic and experimental holograms. The Fresnel transform and the angular spectrum reconstruction methods were compared. Evaluation criteria included unimodality, accuracy, resolution, and computational cost. Autofocusing under angular spectrum propagation tends to perform better with respect to accuracy and unimodality criteria. Phase objects are, generally, more difficult to focus than amplitude objects. The normalized variance, the standard correlation, and the Tenenbaum gradient are the most reliable spatial-based metrics, combining computational efficiency with good accuracy and resolution. A good trade-off between focus performance and computational cost was found for the Fresnelet sparsity method.
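
    Two of the spatial focus measures named above are easy to state concretely. The snippet below is an illustrative sketch, not the authors' implementation: it computes the normalized variance and the Tenenbaum gradient (sum of squared Sobel responses) on a synthetic image and its crudely blurred copy, expecting the sharper image to score higher.

    ```python
    import numpy as np
    from scipy.ndimage import sobel

    def normalized_variance(img):
        mu = img.mean()
        return ((img - mu) ** 2).mean() / mu

    def tenenbaum_gradient(img):
        gx, gy = sobel(img, axis=0), sobel(img, axis=1)
        return (gx ** 2 + gy ** 2).sum()

    rng = np.random.default_rng(0)
    sharp = rng.random((256, 256))
    blurred = 0.5 * (np.roll(sharp, 1, axis=0) + sharp)     # crude blur for illustration

    for name, metric in [("normalized variance", normalized_variance),
                         ("Tenenbaum gradient", tenenbaum_gradient)]:
        print(name, metric(sharp) > metric(blurred))        # sharper image should score higher
    ```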

  19. The Effect of Laminar Flow on Rotor Hover Performance

    NASA Technical Reports Server (NTRS)

    Overmeyer, Austin D.; Martin, Preston B.

    2017-01-01

    The topic of laminar flow effects on hover performance is introduced with respect to some historical efforts where laminar flow was either measured or attempted. An analysis method is outlined using a combined blade element momentum method coupled to an airfoil analysis method that includes the full e^N transition model. The analysis results compared well with the measured hover performance, including the measured location of transition on both the upper and lower blade surfaces. The analysis method is then used to understand the upper limits of hover efficiency as a function of disk loading. The impact of laminar flow is higher at low disk loading, but significant improvement in terms of power loading appears possible even at high disk loading approaching 20 psf. An optimum planform design equation is derived for cases of zero profile drag and finite drag levels. These results are intended to be a guide for design studies and as a benchmark for comparing higher fidelity analysis results. The details of the analysis method are given to enable other researchers to use the same approach for comparison with other approaches.

  20. A comparative review of optical surface contamination assessment techniques

    NASA Technical Reports Server (NTRS)

    Heaney, James B.

    1987-01-01

    This paper will review the relative sensitivities and practicalities of the common surface analytical methods that are used to detect and identify unwelcome adsorbants on optical surfaces. The compared methods include visual inspection, simple reflectometry and transmissiometry, ellipsometry, infrared absorption and attenuated total reflectance spectroscopy (ATR), Auger electron spectroscopy (AES), scanning electron microscopy (SEM), secondary ion mass spectrometry (SIMS), and mass accretion determined by quartz crystal microbalance (QCM). The discussion is biased toward those methods that apply optical thin film analytical techniques to spacecraft optical contamination problems. Examples are cited from both ground based and in-orbit experiments.

  1. Comparing direct and iterative equation solvers in a large structural analysis software system

    NASA Technical Reports Server (NTRS)

    Poole, E. L.

    1991-01-01

    Two direct Choleski equation solvers and two iterative preconditioned conjugate gradient (PCG) equation solvers used in a large structural analysis software system are described. The two direct solvers are implementations of the Choleski method for variable-band matrix storage and sparse matrix storage. The two iterative PCG solvers include the Jacobi conjugate gradient method and an incomplete Choleski conjugate gradient method. The performance of the direct and iterative solvers is compared by solving several representative structural analysis problems. Some key factors affecting the performance of the iterative solvers relative to the direct solvers are identified.
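
    As a compact illustration of one of the iterative solvers mentioned above, the sketch below implements a Jacobi (diagonal) preconditioned conjugate gradient iteration for a symmetric positive-definite system. The random test matrix stands in for a structural stiffness matrix and is an assumption for demonstration.

    ```python
    import numpy as np

    def jacobi_pcg(A, b, tol=1e-8, max_iter=1000):
        """Preconditioned conjugate gradient with a Jacobi (diagonal) preconditioner."""
        x = np.zeros_like(b)
        M_inv = 1.0 / np.diag(A)                 # Jacobi preconditioner
        r = b - A @ x
        z = M_inv * r
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol * np.linalg.norm(b):
                break
            z = M_inv * r
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    rng = np.random.default_rng(0)
    Q = rng.standard_normal((200, 200))
    A = Q @ Q.T + 200 * np.eye(200)              # well-conditioned SPD test matrix
    b = rng.standard_normal(200)
    x = jacobi_pcg(A, b)
    print(np.linalg.norm(A @ x - b))             # residual norm
    ```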

  2. COMPARISON OF NONLINEAR DYNAMICS OPTIMIZATION METHODS FOR APS-U

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Y.; Borland, Michael

    Many different objectives and genetic algorithms have been proposed for storage ring nonlinear dynamics performance optimization. These optimization objectives include nonlinear chromaticities and driving/detuning terms, on-momentum and off-momentum dynamic acceptance, chromatic detuning, local momentum acceptance, variation of transverse invariant, Touschek lifetime, etc. In this paper, the effectiveness of several different optimization methods and objectives are compared for the nonlinear beam dynamics optimization of the Advanced Photon Source upgrade (APS-U) lattice. The optimized solutions from these different methods are preliminarily compared in terms of the dynamic acceptance, local momentum acceptance, chromatic detuning, and other performance measures.

  3. [Molecular typing methods for Pasteurella multocida-A review].

    PubMed

    Peng, Zhong; Liang, Wan; Wu, Bin

    2016-10-04

    Pasteurella multocida is an important gram-negative pathogenic bacterium that can infect a wide range of animals. Humans can also be infected by P. multocida via animal bites or scratches. Current typing methods for P. multocida include serological typing methods and molecular typing methods. Serological typing methods are based on immunological assays, which are too complicated for clinical bacteriological studies. The molecular methods, including multiple PCRs and multilocus sequence typing (MLST), are more suitable for clinical bacteriological studies of P. multocida; because of their simple operation, high efficiency and accurate detection compared with the traditional serological typing methods, they are widely used. In the current review, we briefly describe the molecular typing methods for P. multocida. Our aim is to provide a knowledge foundation for clinical bacteriological investigation, especially molecular investigation, of P. multocida.

  4. Effects of test method and participant musical training on preference ratings of stimuli with different reverberation times.

    PubMed

    Lawless, Martin S; Vigeant, Michelle C

    2017-10-01

    Selecting an appropriate listening test design for concert hall research depends on several factors, including listening test method and participant critical-listening experience. Although expert listeners afford more reliable data, their perceptions may not be broadly representative. The present paper contains two studies that examined the validity and reliability of the data obtained from two listening test methods, a successive and a comparative method, and two types of participants, musicians and non-musicians. Participants rated their overall preference of auralizations generated from eight concert hall conditions with a range of reverberation times (0.0-7.2 s). Study 1, with 34 participants, assessed the two methods. The comparative method yielded similar results and reliability as the successive method. Additionally, the comparative method was rated as less difficult and more preferable. For study 2, an additional 37 participants rated the stimuli using the comparative method only. An analysis of variance of the responses from both studies revealed that musicians are better than non-musicians at discerning their preferences across stimuli. This result was confirmed with a k-means clustering analysis on the entire dataset that revealed five preference groups. Four groups exhibited clear preferences to the stimuli, while the fifth group, predominantly comprising non-musicians, demonstrated no clear preference.
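
    The grouping analysis mentioned above can be sketched in a few lines of code. The snippet below runs k-means on simulated per-listener preference ratings across eight stimuli; the ratings are random placeholders, and using k = 5 simply mirrors the number of groups reported, so nothing here reproduces the study's data or settings.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(7)
    n_listeners, n_stimuli = 71, 8                          # 34 + 37 participants, 8 conditions
    ratings = rng.normal(0, 1, (n_listeners, n_stimuli))    # placeholder preference ratings

    km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(ratings)
    for k in range(5):
        members = ratings[km.labels_ == k]
        print(f"group {k}: n={len(members)}, mean profile={members.mean(axis=0).round(2)}")
    ```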

  5. A new smoothing function to introduce long-range electrostatic effects in QM/MM calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fang, Dong; Department of Chemistry, University of Wisconsin, Madison, Wisconsin 53706; Duke, Robert E.

    2015-07-28

    A new method to account for long range electrostatic contributions is proposed and implemented for quantum mechanics/molecular mechanics long range electrostatic correction (QM/MM-LREC) calculations. This method involves the use of the minimum image convention under periodic boundary conditions and a new smoothing function for energies and forces at the cutoff boundary for the Coulomb interactions. Compared to conventional QM/MM calculations without long-range electrostatic corrections, the new method effectively includes effects on the MM environment in the primary image from its replicas in the neighborhood. QM/MM-LREC offers three useful features including the avoidance of calculations in reciprocal space (k-space), with the concomitant avoidance of having to reproduce (analytically or approximately) the QM charge density in k-space, and the straightforward availability of analytical Hessians. The new method is tested and compared with results from smooth particle mesh Ewald (PME) for three systems including a box of neat water, a double proton transfer reaction, and the geometry optimization of the critical point structures for the rate limiting step of the DNA dealkylase AlkB. As with other smoothing or shifting functions, relatively large cutoffs are necessary to achieve comparable accuracy with PME. For the double-proton transfer reaction, the use of a 22 Å cutoff shows a close reaction energy profile and geometries of stationary structures with QM/MM-LREC compared to conventional QM/MM with no truncation. Geometry optimization of stationary structures for the hydrogen abstraction step by AlkB shows some differences between QM/MM-LREC and the conventional QM/MM. These differences underscore the necessity of the inclusion of the long-range electrostatic contribution.

  6. Errors in reporting on dissolution research: methodological and statistical implications.

    PubMed

    Jasińska-Stroschein, Magdalena; Kurczewska, Urszula; Orszulak-Michalak, Daria

    2017-02-01

    In vitro dissolution testing provides useful information at clinical and preclinical stages of the drug development process. The study includes pharmaceutical papers on dissolution research published in Polish journals between 2010 and 2015. They were analyzed with regard to information provided by authors about chosen methods, performed validation, statistical reporting or assumptions used to properly compare release profiles considering the present guideline documents addressed to dissolution methodology and its validation. Of all the papers included in the study, 23.86% presented at least one set of validation parameters, 63.64% gave the results of the weight uniformity test, 55.68% content determination, 97.73% dissolution testing conditions, and 50% discussed a comparison of release profiles. The assumptions for methods used to compare dissolution profiles were discussed in 6.82% of papers. By means of example analyses, we demonstrate that the outcome can be influenced by the violation of several assumptions or selection of an improper method to compare dissolution profiles. A clearer description of the procedures would undoubtedly increase the quality of papers in this area.

  7. Old-Age Disability and Wealth among Return Mexican Migrants from the United States

    PubMed Central

    Wong, Rebeca; Gonzalez-Gonzalez, Cesar

    2012-01-01

    Objective To examine the old-age consequences of international migration with a focus on disability and wealth from the perspective of the origin country. Methods Analysis sample includes persons aged 60+ from the Mexican Health and Aging Study, a national survey of older-adults in Mexico in 2001. Univariate methods are used to present a comparative profile of return migrants. Multivariate models are estimated for physical disability and wealth. Results Gender differences are profound. Return migrant women are more likely to be disabled while men are wealthier than comparable older adults in Mexico. Discussion Compared to current older adults, younger cohorts of Mexico-U.S. migrants increasingly include women, and more migrants seem likely to remain in the United States rather than return, thus more research will be needed on the old-age conditions of migrants in both countries. PMID:20876848

  8. Measuring digit lengths with 3D digital stereophotogrammetry: A comparison across methods.

    PubMed

    Gremba, Allison; Weinberg, Seth M

    2018-05-09

    We compared digital 3D stereophotogrammetry to more traditional measurement methods (direct anthropometry and 2D scanning) to capture digit lengths and ratios. The length of the second and fourth digits was measured by each method and the second-to-fourth ratio was calculated. For each digit measurement, intraobserver agreement was calculated for each of the three collection methods. Further, measurements from the three methods were compared directly to one another. Agreement statistics included the intraclass correlation coefficient (ICC) and technical error of measurement (TEM). Intraobserver agreement statistics for the digit length measurements were high for all three methods; ICC values exceeded 0.97 and TEM values were below 1 mm. For digit ratio, intraobserver agreement was also acceptable for all methods, with direct anthropometry exhibiting lower agreement (ICC = 0.87) compared to indirect methods. For the comparison across methods, the overall agreement was high for digit length measurements (ICC values ranging from 0.93 to 0.98; TEM values below 2 mm). For digit ratios, high agreement was observed between the two indirect methods (ICC = 0.93), whereas indirect methods showed lower agreement when compared to direct anthropometry (ICC < 0.75). Digit measurements and derived ratios from 3D stereophotogrammetry showed high intraobserver agreement (similar to more traditional methods) suggesting that landmarks could be placed reliably on 3D hand surface images. While digit length measurements were found to be comparable across all three methods, ratios derived from direct anthropometry tended to be higher than those calculated indirectly from 2D or 3D images. © 2018 Wiley Periodicals, Inc.
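
    For concreteness, the snippet below sketches the two agreement statistics named above, an ICC (here a two-way, single-measures consistency form) and the technical error of measurement (TEM), on simulated repeated digit-length measurements. The simulated values and the specific ICC form are assumptions, since the abstract does not specify them.

    ```python
    import numpy as np

    def tem(x1, x2):
        """Technical error of measurement for two repeated measurements."""
        d = np.asarray(x1) - np.asarray(x2)
        return np.sqrt((d ** 2).sum() / (2 * len(d)))

    def icc_consistency(x1, x2):
        """ICC(3,1)-style two-way, single-measures consistency coefficient."""
        data = np.column_stack([x1, x2]).astype(float)
        n, k = data.shape
        grand = data.mean()
        ss_subjects = k * ((data.mean(axis=1) - grand) ** 2).sum()
        ss_raters = n * ((data.mean(axis=0) - grand) ** 2).sum()
        ss_total = ((data - grand) ** 2).sum()
        ms_subjects = ss_subjects / (n - 1)
        ms_error = (ss_total - ss_subjects - ss_raters) / ((n - 1) * (k - 1))
        return (ms_subjects - ms_error) / (ms_subjects + (k - 1) * ms_error)

    rng = np.random.default_rng(5)
    true_lengths = rng.normal(70, 4, 40)                    # simulated digit lengths in mm
    first = true_lengths + rng.normal(0, 0.5, 40)           # first measurement session
    second = true_lengths + rng.normal(0, 0.5, 40)          # repeat measurement session
    print(f"ICC = {icc_consistency(first, second):.3f}, TEM = {tem(first, second):.2f} mm")
    ```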

  9. Comparison of reproducibility of natural head position using two methods.

    PubMed

    Khan, Abdul Rahim; Rajesh, R N G; Dinesh, M R; Sanjay, N; Girish, K S; Venkataraghavan, Karthik

    2012-01-01

    Lateral cephalometric radiographs have become virtually indispensable to orthodontists in the treatment of patients. They are important in orthodontic growth analysis, diagnosis, treatment planning, monitoring of therapy and evaluation of the final treatment outcome. The purpose of this study was to evaluate and compare the reproducibility and variation of natural head position obtained using two methods, i.e. the mirror method and the fluid level device method. The study included two sets of 40 lateral cephalograms taken using the two methods of obtaining natural head position, (1) the mirror method and (2) the fluid level device method, with a time interval of 2 months. Inclusion criteria: subjects randomly selected, aged between 18 and 26 years. Exclusion criteria: history of orthodontic treatment; any history of respiratory tract problems or chronic mouth breathing; any congenital deformity; history of traumatically induced deformity; history of myofascial pain syndrome; any previous history of head and neck surgery. The results showed that both methods for obtaining natural head position were comparable, but reproducibility was greater with the fluid level device, as shown by Dahlberg's coefficient and the Bland-Altman plot, and variance was lower with the fluid level device method, as shown by precision and the Pearson correlation. In conclusion, the mirror method and the fluid level device method were comparable without any significant difference, and the fluid level device method was more reproducible and showed less variance for obtaining natural head position.

  10. Statistical methods for analysis of radiation effects with tumor and dose location-specific information with application to the WECARE study of asynchronous contralateral breast cancer

    PubMed Central

    Langholz, Bryan; Thomas, Duncan C.; Stovall, Marilyn; Smith, Susan A.; Boice, John D.; Shore, Roy E.; Bernstein, Leslie; Lynch, Charles F.; Zhang, Xinbo; Bernstein, Jonine L.

    2009-01-01

    Summary Methods for the analysis of individually matched case-control studies with location-specific radiation dose and tumor location information are described. These include likelihood methods for analyses that just use cases with precise location of tumor information and methods that also include cases with imprecise tumor location information. The theory establishes that each of these likelihood based methods estimates the same radiation rate ratio parameters, within the context of the appropriate model for location and subject level covariate effects. The underlying assumptions are characterized and the potential strengths and limitations of each method are described. The methods are illustrated and compared using the WECARE study of radiation and asynchronous contralateral breast cancer. PMID:18647297

  11. Radio-nuclide mixture identification using medium energy resolution detectors

    DOEpatents

    Nelson, Karl Einar

    2013-09-17

    According to one embodiment, a method for identifying radio-nuclides includes receiving spectral data, extracting a feature set from the spectral data comparable to a plurality of templates in a template library, and using a branch and bound method to determine a probable template match based on the feature set and templates in the template library. In another embodiment, a device for identifying unknown radio-nuclides includes a processor, a multi-channel analyzer, and a memory operatively coupled to the processor, the memory having computer readable code stored thereon. The computer readable code is configured, when executed by the processor, to receive spectral data, to extract a feature set from the spectral data comparable to a plurality of templates in a template library, and to use a branch and bound method to determine a probable template match based on the feature set and templates in the template library.

  12. Whole-genome multiple displacement amplification from single cells.

    PubMed

    Spits, Claudia; Le Caignec, Cédric; De Rycke, Martine; Van Haute, Lindsey; Van Steirteghem, André; Liebaers, Inge; Sermon, Karen

    2006-01-01

    Multiple displacement amplification (MDA) is a recently described method of whole-genome amplification (WGA) that has proven efficient in the amplification of small amounts of DNA, including DNA from single cells. Compared with PCR-based WGA methods, MDA generates DNA with a higher molecular weight and shows better genome coverage. This protocol was developed for preimplantation genetic diagnosis, and details a method for performing single-cell MDA using the phi29 DNA polymerase. It can also be useful for the amplification of other minute quantities of DNA, such as from forensic material or microdissected tissue. The protocol includes the collection and lysis of single cells, and all materials and steps involved in the MDA reaction. The whole procedure takes 3 h and generates 1-2 microg of DNA from a single cell, which is suitable for multiple downstream applications, such as sequencing, short tandem repeat analysis or array comparative genomic hybridization.

  13. Effects of problem-based learning in Chinese radiology education

    PubMed Central

    Zhang, Song; Xu, Jiancheng; Wang, Hongwei; Zhang, Dong; Zhang, Qichuan; Zou, Liguang

    2018-01-01

    Abstract Background: In recent years, the problem-based learning (PBL) teaching method has been extensively applied as an experimental educational method in Chinese radiology education. However, the results of individual studies were inconsistent and inconclusive. A meta-analysis was performed to evaluate the effects of PBL on radiology education in China. Methods: Databases of Chinese and English languages were searched from inception up to November 2017. The standardized mean difference (SMD) with its 95% confidence interval (95% CI) was used to determine the overall effects of PBL compared with the traditional teaching method. Results: Seventeen studies involving 1487 participants were included in this meta-analysis. Of them, 16 studies provided sufficient data for the pooled analysis and showed that the PBL teaching method had a positive effect on achieving higher theoretical scores compared with the traditional teaching method (SMD = 1.20, 95% CI [0.68, 1.71]). Thirteen studies provided sufficient data on skill scores, and a significant difference in favor of PBL was also observed (SMD = 2.10, 95% CI [1.38, 2.83]). Questionnaire surveys were applied in most of the included studies and indicated positive effects of PBL on students' learning interest, scope of knowledge, team spirit, and oral expression. Conclusion: The results show that PBL appears to be more effective in radiology education than the traditional teaching method in China. However, the heterogeneity of the included studies cannot be neglected. Further well-designed studies on this topic are needed to confirm the above findings. PMID:29489669
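
    The pooled estimates quoted above rest on a standardized mean difference per study combined across studies. The sketch below, with invented group summaries, computes Cohen's d and a simple fixed-effect (inverse-variance) pooled estimate; the included meta-analysis may have used a different estimator (for example a random-effects model), so this is illustrative only.

    ```python
    import math

    def smd(m1, sd1, n1, m2, sd2, n2):
        """Cohen's d with an approximate variance for inverse-variance pooling."""
        pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
        d = (m1 - m2) / pooled_sd
        var = (n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2))
        return d, var

    # invented per-study summaries: (mean_pbl, sd_pbl, n_pbl, mean_trad, sd_trad, n_trad)
    studies = [(82, 6, 40, 75, 7, 40), (88, 5, 55, 80, 6, 50), (79, 8, 30, 73, 8, 32)]
    ds, ws = zip(*[(d, 1 / v) for d, v in (smd(*s) for s in studies)])
    pooled = sum(d * w for d, w in zip(ds, ws)) / sum(ws)
    se = math.sqrt(1 / sum(ws))
    print(f"SMD = {pooled:.2f} (95% CI {pooled - 1.96*se:.2f} to {pooled + 1.96*se:.2f})")
    ```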

  14. Research in disaster settings: a systematic qualitative review of ethical guidelines.

    PubMed

    Mezinska, Signe; Kakuk, Péter; Mijaljica, Goran; Waligóra, Marcin; O'Mathúna, Dónal P

    2016-10-21

    Conducting research during or in the aftermath of disasters poses many specific practical and ethical challenges. This is particularly the case with research involving human subjects. The extraordinary circumstances of research conducted in disaster settings require appropriate regulations to ensure the protection of human participants. The goal of this study is to systematically and qualitatively review the existing ethical guidelines for disaster research by using the constant comparative method (CCM). We performed a systematic qualitative review of disaster research ethics guidelines to collect and compare existing regulations. Guidelines were identified by a three-tiered search strategy: 1) searching databases (PubMed and Google Scholar), 2) an Internet search (Google), and 3) a search of the references in the included documents from the first two searches. We used the constant comparative method (CCM) for analysis of included guidelines. Fourteen full text guidelines were included for analysis. The included guidelines covered the period 2000-2014. Qualitative analysis of the included guidelines revealed two core themes: vulnerability and research ethics committee review. Within each of the two core themes, various categories and subcategories were identified. Some concepts and terms identified in analyzed guidelines are used in an inconsistent manner and applied in different contexts. Conceptual clarity is needed in this area as well as empirical evidence to support the statements and requirements included in analyzed guidelines.

  15. Applications of propensity score methods in observational comparative effectiveness and safety research: where have we come and where should we go?

    PubMed

    Borah, Bijan J; Moriarty, James P; Crown, William H; Doshi, Jalpa A

    2014-01-01

    Propensity score (PS) methods have proliferated in recent years in observational studies in general and in observational comparative effectiveness research (CER) in particular. PS methods are an important set of tools for estimating treatment effects in observational studies, enabling adjustment for measured confounders in an easy-to-understand and transparent way. This article demonstrates how PS methods have been used to address specific CER questions from 2001 through to 2012 by identifying six impactful studies from this period. This article also discusses areas for improvement, including data infrastructure, and a unified set of guidelines in terms of PS implementation and reporting, which will boost confidence in evidence generated through observational CER using PS methods.

  16. 26 CFR 1.482-5 - Comparable profits method.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... operating profit represents a return for the investment of resources and assumption of risks. Therefore... from a sufficient number of years of data to reasonably measure returns that accrue to uncontrolled... party and uncontrolled comparables include the following— (i) Rate of return on capital employed. The...

  17. Core ADHD Symptom Improvement with Atomoxetine versus Methylphenidate: A Direct Comparison Meta-Analysis

    ERIC Educational Resources Information Center

    Hazell, Philip L.; Kohn, Michael R.; Dickson, Ruth; Walton, Richard J.; Granger, Renee E.; van Wyk, Gregory W.

    2011-01-01

    Objective: Previous studies comparing atomoxetine and methylphenidate to treat ADHD symptoms have been equivocal. This noninferiority meta-analysis compared core ADHD symptom response between atomoxetine and methylphenidate in children and adolescents. Method: Selection criteria included randomized, controlled design; duration 6 weeks; and…

  18. Double versus single cervical cerclage for patients with recurrent pregnancy loss: a randomized clinical trial.

    PubMed

    Zolghadri, Jaleh; Younesi, Masoumeh; Asadi, Nasrin; Khosravi, Dezire; Behdin, Shabnam; Tavana, Zohre; Ghaffarpasand, Fariborz

    2014-02-01

    To compare the effectiveness of the double cervical cerclage method versus the single method in women with recurrent second-trimester delivery. In this randomized clinical trial, we included 33 singleton pregnancies suffering from recurrent second-trimester pregnancy loss (≥2 consecutive fetal loss during second-trimester or with a history of unsuccessful procedures utilizing the McDonald method), due to cervical incompetence. Patients were randomly assigned to undergo either the classic McDonald method (n = 14) or the double cerclage method (n = 19). The successful pregnancy rate and gestational age at delivery was also compared between the two groups. The two study groups were comparable regarding their baseline characteristics. The successful pregnancy rate did not differ significantly between those who underwent the double cerclage method or the classic McDonald cerclage method (100% vs 85.7%; P = 0.172). In the same way, the preterm delivery rate (<34 weeks of gestation) was comparable between the two study groups (10.5% vs 35.7%; P = 0.106). Those undergoing the double cerclage method had longer gestational duration (37.2 ± 2.6 vs 34.3 ± 3.8 weeks; P = 0.016). The double cervical cerclage method seems to provide better cervical support, as compared with the classic McDonald cerclage method, in those suffering from recurrent pregnancy loss, due to cervical incompetence. © 2013 The Authors. Journal of Obstetrics and Gynaecology Research © 2013 Japan Society of Obstetrics and Gynecology.

  19. Consensus of recommendations guiding comparative effectiveness research methods.

    PubMed

    Morton, Jacob B; McConeghy, Robert; Heinrich, Kirstin; Gatto, Nicolle M; Caffrey, Aisling R

    2016-12-01

    Because of an increasing demand for quality comparative effectiveness research (CER), methods guidance documents have been published, such as those from the Agency for Healthcare Research and Quality (AHRQ) and the Patient-Centered Outcomes Research Institute (PCORI). Our objective was to identify CER methods guidance documents and compare them to produce a summary of important recommendations that could serve as a consensus of CER method recommendations. We conducted a systematic literature review to identify CER methods guidance documents published through 2014. Identified documents were analyzed for methods guidance recommendations. Individual recommendations were categorized to determine the degree of overlap. We identified nine methods guidance documents, which contained a total of 312 recommendations, 97% of which were present in two or more documents. All nine documents recommended transparency and adaptation for relevant stakeholders in the interpretation and dissemination of results. Other frequently shared CER methods recommendations included: study design and operational definitions should be developed a priori and allow for replication (n = 8 documents); focus on areas with gaps in current clinical knowledge that are relevant to decision-makers (n = 7); validity of measures, instruments, and data should be assessed and discussed (n = 7); outcomes, including benefits and harms, should be clinically meaningful and objectively measured (n = 7). Assessment for and strategies to minimize bias (n = 6 documents), confounding (n = 6), and heterogeneity (n = 4) were also commonly shared recommendations across documents. We offer a field-consensus guide based on nine CER methods guidance documents that will aid researchers in designing CER studies and applying CER methods. Copyright © 2016 John Wiley & Sons, Ltd.

  20. The HIV care cascade: a systematic review of data sources, methodology and comparability.

    PubMed

    Medland, Nicholas A; McMahon, James H; Chow, Eric P F; Elliott, Julian H; Hoy, Jennifer F; Fairley, Christopher K

    2015-01-01

    The cascade of HIV diagnosis, care and treatment (HIV care cascade) is increasingly used to direct and evaluate interventions to increase population antiretroviral therapy (ART) coverage, a key component of treatment as prevention. The ability to compare cascades over time, sub-population, jurisdiction or country is important. However, differences in data sources and methodology used to construct the HIV care cascade might limit its comparability and ultimately its utility. Our aim was to review systematically the different methods used to estimate and report the HIV care cascade and their comparability. A search of published and unpublished literature through March 2015 was conducted. Cascades that reported the continuum of care from diagnosis to virological suppression in a demographically definable population were included. Data sources and methods of measurement or estimation were extracted. We defined the most comparable cascade elements as those that directly measured diagnosis or care from a population-based data set. Thirteen reports were included after screening 1631 records. The undiagnosed HIV-infected population was reported in seven cascades, each of which used different data sets and methods and could not be considered to be comparable. All 13 used mandatory HIV diagnosis notification systems to measure the diagnosed population. Population-based data sets, derived from clinical data or mandatory reporting of CD4 cell counts and viral load tests from all individuals, were used in 6 of 12 cascades reporting linkage, 6 of 13 reporting retention, 3 of 11 reporting ART and 6 of 13 cascades reporting virological suppression. Cascades with access to population-based data sets were able to directly measure cascade elements and are therefore comparable over time, place and sub-population. Other data sources and methods are less comparable. To ensure comparability, countries wishing to accurately measure the cascade should utilize complete population-based data sets derived from clinical data in a centralized healthcare setting, where available, or from mandatory CD4 cell count and viral load test result reporting. Additionally, virological suppression should be presented both as percentage of diagnosed and percentage of estimated total HIV-infected population, until methods to calculate the latter have been standardized.

  1. SU-D-BRB-01: A Comparison of Learning Methods for Knowledge Based Dose Prediction for Coplanar and Non-Coplanar Liver Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tran, A; Ruan, D; Woods, K

    Purpose: The predictive power of knowledge based planning (KBP) has considerable potential in the development of automated treatment planning. Here, we examine the predictive capabilities and accuracy of previously reported KBP methods, as well as an artificial neural network (ANN) method. Furthermore, we compare the predictive accuracy of these methods on coplanar volumetric-modulated arc therapy (VMAT) and non-coplanar 4π radiotherapy. Methods: 30 liver SBRT patients previously treated using coplanar VMAT were selected for this study. The patients were re-planned using 4π radiotherapy, which involves 20 optimally selected non-coplanar IMRT fields. ANNs were used to incorporate enhanced geometric information including liver and PTV size, prescription dose, patient girth, and proximity to beams. The performance of the ANN was compared to three methods from statistical voxel dose learning (SVDL), wherein the doses of voxels sharing the same distance to the PTV are approximated by either taking the median of the distribution, non-parametric fitting, or skew-normal fitting. These three methods were shown to be capable of predicting DVH, but only the median approximation can predict 3D dose. Prediction methods were tested using leave-one-out cross-validation and evaluated using the residual sum of squares (RSS) for DVH and 3D dose predictions. Results: DVH prediction using non-parametric fitting had the lowest average RSS with 0.1176 (4π) and 0.1633 (VMAT), compared to 0.4879 (4π) and 1.8744 (VMAT) for ANN. 3D dose prediction with median approximation had lower RSS with 12.02 (4π) and 29.22 (VMAT), compared to 27.95 (4π) and 130.9 (VMAT) for ANN. Conclusion: Paradoxically, although the ANNs included geometric features in addition to the distances to the PTV, they did not perform better in predicting DVH or 3D dose than the simpler, faster methods based on the distances alone. The study further confirms that the prediction of 4π non-coplanar plans was more accurate than that of VMAT plans. NIH R43CA183390 and R01CA188300.
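
    For readers unfamiliar with the SVDL median-approximation idea mentioned above, a rough sketch follows: training-plan voxel doses are binned by distance to the PTV, the per-bin median is used as the predicted dose at that distance, and accuracy is scored with a residual sum of squares. The binning scheme and function names are assumptions, not the authors' implementation.

```python
# Rough sketch of median-based statistical voxel dose learning (assumed details).
import numpy as np

def fit_median_dose_model(distances, doses, bin_width=2.0):
    """Bin training voxels by distance to the PTV and store the median dose per bin."""
    edges = np.arange(0.0, distances.max() + bin_width, bin_width)
    idx = np.digitize(distances, edges)
    medians = np.array([np.median(doses[idx == i]) if np.any(idx == i) else np.nan
                        for i in range(1, len(edges) + 1)])
    return edges, medians

def predict_dose(distances, edges, medians):
    """Predict each voxel's dose from the median of its distance bin."""
    idx = np.clip(np.digitize(distances, edges) - 1, 0, len(medians) - 1)
    return medians[idx]

def residual_sum_of_squares(predicted, actual):
    return float(np.nansum((predicted - actual) ** 2))
```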

  2. 34 CFR Appendix B to Part 403 - Examples for 34 CFR 403.194-Comparability Requirements

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... TECHNOLOGY EDUCATION PROGRAM Pt. 403, App. B Appendix B to Part 403—Examples for 34 CFR 403.194—Comparability Requirements Methods by which a local educational agency can demonstrate its compliance with the comparability requirements in 34 CFR 403.194(a) include the following: Example 1: The local educational agency files with the...

  3. Maximizing RNA yield from archival renal tumors and optimizing gene expression analysis.

    PubMed

    Glenn, Sean T; Head, Karen L; Teh, Bin T; Gross, Kenneth W; Kim, Hyung L

    2010-01-01

    Formalin-fixed, paraffin-embedded tissues are widely available for gene expression analysis using TaqMan PCR. Five methods, including 4 commercial kits, for recovering RNA from paraffin-embedded renal tumor tissue were compared. The MasterPure kit from Epicentre produced the highest RNA yield. However, the difference in RNA yield between the Epicentre kit and Invitrogen's TRIzol method was not significant. Using the top 3 RNA isolation methods, the manufacturers' protocols were modified to include an overnight Proteinase K digestion. Overnight protein digestion resulted in a significant increase in RNA yield. To optimize the reverse transcription reaction, conventional reverse transcription with random oligonucleotide primers was compared to reverse transcription using primers specific for genes of interest. Reverse transcription using gene-specific primers significantly increased the quantity of cDNA detectable by TaqMan PCR. Therefore, expression profiling of formalin-fixed, paraffin-embedded tissue using TaqMan qPCR can be optimized by using the MasterPure RNA isolation kit modified to include an overnight Proteinase K digestion and gene-specific primers during the reverse transcription.

  4. The opportunities and challenges of large-scale molecular approaches to songbird neurobiology

    PubMed Central

    Mello, C.V.; Clayton, D.F.

    2014-01-01

    High-throughput methods for analyzing genome structure and function are having a large impact in songbird neurobiology. Methods include genome sequencing and annotation, comparative genomics, DNA microarrays and transcriptomics, and the development of a brain atlas of gene expression. Key emerging findings include the identification of complex transcriptional programs active during singing, the robust brain expression of non-coding RNAs, evidence of profound variations in gene expression across brain regions, and the identification of molecular specializations within song production and learning circuits. Current challenges include the statistical analysis of large datasets, effective genome curation, the efficient localization of gene expression changes to specific neuronal circuits and cells, and the dissection of behavioral and environmental factors that influence brain gene expression. The field requires efficient methods for comparisons with organisms like chicken, which offer important anatomical, functional and behavioral contrasts. As sequencing costs plummet, opportunities emerge for comparative approaches that may help reveal evolutionary transitions contributing to vocal learning, social behavior and other properties that make songbirds such compelling research subjects. PMID:25280907

  5. Lexicon generation methods, lexicon generation devices, and lexicon generation articles of manufacture

    DOEpatents

    Carter, Richard J [Richland, WA; McCall, Jonathon D [West Richland, WA; Whitney, Paul D [Richland, WA; Gregory, Michelle L [Richland, WA; Turner, Alan E [Kennewick, WA; Hetzler, Elizabeth G [Kennewick, WA; White, Amanda M [Kennewick, WA; Posse, Christian [Seattle, WA; Nakamura, Grant C [Kennewick, WA

    2010-10-26

    Lexicon generation methods, computer implemented lexicon editing methods, lexicon generation devices, lexicon editors, and articles of manufacture are described according to some aspects. In one aspect, a lexicon generation method includes providing a seed vector indicative of occurrences of a plurality of seed terms within a plurality of text items, providing a plurality of content vectors indicative of occurrences of respective ones of a plurality of content terms within the text items, comparing individual ones of the content vectors with respect to the seed vector, and responsive to the comparing, selecting at least one of the content terms as a term of a lexicon usable in sentiment analysis of text.
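
    The patent abstract does not specify how the vectors are compared; the sketch below uses cosine similarity between term-occurrence vectors purely as an illustration, with hypothetical names and threshold.

```python
# Illustrative comparison of content vectors against a seed vector (assumed metric).
import numpy as np

def select_lexicon_terms(seed_vector, content_vectors, threshold=0.5):
    """content_vectors: dict mapping candidate terms to occurrence vectors over
    the same text items as seed_vector."""
    seed = np.asarray(seed_vector, dtype=float)
    lexicon = []
    for term, vec in content_vectors.items():
        v = np.asarray(vec, dtype=float)
        denom = np.linalg.norm(seed) * np.linalg.norm(v)
        similarity = float(seed @ v / denom) if denom else 0.0
        if similarity >= threshold:
            lexicon.append(term)
    return lexicon

# Example with three text items; the seed terms occur in items 1 and 3:
# select_lexicon_terms([1, 0, 1], {"excellent": [1, 0, 1], "terrible": [0, 1, 0]})
```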

  6. A comparative study on methods of improving SCR for ship detection in SAR image

    NASA Astrophysics Data System (ADS)

    Lang, Haitao; Shi, Hongji; Tao, Yunhong; Ma, Li

    2017-10-01

    Knowledge about ship positions plays a critical role in a wide range of maritime applications. To improve the performance of ship detectors in SAR imagery, an effective strategy is to improve the signal-to-clutter ratio (SCR) before conducting detection. In this paper, we present a comparative study of methods for improving SCR, including power-law scaling (PLS), max-mean and max-median filters (MMF1 and MMF2), the wavelet transform method (TWT), the traditional SPAN detector, the reflection symmetric metric (RSM), and the scattering mechanism metric (SMM). The SCR improvement each method provides and the associated ship detection performance with a cell-averaging CFAR (CA-CFAR) detector are evaluated on two real SAR data sets.

  7. The Effects of Including Observed Means or Latent Means as Covariates in Multilevel Models for Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Aydin, Burak; Leite, Walter L.; Algina, James

    2016-01-01

    We investigated methods of including covariates in two-level models for cluster randomized trials to increase power to detect the treatment effect. We compared multilevel models that included either an observed cluster mean or a latent cluster mean as a covariate, as well as the effect of including Level 1 deviation scores in the model. A Monte…

  8. Cascading pressure reactor and method for solar-thermochemical reactions

    DOEpatents

    Ermanoski, Ivan

    2017-11-14

    Reactors and methods for solar thermochemical reactions are disclosed. The reactors and methods include a cascade of reduction chambers at successively lower pressures that leads to over an order of magnitude pressure decrease compared to a single-chambered design. The resulting efficiency gains are substantial, and represent an important step toward practical and efficient solar fuel production on a large scale.

  9. Comparison of Computed Tomography and Chest Radiography in the Detection of Rib Fractures in Abused Infants

    ERIC Educational Resources Information Center

    Wootton-Gorges, Sandra L.; Stein-Wexler, Rebecca; Walton, John W.; Rosas, Angela J.; Coulter, Kevin P.; Rogers, Kristen K.

    2008-01-01

    Purpose: Chest radiographs (CXR) are the standard method for evaluating rib fractures in abused infants. Computed tomography (CT) is a sensitive method to detect rib fractures. The purpose of this study was to compare CT and CXR in the evaluation of rib fractures in abused infants. Methods: This retrospective study included all 12 abused infants…

  10. Teaching Computer Literacy in an Elementary School: A Comparison of Two Methods Using Microcomputers. Report No. 81:18.

    ERIC Educational Resources Information Center

    Nordman, R.; Parker, J.

    This report compares two methods of teaching BASIC programming used to develop computer literacy among children in grades three through seven in British Columbia. Phase one of the project was designed to instruct children in grades five to seven on the arithmetic operations of writing simple BASIC programs. Instructional methods included using job…

  11. Comparing transformation methods for DNA microarray data

    PubMed Central

    Thygesen, Helene H; Zwinderman, Aeilko H

    2004-01-01

    Background When DNA microarray data are used for gene clustering, genotype/phenotype correlation studies, or tissue classification, the signal intensities are usually transformed and normalized in several steps in order to improve comparability and signal/noise ratio. These steps may include subtraction of an estimated background signal, subtracting the reference signal, smoothing (to account for nonlinear measurement effects), and more. Different authors use different approaches, and it is generally not clear to users which method they should prefer. Results We used the ratio between biological variance and measurement variance (which is an F-like statistic) as a quality measure for transformation methods, and we demonstrate a method for maximizing that variance ratio on real data. We explore a number of transformation issues, including Box-Cox transformation, baseline shift, partial subtraction of the log-reference signal and smoothing. It appears that the optimal choice of parameters for the transformation methods depends on the data. Further, the behavior of the variance ratio, under the null hypothesis of zero biological variance, appears to depend on the choice of parameters. Conclusions The use of replicates in microarray experiments is important. Adjustment for the null-hypothesis behavior of the variance ratio is critical to the selection of transformation method. PMID:15202953

  12. Comparing transformation methods for DNA microarray data.

    PubMed

    Thygesen, Helene H; Zwinderman, Aeilko H

    2004-06-17

    When DNA microarray data are used for gene clustering, genotype/phenotype correlation studies, or tissue classification, the signal intensities are usually transformed and normalized in several steps in order to improve comparability and signal/noise ratio. These steps may include subtraction of an estimated background signal, subtracting the reference signal, smoothing (to account for nonlinear measurement effects), and more. Different authors use different approaches, and it is generally not clear to users which method they should prefer. We used the ratio between biological variance and measurement variance (which is an F-like statistic) as a quality measure for transformation methods, and we demonstrate a method for maximizing that variance ratio on real data. We explore a number of transformation issues, including Box-Cox transformation, baseline shift, partial subtraction of the log-reference signal and smoothing. It appears that the optimal choice of parameters for the transformation methods depends on the data. Further, the behavior of the variance ratio, under the null hypothesis of zero biological variance, appears to depend on the choice of parameters. The use of replicates in microarray experiments is important. Adjustment for the null-hypothesis behavior of the variance ratio is critical to the selection of transformation method.
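
    The variance-ratio criterion described in the two records above lends itself to a short sketch: compute the ratio of biological to measurement variance for a candidate transformation and scan a grid of Box-Cox parameters for the maximum. The data layout (genes by biological samples by technical replicates) and the parameter grid are assumptions, not the authors' code.

```python
# Sketch of the F-like variance-ratio criterion over a Box-Cox parameter grid.
import numpy as np
from scipy import stats

def variance_ratio(transformed):
    # Measurement variance: variance across technical replicates, averaged.
    measurement_var = transformed.var(axis=2, ddof=1).mean()
    # Biological variance: variance of replicate means across biological samples.
    biological_var = transformed.mean(axis=2).var(axis=1, ddof=1).mean()
    return biological_var / measurement_var

def best_boxcox_lambda(data, lambdas=np.linspace(-1.0, 2.0, 31)):
    """data: positive intensities, shape (genes, samples, replicates)."""
    ratios = [variance_ratio(stats.boxcox(data.ravel(), lmbda=lam).reshape(data.shape))
              for lam in lambdas]
    return lambdas[int(np.argmax(ratios))]
```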

  13. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very high level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine-executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. Different approaches to higher-level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods. Although a given approach does not always fall exactly into one class, this paper provides a classification of very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.

  14. Application of reiteration of Hankel singular value decomposition in quality control

    NASA Astrophysics Data System (ADS)

    Staniszewski, Michał; Skorupa, Agnieszka; Boguszewicz, Łukasz; Michalczuk, Agnieszka; Wereszczyński, Kamil; Wicher, Magdalena; Konopka, Marek; Sokół, Maria; Polański, Andrzej

    2017-07-01

    Medical centres are obliged to store past medical records, including the results of quality assurance (QA) tests of the medical equipment, which is especially useful in checking the reproducibility of medical devices and procedures. Analysis of multivariate time series is an important part of quality control of NMR data. In this work we propose an anomaly detection tool based on the Reiteration of Hankel Singular Value Decomposition method. The presented method was compared with external software, and the authors obtained comparable results.
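
    As a generic illustration of the Hankel/SVD idea named above (not the authors' exact algorithm), the sketch below embeds a QA time series into a Hankel matrix, keeps its leading singular components as the "normal" subspace, and scores each window by its reconstruction error.

```python
# Generic Hankel-matrix/SVD anomaly score for a 1-D QA time series (assumed scoring rule).
import numpy as np

def hankel_matrix(series, window):
    """Stack overlapping windows of a 1-D series into a Hankel matrix."""
    n = len(series) - window + 1
    return np.array([series[i:i + window] for i in range(n)])

def anomaly_score(series, window=10, rank=2):
    H = hankel_matrix(np.asarray(series, dtype=float), window)
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    # Low-rank reconstruction from the leading singular components.
    H_low = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
    # Each row's reconstruction error flags windows that deviate from normal behavior.
    return np.linalg.norm(H - H_low, axis=1)
```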

  15. Exhaust gas bypass valve control for thermoelectric generator

    DOEpatents

    Reynolds, Michael G; Yang, Jihui; Meisner, Greogry P.; Stabler, Francis R.; De Bock, Hendrik Pieter Jacobus; Anderson, Todd Alan

    2012-09-04

    A method of controlling engine exhaust flow through at least one of an exhaust bypass and a thermoelectric device via a bypass valve is provided. The method includes: determining a mass flow of exhaust exiting an engine; determining a desired exhaust pressure based on the mass flow of exhaust; comparing the desired exhaust pressure to a determined exhaust pressure; and determining a bypass valve control value based on the comparing, wherein the bypass valve control value is used to control the bypass valve.
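
    The abstract describes a compare-and-control loop without giving numbers; a minimal sketch of that flow, with a hypothetical lookup table, proportional gain, and valve command range, might look like this.

```python
# Hypothetical sketch of the bypass-valve control flow (gains and ranges assumed).
def bypass_valve_command(measured_mass_flow, measured_exhaust_pressure,
                         desired_pressure_lookup, gain=0.1):
    # Desired exhaust pressure is determined from the exhaust mass flow.
    desired_pressure = desired_pressure_lookup(measured_mass_flow)
    # Compare desired and determined exhaust pressure.
    pressure_error = desired_pressure - measured_exhaust_pressure
    # Derive a bypass valve control value from the comparison; the proportional
    # term and its sign convention are assumptions for illustration only.
    command = 0.5 + gain * pressure_error
    # Clamp to an assumed command range (0 = closed, 1 = fully open).
    return max(0.0, min(1.0, command))
```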

  16. [Meta-analysis of needle-knife treatment on cervical spondylosis].

    PubMed

    Kan, Li-Li; Wang, Hai-Dong; Liu, An-Guo

    2013-11-01

    To assess the efficacy of needle-knife treatment for cervical spondylosis based on the relevant randomized controlled trial (RCT) literature, and to compare the advantages of needle-knife treatment. Randomized controlled trials of needle-knife treatment for cervical spondylosis published from 2000 to 2012 were retrieved from the China National Knowledge Infrastructure (CNKI) and Wanfang (WF) databases, and efficacy was analyzed with Review Manager 5.1 software. A total of 13 RCT reports covering 1 419 patients were included. The methodological quality of the included studies was poor: large-sample, multi-center RCTs were lacking; randomization methods were not described precisely; diagnostic criteria and efficacy evaluations varied; only four studies described long-term efficacy; most reports did not describe adverse events or drop-outs; and no study used blinding. The meta-analysis showed that the overall efficiency of needle-knife therapy was better than that of acupuncture and traction. Compared with acupuncture, the pooled RR = 0.19, 95% confidence interval (0.15, 0.24), P < 0.00001; compared with traction therapy, the pooled RR = 1.30, 95% confidence interval (1.18, 1.42), P < 0.00001. Compared with acupuncture and traction therapy, needle-knife therapy showed higher overall effectiveness, but because of the limited total sample size, this conclusion requires confirmation by further high-quality RCTs.

  17. Renewable energy delivery systems and methods

    DOEpatents

    Walker, Howard Andrew

    2013-12-10

    A system, method and/or apparatus for the delivery of energy at a site, at least a portion of the energy being delivered by at least one or more of a plurality of renewable energy technologies, the system and method including calculating the load required by the site for the period; calculating the amount of renewable energy for the period, including obtaining a capacity and a percentage of the period for the renewable energy to be delivered; comparing the total load to the renewable energy available; and, implementing one or both of additional and alternative renewable energy sources for delivery of energy to the site.
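
    As a toy illustration of the comparison step described above (not the patented algorithm), the snippet below tallies the renewable energy deliverable in a period from each technology's capacity and availability fraction and compares it with the site load.

```python
# Toy comparison of site load versus deliverable renewable energy (assumed inputs).
def renewable_shortfall(site_load_kwh, technologies, hours_in_period):
    """technologies: list of (capacity_kw, fraction_of_period_available) tuples."""
    delivered_kwh = sum(capacity_kw * fraction * hours_in_period
                        for capacity_kw, fraction in technologies)
    # A positive result indicates additional or alternative sources are needed.
    return site_load_kwh - delivered_kwh

# Example: a 50 kW PV array available 25% of a 720-hour month plus a 10 kW wind
# unit available 40% of the month, against a 12,000 kWh monthly load.
# shortfall = renewable_shortfall(12000, [(50, 0.25), (10, 0.40)], 720)
```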

  18. Interactive Video Usage on Autism Spectrum Disorder Training in Medical Education

    ERIC Educational Resources Information Center

    Taslibeyaz, Elif; Dursun, Onur Burak; Karaman, Selcuk

    2017-01-01

    This study aimed to compare the effects of interactive and non-interactive videos concerning the autism spectrum disorder on medical students' achievement. It also evaluated the relation between the interactive videos' interactivity and the students' decision-making process. It used multiple methods, including quantitative and qualitative methods.…

  19. Reflectance spectroscopy: quantitative analysis techniques for remote sensing applications.

    USGS Publications Warehouse

    Clark, R.N.; Roush, T.L.

    1984-01-01

    Several methods for the analysis of remotely sensed reflectance data are compared, including empirical methods and scattering theories, both of which are important for solving remote sensing problems. The concept of the photon mean path length and the implications for use in modeling reflectance spectra are presented.-from Authors

  20. COMPARATIVE PERFORMANCE OF SIX DIFFERENT BENTHIC MACROINVERTEBRATE SAMPLING METHODS FOR RIVERINE ECOSYSTEMS

    EPA Science Inventory

    At each of 60 sites, we collected benthic macroinvertebrates using six different protocols (including the EMAP methods for non-wadeable rivers) and physical habitat data using the USEPA-EMAP-SW protocols for non-wadeable rivers. We used PCA with physical habitat data and DCA wit...

  1. College Quality and Early Adult Outcomes

    ERIC Educational Resources Information Center

    Long, Mark C.

    2008-01-01

    This paper estimates the effects of various college qualities on several early adult outcomes, using panel data from the National Education Longitudinal Study. I compare the results using ordinary least squares with three alternative methods of estimation, including instrumental variables, and the methods used by Dale and Krueger [(2002).…

  2. First Language Composition Pedagogy in the Second Language Classroom: A Reassesment.

    ERIC Educational Resources Information Center

    Ross, Steven; And Others

    1988-01-01

    Evaluated the effectiveness of using native language (Japanese) based writing methods in English as a second language (ESL) classrooms. The methods compared included sentence combining and structural grammar instruction with journal writing, controlled composition writing with feedback on surface error, and peer reformulation. Journal writing, but…

  3. E-Commerce New Venture Performance: How Funding Impacts Culture.

    ERIC Educational Resources Information Center

    Hamilton, R. H.

    2001-01-01

    Explores the three primary methods of funding for e-commerce startups and the impact that funding criteria have had on the resulting organizational cultures. Highlights include self-funded firms; venture capital funding; corporate funding; and a table that compares the three types, including examples. (LRW)

  4. A comparative uncertainty study of the calibration of macrolide antibiotic reference standards using quantitative nuclear magnetic resonance and mass balance methods.

    PubMed

    Liu, Shu-Yu; Hu, Chang-Qin

    2007-10-17

    This study introduces a general method of quantitative nuclear magnetic resonance (qNMR) for the calibration of reference standards of macrolide antibiotics. Several qNMR experimental conditions were optimized, including the delay, an important parameter for quantification. Three kinds of macrolide antibiotics were used to validate the accuracy of the qNMR method by comparison with results obtained by high performance liquid chromatography (HPLC). The purities of five common reference standards of macrolide antibiotics were measured by the 1H qNMR method and by the mass balance method, and the results of the two methods were compared. qNMR is quick and simple to use. In new drug research and development, qNMR provides a new and reliable method for purity analysis of reference standards.

  5. Remote sensing change detection methods to track deforestation and growth in threatened rainforests in Madre de Dios, Peru

    USGS Publications Warehouse

    Shermeyer, Jacob S.; Haack, Barry N.

    2015-01-01

    Two forestry-change detection methods are described, compared, and contrasted for estimating deforestation and growth in threatened forests in southern Peru from 2000 to 2010. The methods used in this study rely on freely available data, including atmospherically corrected Landsat 5 Thematic Mapper and Moderate Resolution Imaging Spectroradiometer (MODIS) vegetation continuous fields (VCF). The two methods include a conventional supervised signature extraction method and a unique self-calibrating method called MODIS VCF guided forest/nonforest (FNF) masking. The process chain for each of these methods includes a threshold classification of MODIS VCF, training data or signature extraction, signature evaluation, k-nearest neighbor classification, analyst-guided reclassification, and postclassification image differencing to generate forest change maps. Comparisons of all methods were based on an accuracy assessment using 500 validation pixels. Results of this accuracy assessment indicate that FNF masking had a 5% higher overall accuracy and was superior to conventional supervised classification when estimating forest change. Both methods succeeded in classifying persistently forested and nonforested areas, and both had limitations when classifying forest change.

  6. Biases and Power for Groups Comparison on Subjective Health Measurements

    PubMed Central

    Hamel, Jean-François; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Roquelaure, Yves; Sébille, Véronique

    2012-01-01

    Subjective health measurements are increasingly used in clinical research, particularly for patient group comparisons. Two main types of analytical strategies can be used for such data: so-called classical test theory (CTT), which relies on observed scores, and models from Item Response Theory (IRT), which rely on a response model relating item responses to a latent parameter, often called the latent trait. Whether IRT or CTT would be the most appropriate method to compare two independent groups of patients on a patient-reported outcomes measurement remains unknown and was investigated using simulations. For CTT-based analyses, group comparison was performed using a t-test on the scores. For IRT-based analyses, several methods were compared, according to whether the Rasch model was considered with random effects or with fixed effects, and whether the group effect was included as a covariate or not. Individual latent trait values were estimated using either a deterministic method or stochastic approaches. Latent traits were then compared with a t-test. Finally, a two-step method was performed to compare the latent trait distributions, and a Wald test was performed to test the group effect in the Rasch model including group covariates. The only unbiased IRT-based method was the group covariate Wald test, performed on the random effects Rasch model. This model displayed the highest observed power, which was similar to the power using the score t-test. These results need to be extended to the case frequently encountered in practice where data are missing and possibly informative. PMID:23115620

  7. A modeling approach to compare ΣPCB concentrations between congener-specific analyses

    USGS Publications Warehouse

    Gibson, Polly P.; Mills, Marc A.; Kraus, Johanna M.; Walters, David M.

    2017-01-01

    Changes in analytical methods over time pose problems for assessing long-term trends in environmental contamination by polychlorinated biphenyls (PCBs). Congener-specific analyses vary widely in the number and identity of the 209 distinct PCB chemical configurations (congeners) that are quantified, leading to inconsistencies among summed PCB concentrations (ΣPCB) reported by different studies. Here we present a modeling approach using linear regression to compare ΣPCB concentrations derived from different congener-specific analyses measuring different co-eluting groups. The approach can be used to develop a specific conversion model between any two sets of congener-specific analytical data from similar samples (similar matrix and geographic origin). We demonstrate the method by developing a conversion model for an example data set that includes data from two different analytical methods, a low resolution method quantifying 119 congeners and a high resolution method quantifying all 209 congeners. We used the model to show that the 119-congener set captured most (93%) of the total PCB concentration (i.e., Σ209PCB) in sediment and biological samples. ΣPCB concentrations estimated using the model closely matched measured values (mean relative percent difference = 9.6). General applications of the modeling approach include (a) generating comparable ΣPCB concentrations for samples that were analyzed for different congener sets; and (b) estimating the proportional contribution of different congener sets to ΣPCB. This approach may be especially valuable for enabling comparison of long-term remediation monitoring results even as analytical methods change over time. 
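
    The conversion model described above is an ordinary linear regression between paired ΣPCB values; a minimal sketch, with hypothetical variable names for the 119-congener and 209-congener sums, is shown below.

```python
# Minimal sketch of a congener-set conversion model fit by least squares.
import numpy as np

def fit_conversion_model(sum_pcb_119, sum_pcb_209):
    """Fit sum_pcb_209 ~ intercept + slope * sum_pcb_119 on paired samples."""
    slope, intercept = np.polyfit(np.asarray(sum_pcb_119, dtype=float),
                                  np.asarray(sum_pcb_209, dtype=float), deg=1)
    return slope, intercept

def convert_to_209(sum_pcb_119, slope, intercept):
    """Estimate the 209-congener total from a 119-congener measurement."""
    return slope * np.asarray(sum_pcb_119, dtype=float) + intercept
```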

  8. Extension of a hybrid particle-continuum method for a mixture of chemical species

    NASA Astrophysics Data System (ADS)

    Verhoff, Ashley M.; Boyd, Iain D.

    2012-11-01

    Due to the physical accuracy and numerical efficiency achieved by analyzing transitional, hypersonic flow fields with hybrid particle-continuum methods, this paper describes a Modular Particle-Continuum (MPC) method and its extension to include multiple chemical species. Considerations that are specific to a hybrid approach for simulating gas mixtures are addressed, including a discussion of the Chapman-Enskog velocity distribution function (VDF) for near-equilibrium flows, and consistent viscosity models for the individual CFD and DSMC modules of the MPC method. Representative results for a hypersonic blunt-body flow are then presented, where the flow field properties, surface properties, and computational performance are compared for simulations employing full CFD, full DSMC, and the MPC method.

  9. Including mixed methods research in systematic reviews: Examples from qualitative syntheses in TB and malaria control

    PubMed Central

    2012-01-01

    Background Health policy makers now have access to a greater number and variety of systematic reviews to inform different stages in the policy making process, including reviews of qualitative research. The inclusion of mixed methods studies in systematic reviews is increasing, but these studies pose particular challenges to methods of review. This article examines the quality of the reporting of mixed methods and qualitative-only studies. Methods We used two completed systematic reviews to generate a sample of qualitative studies and mixed method studies in order to make an assessment of how the quality of reporting and rigor of qualitative-only studies compares with that of mixed-methods studies. Results Overall, the reporting of qualitative studies in our sample was consistently better when compared with the reporting of mixed methods studies. We found that mixed methods studies are less likely to provide a description of the research conduct or qualitative data analysis procedures and less likely to be judged credible or provide rich data and thick description compared with standalone qualitative studies. Our time-related analysis shows that for both types of study, papers published since 2003 are more likely to report on the study context, describe analysis procedures, and be judged credible and provide rich data. However, the reporting of other aspects of research conduct (i.e. descriptions of the research question, the sampling strategy, and data collection methods) in mixed methods studies does not appear to have improved over time. Conclusions Mixed methods research makes an important contribution to health research in general, and could make a more substantial contribution to systematic reviews. Through our careful analysis of the quality of reporting of mixed methods and qualitative-only research, we have identified areas that deserve more attention in the conduct and reporting of mixed methods research. PMID:22545681

  10. The Wisconsin immunization registry experience: comparing real-time and batched file submissions from health care providers.

    PubMed

    Schauer, Stephanie L; Maerz, Thomas R; Verdon, Matthew J; Hopfensperger, Daniel J; Davis, Jeffrey P

    2014-06-01

    The Wisconsin Immunization Registry is a confidential, web-based system used since 1999 as a centralized repository of immunization information for Wisconsin residents. The objective was to provide evidence, based on Registry experience with electronic data exchange, comparing the benefits and drawbacks of the Health Level 7 standard, including the option for real-time data exchange, versus the flat file method. For data regarding vaccinations received by children aged 4 months through 6 years with Wisconsin addresses that were submitted to the Registry during 2010 and 2011, data timeliness (days from vaccine administration to date information was received) and completeness (percentage of records received that include core data elements for electronic storage) were compared by file submission method. Data submitted using Health Level 7 were substantially more timely than data submitted using the flat file method. Additionally, data submitted using Health Level 7 were substantially more complete for each of the core elements compared to flat file submission. Health care organizations that submit electronic data to immunization information systems should be aware that the technical decision to use the Health Level 7 format, particularly if real-time data exchange is employed, can result in more timely and accurate data. This will assist clinicians in adhering to the Advisory Committee on Immunization Practices schedule and reducing over-immunization.

  11. Three-dimensional arbitrary voxel shapes in spectroscopy with submillisecond TEs.

    PubMed

    Snyder, Jeff; Haas, Martin; Dragonu, Iulius; Hennig, Jürgen; Zaitsev, Maxim

    2012-08-01

    A novel spectroscopic method for submillisecond TEs and three-dimensional arbitrarily shaped voxels was developed and applied to phantom and in vivo measurements, with additional parallel excitation (PEX) implementation. A segmented spherical shell excitation trajectory was used in combination with appropriate radiofrequency weights for target selection in three dimensions. Measurements in a two-compartment phantom realized a TE of 955 µs, excellent spectral quality and comparable signal-to-noise ratios between accelerated (R = 2) and nonaccelerated modes. The two-compartment model allowed a comparison of the spectral suppression qualities of the method and, although outer volume signals were suppressed by factors of 1434 and 2246 compared with the theoretical unsuppressed case for the clinical and PEX modes, respectively, incomplete suppression of the outer volume (935 cm(3) compared with a target volume of 5.86 cm(3) ) resulted in a spectral contamination of 10.2% and 6.5% compared with the total signal. The method was also demonstrated in vivo in human brain on a clinical system at TE = 935 µs with good signal-to-noise ratio and spatial and spectral selection, and included LCModel relative quantification analysis. Eight metabolites showed significant fitting accuracy, including aspartate, N-acetylaspartylglutamate, glutathione and glutamate. Copyright © 2012 John Wiley & Sons, Ltd.

  12. Mean centering of ratio spectra and concentration augmented classical least squares in a comparative approach for quantitation of spectrally overlapped bands of antihypertensives in formulations

    NASA Astrophysics Data System (ADS)

    Hegazy, Maha Abdel Monem; Fayez, Yasmin Mohammed

    2015-04-01

    Two different methods manipulating spectrophotometric data have been developed, validated and compared. One is capable of removing the signal of any interfering components at the selected wavelength of the component of interest (univariate). The other includes more variables and extracts maximum information to determine the component of interest in the presence of other components (multivariate). The applied methods are smart, simple, accurate, sensitive, precise and capable of determining spectrally overlapped antihypertensives: hydrochlorothiazide (HCT), irbesartan (IRB) and candesartan (CAN). Mean centering of ratio spectra (MCR) and the concentration residual augmented classical least-squares method (CRACLS) were developed and their efficiency was compared. CRACLS is a simple method that is capable of extracting the pure spectral profiles of each component in a mixture. Correlation was calculated between the estimated and pure spectra and was found to be 0.9998, 0.9987 and 0.9992 for HCT, IRB and CAN, respectively. The methods successfully determined the three components in bulk powder, laboratory-prepared mixtures, and combined dosage forms. The results obtained were compared statistically with each other and to those of the official methods.

  13. Comparison of family-planning service quality reported by adolescents and young adult women in Mexico.

    PubMed

    Darney, Blair G; Saavedra-Avendano, Biani; Sosa-Rubi, Sandra G; Lozano, Rafael; Rodriguez, Maria I

    2016-07-01

    Associations between age and patient-reported quality of family planning services were examined among young women in Mexico. A repeated cross-sectional analysis of survey data collected in 2006, 2009, and 2014 was performed. Data from women aged 15-29 years who had not undergone sterilization and were currently using a modern contraceptive method were included. The primary outcome was high-quality care, defined as positive responses to all five quality items regarding contraceptive services included in the survey. Multivariable logistic regression and marginal probabilities were used to compare adolescents and women aged 20-29 years. The responses of respondents using different contraceptive methods were compared. Data were included from 15 835 individuals. The multivariable analysis demonstrated lower odds of reporting high-quality care among women aged 15-19 years (odds ratio 0.73; 95% confidence interval 0.60-0.88) and 20-24 years (odds ratio 0.85; 95% confidence interval 0.75-0.96) compared with women aged 25-29 years. Adolescents using hormonal and long-acting reversible contraception had significantly lower odds of reporting high-quality care compared with women aged 25-29 years. Adolescents in Mexico reported a lower quality of family planning services compared with young adult women. Continued research and policies are needed to improve the quality of contraceptive services. Copyright © 2016 International Federation of Gynecology and Obstetrics. Published by Elsevier Ireland Ltd. All rights reserved.

  14. Bayesian data analysis in observational comparative effectiveness research: rationale and examples.

    PubMed

    Olson, William H; Crivera, Concetta; Ma, Yi-Wen; Panish, Jessica; Mao, Lian; Lynch, Scott M

    2013-11-01

    Many comparative effectiveness research and patient-centered outcomes research studies will need to be observational for one or both of two reasons: first, randomized trials are expensive and time-consuming; and second, only observational studies can answer some research questions. It is generally recognized that there is a need to increase the scientific validity and efficiency of observational studies. Bayesian methods for the design and analysis of observational studies are scientifically valid and offer many advantages over frequentist methods, including, importantly, the ability to conduct comparative effectiveness research/patient-centered outcomes research more efficiently. Bayesian data analysis is being introduced into outcomes studies that we are conducting. Our purpose here is to describe our view of some of the advantages of Bayesian methods for observational studies and to illustrate both realized and potential advantages by describing studies we are conducting in which various Bayesian methods have been or could be implemented.

  15. Diagnostic for two-mode variable valve activation device

    DOEpatents

    Fedewa, Andrew M

    2014-01-07

    A method is provided for diagnosing a multi-mode valve train device which selectively provides high lift and low lift to a combustion valve of an internal combustion engine having a camshaft phaser actuated by an electric motor. The method includes applying a variable electric current to the electric motor to achieve a desired camshaft phaser operational mode and commanding the multi-mode valve train device to a desired valve train device operational mode selected from a high lift mode and a low lift mode. The method also includes monitoring the variable electric current and calculating a first characteristic of the parameter. The method also includes comparing the calculated first characteristic against a predetermined value of the first characteristic measured when the multi-mode valve train device is known to be in the desired valve train device operational mode.
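
    A toy version of the comparison step in the abstract might monitor the motor current, reduce it to a single characteristic (here, a mean over a sample window), and check it against a reference value recorded when the device was known to be in the commanded mode; the reference values and tolerance below are hypothetical.

```python
# Hypothetical diagnostic check for the commanded valve train mode.
import statistics

REFERENCE_CURRENT_A = {"high_lift": 4.2, "low_lift": 1.8}  # assumed reference values

def valve_train_mode_ok(current_samples, commanded_mode, tolerance_a=0.5):
    # First characteristic of the monitored current: its mean over the window.
    characteristic = statistics.mean(current_samples)
    expected = REFERENCE_CURRENT_A[commanded_mode]
    # Compare against the value measured when the device was known to be in
    # the commanded operational mode.
    return abs(characteristic - expected) <= tolerance_a
```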

  16. Accounting for Scale Heterogeneity in Healthcare-Related Discrete Choice Experiments when Comparing Stated Preferences: A Systematic Review.

    PubMed

    Wright, Stuart J; Vass, Caroline M; Sim, Gene; Burton, Michael; Fiebig, Denzil G; Payne, Katherine

    2018-02-28

    Scale heterogeneity, or differences in the error variance of choices, may account for a significant amount of the observed variation in the results of discrete choice experiments (DCEs) when comparing preferences between different groups of respondents. The aim of this study was to identify if, and how, scale heterogeneity has been addressed in healthcare DCEs that compare the preferences of different groups. A systematic review identified all healthcare DCEs published between 1990 and February 2016. The full text of each DCE was then screened to identify studies that compared preferences using data generated from multiple groups. Data were extracted and tabulated on year of publication, samples compared, tests for scale heterogeneity, and analytical methods to account for scale heterogeneity. Narrative analysis was used to describe if, and how, scale heterogeneity was accounted for when preferences were compared. A total of 626 healthcare DCEs were identified. Of these, 199 (32%) aimed to compare the preferences of different groups specified at the design stage, while 79 (13%) compared the preferences of groups identified at the analysis stage. Of the 278 included papers, 49 (18%) discussed potential scale issues, 18 (7%) used a formal method of analysis to account for scale between groups, and 2 (1%) accounted for scale differences between preference groups at the analysis stage. Scale heterogeneity was present in 65% (n = 13) of studies that tested for it. Analytical methods to test for scale heterogeneity included coefficient plots (n = 5, 2%), heteroscedastic conditional logit models (n = 6, 2%), Swait and Louviere tests (n = 4, 1%), generalised multinomial logit models (n = 5, 2%), and scale-adjusted latent class analysis (n = 2, 1%). Scale heterogeneity is a prevalent issue in healthcare DCEs. Despite this, few published DCEs have discussed such issues, and fewer still have used formal methods to identify and account for the impact of scale heterogeneity. Formal tests for scale heterogeneity should be used; otherwise, DCE results risk producing biased and potentially misleading conclusions regarding preferences for aspects of healthcare.

  17. Comparison of femur tunnel aperture location in patients undergoing transtibial and anatomical single-bundle anterior cruciate ligament reconstruction.

    PubMed

    Lee, Dae-Hee; Kim, Hyun-Jung; Ahn, Hyeong-Sik; Bin, Seong-Il

    2016-12-01

    Although three-dimensional computed tomography (3D-CT) has been used to compare femoral tunnel position following transtibial and anatomical anterior cruciate ligament (ACL) reconstruction, no consensus has been reached on which technique results in a more anatomical position, because methods of quantifying femoral tunnel position on 3D-CT have not been consistent. This meta-analysis was therefore performed to compare femoral tunnel location following transtibial and anatomical ACL reconstruction, in both the low-to-high and deep-to-shallow directions. The meta-analysis included all studies that used 3D-CT to compare femoral tunnel location, using quadrant or anatomical coordinate axis methods, following transtibial and anatomical (AM portal or OI) single-bundle ACL reconstruction. Six studies were included. When measured using the anatomical coordinate axis method, femoral tunnel location was 18% higher in the low-to-high direction with the transtibial technique than with the anatomical methods, with no significant difference in the deep-to-shallow direction. When measured using the quadrant method, however, femoral tunnel positions were significantly higher (21%) and shallower (6%) with the transtibial method than with the anatomical methods of ACL reconstruction. The anatomical ACL reconstruction techniques led to a lower femoral tunnel aperture location than the transtibial technique, suggesting that anatomical techniques are superior for placing the femoral tunnel aperture in the low-to-high direction when creating new femoral tunnels during revision ACL reconstruction. However, the mean difference in the deep-to-shallow direction differed by method of measurement. Meta-analysis, Level II.

  18. Production and use of metals and oxygen for lunar propulsion

    NASA Technical Reports Server (NTRS)

    Hepp, Aloysius F.; Linne, Diane L.; Landis, Geoffrey A.; Groth, Mary F.; Colvin, James E.

    1991-01-01

    Production, power, and propulsion technologies for using oxygen and metals derived from lunar resources are discussed. The production process is described, and several of the more developed processes are discussed. Power requirements for chemical, thermal, and electrical production methods are compared. The discussion includes the potential impact of ongoing power technology programs on lunar production requirements. The performance potential of several possible metal fuels, including aluminum, silicon, iron, and titanium, is compared. Space propulsion technology in the area of metal/oxygen rocket engines is discussed.

  19. Computation of Pressurized Gas Bearings Using CE/SE Method

    NASA Technical Reports Server (NTRS)

    Cioc, Sorin; Dimofte, Florin; Keith, Theo G., Jr.; Fleming, David P.

    2003-01-01

    The space-time conservation element and solution element (CE/SE) method is extended to compute compressible viscous flows in pressurized thin fluid films. This numerical scheme has previously been used successfully to solve a wide variety of compressible flow problems, including flows with large and small discontinuities. In this paper, the method is applied to calculate the pressure distribution in a hybrid gas journal bearing. The formulation of the problem is presented, including the modeling of the feeding system. The numerical results obtained are compared with experimental data. Good agreement between the computed results and the test data was obtained, thus validating the CE/SE method for solving such problems.

  20. Experimental Methods for Protein Interaction Identification and Characterization

    NASA Astrophysics Data System (ADS)

    Uetz, Peter; Titz, Björn; Cagney, Gerard

    There are dozens of methods for the detection of protein-protein interactions but they fall into a few broad categories. Fragment complementation assays such as the yeast two-hybrid (Y2H) system are based on split proteins that are functionally reconstituted by fusions of interacting proteins. Biophysical methods include structure determination and mass spectrometric (MS) identification of proteins in complexes. Biochemical methods include methods such as far western blotting and peptide arrays. Only the Y2H and protein complex purification combined with MS have been used on a larger scale. Due to the lack of data it is still difficult to compare these methods with respect to their efficiency and error rates. Current data does not favor any particular method and thus multiple experimental approaches are necessary to maximally cover the interactome of any target cell or organism.

  1. Full-Potential Modeling of Blade-Vortex Interactions. Degree awarded by George Washington Univ., Feb. 1987

    NASA Technical Reports Server (NTRS)

    Jones, Henry E.

    1997-01-01

    A study of the full-potential modeling of a blade-vortex interaction was made. A primary goal of this study was to investigate the effectiveness of the various methods of modeling the vortex. The model problem restricts the interaction to that of an infinite wing with an infinite line vortex moving parallel to its leading edge. This problem provides a convenient testing ground for the various methods of modeling the vortex while retaining the essential physics of the full three-dimensional interaction. A full-potential algorithm specifically tailored to solve the blade-vortex interaction (BVI) was developed to solve this problem. The basic algorithm was modified to include the effect of a vortex passing near the airfoil. Four different methods of modeling the vortex were used: (1) the angle-of-attack method, (2) the lifting-surface method, (3) the branch-cut method, and (4) the split-potential method. A side-by-side comparison of the four models was conducted. These comparisons included comparing generated velocity fields, a subcritical interaction, and a critical interaction. The subcritical and critical interactions are compared with experimentally generated results. The split-potential model was used to make a survey of some of the more critical parameters which affect the BVI.

  2. Methylxanthines: properties and determination in various objects

    NASA Astrophysics Data System (ADS)

    Andreeva, Elena Yu; Dmitrienko, Stanislava G.; Zolotov, Yurii A.

    2012-05-01

    Published data on the properties and determination of caffeine, theophylline, theobromine and some other methylxanthines in various objects are surveyed and described systematically. Different sample preparation procedures such as liquid extraction from solid matrices and liquid-liquid, supercritical fluid and solid-phase extraction are compared. The key methods of analysis including chromatography, electrophoresis, spectrometry and electrochemical methods are discussed. Examples of methylxanthine determination in plants, food products, energy beverages, pharmaceuticals, biological fluids and natural and waste waters are given. The bibliography includes 393 references.

  3. Method for making a photodetector with enhanced light absorption

    DOEpatents

    Kane, James

    1987-05-05

    A photodetector including a light transmissive electrically conducting layer having a textured surface with a semiconductor body thereon. This layer traps incident light thereby enhancing the absorption of light by the semiconductor body. A photodetector comprising a textured light transmissive electrically conducting layer of SnO.sub.2 and a body of hydrogenated amorphous silicon has a conversion efficiency about fifty percent greater than that of comparative cells. The invention also includes a method of fabricating the photodetector of the invention.

  4. Method and apparatus for identifying, locating and quantifying physical phenomena and structure including same

    DOEpatents

    Richardson, John G.

    2006-01-24

    A method and system for detecting, locating and quantifying a physical phenomena such as strain or a deformation in a structure. A minimum resolvable distance along the structure is selected and a quantity of laterally adjacent conductors is determined. Each conductor includes a plurality of segments coupled in series which define the minimum resolvable distance along the structure. When a deformation occurs, changes in the defined energy transmission characteristics along each conductor are compared to determine which segment contains the deformation.

  5. Applicability of linearized-theory attached-flow methods to design and analysis of flap systems at low speeds for thin swept wings with sharp leading edges

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Darden, Christine M.

    1987-01-01

    Low-speed experimental force data on a series of thin swept wings with sharp leading edges and leading- and trailing-edge flaps are compared with predictions made using a linearized-theory method which includes estimates of vortex forces. These comparisons were made to assess the effectiveness of linearized-theory methods for use in the design and analysis of flap systems in subsonic flow. Results demonstrate that linearized-theory, attached-flow methods (with approximate representation of vortex forces) can form the basis of a rational system for flap design and analysis. Even attached-flow methods that do not take vortex forces into account can be used for the selection of optimized flap-system geometry, but design-point performance levels tend to be underestimated unless vortex forces are included. Illustrative examples of the use of these methods in the design of efficient low-speed flap systems are included.

  6. Motivations for Sex among Low-Income African American Young Women

    ERIC Educational Resources Information Center

    Deardorff, Julianna; Suleiman, Ahna Ballonoff; Dal Santo, Teresa S.; Flythe, Michelle; Gurdin, J. Barry; Eyre, Stephen L.

    2013-01-01

    African American young women exhibit higher risk for sexually transmitted infections, including HIV/AIDS, compared with European American women, and this is particularly true for African American women living in low-income contexts. We used rigorous qualitative methods, that is, domain analysis, including free listing ("n" = 20),…

  7. From comparative effectiveness research to patient-centered outcomes research: integrating emergency care goals, methods, and priorities.

    PubMed

    Meisel, Zachary F; Carr, Brendan G; Conway, Patrick H

    2012-09-01

    Federal legislation placed comparative effectiveness research and patient-centered outcomes research at the center of current and future national investments in health care research. The role of this research in emergency care has not been well described. This article proposes an agenda for researchers and health care providers to consider comparative effectiveness research and patient-centered outcomes research methods and results to improve the care for patients who seek, use, and require emergency care. This objective will be accomplished by (1) exploring the definitions, frameworks, and nomenclature for comparative effectiveness research and patient-centered outcomes research; (2) describing a conceptual model for comparative effectiveness research in emergency care; (3) identifying specific opportunities and examples of emergency care-related comparative effectiveness research; and (4) categorizing current and planned funding for comparative effectiveness research and patient-centered outcomes research that can include emergency care delivery. Copyright © 2012. Published by Mosby, Inc.

  8. Wireless device monitoring methods, wireless device monitoring systems, and articles of manufacture

    DOEpatents

    McCown, Steven H [Rigby, ID; Derr, Kurt W [Idaho Falls, ID; Rohde, Kenneth W [Idaho Falls, ID

    2012-05-08

    Wireless device monitoring methods, wireless device monitoring systems, and articles of manufacture are described. According to one embodiment, a wireless device monitoring method includes accessing device configuration information of a wireless device present at a secure area, wherein the device configuration information comprises information regarding a configuration of the wireless device, accessing stored information corresponding to the wireless device, wherein the stored information comprises information regarding the configuration of the wireless device, comparing the device configuration information with the stored information, and indicating the wireless device as one of authorized and unauthorized for presence at the secure area using the comparing.
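
    A minimal Python sketch of the comparison step the claim describes: an observed device configuration is checked field by field against the stored record, and the device is flagged as authorized or unauthorized. The field names and values are hypothetical illustrations, not taken from the patent.

      # Sketch of the configuration-comparison step; "mac", "firmware" and
      # "radio_mode" are hypothetical field names for illustration only.
      def check_device(observed_config: dict, stored_config: dict) -> str:
          """Flag a device as authorized or unauthorized by comparing its
          observed configuration with the stored record for the secure area."""
          mismatched = {
              key for key in stored_config
              if observed_config.get(key) != stored_config[key]
          }
          return "authorized" if not mismatched else "unauthorized"

      observed = {"mac": "00:1A:7D:DA:71:13", "firmware": "2.4.1", "radio_mode": "bt-le"}
      stored = {"mac": "00:1A:7D:DA:71:13", "firmware": "2.4.0", "radio_mode": "bt-le"}
      print(check_device(observed, stored))  # -> "unauthorized" (firmware differs)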

  9. An improved high-throughput lipid extraction method for the analysis of human brain lipids.

    PubMed

    Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett

    2013-03-01

    We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenizing. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.

  10. Assessing signatures of selection through variation in linkage disequilibrium between taurine and indicine cattle

    PubMed Central

    2014-01-01

    Background Signatures of selection are regions in the genome that have been preferentially increased in frequency and fixed in a population because of their functional importance in specific processes. These regions can be detected because of their lower genetic variability and specific regional linkage disequilibrium (LD) patterns. Methods By comparing the differences in regional LD variation between dairy and beef cattle types, and between indicine and taurine subspecies, we aim at finding signatures of selection for production and adaptation in cattle breeds. The VarLD method was applied to compare the LD variation in the autosomal genome between breeds, including Angus and Brown Swiss, representing taurine breeds, and Nelore and Gir, representing indicine breeds. Genomic regions containing the top 0.01 and 0.1 percentile of signals were characterized using the UMD3.1 Bos taurus genome assembly to identify genes in those regions and compared with previously reported selection signatures and regions with copy number variation. Results For all comparisons, the top 0.01 and 0.1 percentile included 26 and 165 signals and 17 and 125 genes, respectively, including TECRL, BT.23182 or FPPS, CAST, MYOM1, UVRAG and DNAJA1. Conclusions The VarLD method is a powerful tool to identify differences in linkage disequilibrium between cattle populations and putative signatures of selection with potential adaptive and productive importance. PMID:24592996

  11. Use of a problem-based learning teaching model for undergraduate medical and nursing education: a systematic review and meta-analysis

    PubMed Central

    Sayyah, Mehdi; Shirbandi, Kiarash; Saki-Malehi, Amal; Rahim, Fakher

    2017-01-01

    Objectives The aim of this systematic review and meta-analysis was to evaluate the problem-based learning (PBL) method as an alternative to conventional educational methods in Iranian undergraduate medical courses. Materials and methods We systematically searched international databases, including PubMed, Scopus, and Embase, and national Iranian resources, including MagirIran, IranMedex, IranDoc, and the Scientific Information Database (SID), using appropriate search terms, such as “PBL”, “problem-based learning”, “based on problems”, “active learning”, and “learner centered”, to identify PBL studies, and these were combined with other key terms such as “medical”, “undergraduate”, “Iranian”, “Islamic Republic of Iran”, “I.R. of Iran”, and “Iran”. The search included the period from 1980 to 2016 with no language limits. Results Overall, a total of 1,057 relevant studies were initially found, of which 21 studies were included in the systematic review and meta-analysis. Of the 21 studies, 12 (57.14%) had a high methodological quality. Considering the pooled effect size data, there was a significant difference in the scores (standardized mean difference [SMD]=0.80, 95% CI [0.52, 1.08], P<0.000) in favor of PBL, compared with the lecture-based method. Subgroup analysis revealed that using PBL alone is more favorable compared to using a mixed model with other learning methods such as lecture-based learning (LBL). Conclusion The results of this systematic review showed that using PBL may have a positive effect on the academic achievement of undergraduate medical courses. The results suggest that teachers and medical education decision makers pay more attention to using this method for effective and proper training. PMID:29042827
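
    The pooled effect reported above is a random-effects meta-analysis of standardized mean differences. As a hedged illustration of that calculation only, the Python sketch below pools invented per-study SMDs with the DerSimonian-Laird estimator; the numbers are not data from the review.

      # Random-effects pooling of standardized mean differences (DerSimonian-Laird).
      # The per-study SMDs and variances below are illustrative, not from the review.
      import numpy as np

      smd = np.array([0.6, 1.1, 0.4, 0.9])
      var = np.array([0.04, 0.09, 0.05, 0.06])

      # Fixed-effect weights and between-study variance tau^2.
      w_fixed = 1.0 / var
      fe_mean = np.sum(w_fixed * smd) / np.sum(w_fixed)
      q = np.sum(w_fixed * (smd - fe_mean) ** 2)
      c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
      tau2 = max(0.0, (q - (len(smd) - 1)) / c)

      # Random-effects pooled SMD and 95% confidence interval.
      w_re = 1.0 / (var + tau2)
      pooled = np.sum(w_re * smd) / np.sum(w_re)
      se = np.sqrt(1.0 / np.sum(w_re))
      print(f"pooled SMD = {pooled:.2f}, 95% CI [{pooled - 1.96*se:.2f}, {pooled + 1.96*se:.2f}]")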

  12. Application of differential transformation method for solving dengue transmission mathematical model

    NASA Astrophysics Data System (ADS)

    Ndii, Meksianis Z.; Anggriani, Nursanti; Supriatna, Asep K.

    2018-03-01

    The differential transformation method (DTM) is a semi-analytical numerical technique which depends on Taylor series and has application in many areas including Biomathematics. The aim of this paper is to employ the differential transformation method (DTM) to solve system of non-linear differential equations for dengue transmission mathematical model. Analytical and numerical solutions are determined and the results are compared to that of Runge-Kutta method. We found a good agreement between DTM and Runge-Kutta method.
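
    A minimal Python sketch of the DTM idea on a toy decay equation, not the full dengue model: the differential transform recurrence generates Taylor coefficients, and the truncated series is compared with a Runge-Kutta reference from SciPy. The equation, coefficients and interval are illustrative assumptions.

      # Illustrative sketch of the differential transformation method (DTM) on the
      # toy equation dy/dt = -a*y, y(0) = 1 -- not the full dengue model.  The DTM
      # recurrence here is Y[k+1] = -a*Y[k]/(k+1), and the solution is recovered as
      # the truncated Taylor series y(t) ~ sum_k Y[k] * t**k.
      import numpy as np
      from scipy.integrate import solve_ivp

      a, y0, order = 0.5, 1.0, 12

      Y = np.zeros(order + 1)
      Y[0] = y0
      for k in range(order):
          Y[k + 1] = -a * Y[k] / (k + 1)

      def dtm_solution(t):
          return sum(Y[k] * t**k for k in range(order + 1))

      # Reference solution with an explicit Runge-Kutta method (RK45).
      t_eval = np.linspace(0.0, 4.0, 9)
      rk = solve_ivp(lambda t, y: -a * y, (0.0, 4.0), [y0], t_eval=t_eval)

      for t, y_rk in zip(t_eval, rk.y[0]):
          print(f"t={t:4.1f}  DTM={dtm_solution(t):.6f}  RK45={y_rk:.6f}")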

  13. Robust sleep quality quantification method for a personal handheld device.

    PubMed

    Shin, Hangsik; Choi, Byunghun; Kim, Doyoon; Cho, Jaegeol

    2014-06-01

    The purpose of this study was to develop and validate a novel method for sleep quality quantification using personal handheld devices. The proposed method used 3- or 6-axes signals, including acceleration and angular velocity, obtained from built-in sensors in a smartphone and applied a real-time wavelet denoising technique to minimize the nonstationary noise. Sleep or wake status was decided on each axis, and the totals were finally summed to calculate sleep efficiency (SE), regarded as sleep quality in general. The sleep experiment was carried out for performance evaluation of the proposed method, and 14 subjects participated. An experimental protocol was designed for comparative analysis. The activity during sleep was recorded not only by the proposed method but also by well-known commercial applications simultaneously; moreover, activity was recorded on different mattresses and locations to verify the reliability in practical use. Every calculated SE was compared with the SE of a clinically certified medical device, the Philips (Amsterdam, The Netherlands) Actiwatch. In these experiments, the proposed method proved its reliability in quantifying sleep quality. Compared with the Actiwatch, accuracy and average bias error of SE calculated by the proposed method were 96.50% and -1.91%, respectively. The proposed method was vastly superior to other comparative applications with at least 11.41% in average accuracy and at least 6.10% in average bias; average accuracy and average absolute bias error of comparative applications were 76.33% and 17.52%, respectively.
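
    A simplified Python sketch of the sleep-efficiency calculation described above, with synthetic epoch-level activity and an arbitrary movement threshold; the wavelet-denoising stage and the exact per-axis decision rule of the proposed method are not reproduced.

      # Simplified sketch of the sleep-efficiency (SE) calculation: each motion axis
      # is thresholded into movement/no-movement per epoch and the per-axis votes are
      # combined into a sleep/wake decision.  The signals are synthetic and the
      # wavelet-denoising stage of the proposed method is omitted.
      import numpy as np

      rng = np.random.default_rng(3)
      n_epochs, n_axes = 480, 3                  # e.g. 8 h of 60-s epochs, 3 accelerometer axes
      activity = rng.gamma(shape=0.5, scale=1.0, size=(n_epochs, n_axes))
      activity[100:130] += 5.0                   # an injected restless (wake) period

      threshold = 2.0                            # movement level separating wake from sleep (assumed)
      wake_votes = (activity > threshold).sum(axis=1)
      asleep = wake_votes < 2                    # sleep if fewer than 2 axes show movement

      sleep_efficiency = 100.0 * asleep.mean()
      print(f"sleep efficiency = {sleep_efficiency:.1f}%")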

  14. A comparative study of three cytotoxicity test methods for nanomaterials using sodium lauryl sulfate.

    PubMed

    Kwon, Jae-Sung; Kim, Kwang-Mahn; Kim, Kyoung-Nam

    2014-10-01

    The biocompatibility evaluation of nanomaterials is essential for their medical diagnostic and therapeutic usage, where a cytotoxicity test is the simplest form of biocompatibility evaluation. Three methods have been commonly used in previous studies for the cytotoxicity testing of nanomaterials: trypan blue exclusion, colorimetric assay using water soluble tetrazolium (WST), and imaging under a microscope following calcein AM/ethidium homodimer-1 staining. However, there has yet to be a study to compare each method. Therefore, in this study three methods were compared using the standard reference material of sodium lauryl sulfate (SLS). Each method of the cytotoxicity test was carried out using mouse fibroblasts of L-929 exposed to different concentrations of SLS. Compared to the gold standard trypan blue exclusion test, both colorimetric assay using water soluble tetrazolium (WST) and imaging under microscope with calcein AM/ethidium homodimer-1 staining showed results that were not statistically different. Also, each method exhibited various advantages and disadvantages, which included the need of equipment, time taken for the experiment, and provision of additional information such as cell morphology. Therefore, this study concludes that all three methods of cytotoxicity testing may be valid, though careful consideration will be needed when selecting tests with regard to time, finances, and the amount of information required by the researcher(s).

  15. Finite difference and Runge-Kutta methods for solving vibration problems

    NASA Astrophysics Data System (ADS)

    Lintang Renganis Radityani, Scolastika; Mungkasi, Sudi

    2017-11-01

    The vibration of a storey building can be modelled into a system of second order ordinary differential equations. If the number of floors of a building is large, then the result is a large scale system of second order ordinary differential equations. The large scale system is difficult to solve, and if it can be solved, the solution may not be accurate. Therefore, in this paper, we seek for accurate methods for solving vibration problems. We compare the performance of numerical finite difference and Runge-Kutta methods for solving large scale systems of second order ordinary differential equations. The finite difference methods include the forward and central differences. The Runge-Kutta methods include the Euler and Heun methods. Our research results show that the central finite difference and the Heun methods produce more accurate solutions than the forward finite difference and the Euler methods do.
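
    A minimal Python sketch comparing the Euler and Heun methods on a single-degree-of-freedom vibration equation, a stand-in for the large storey-building systems discussed in the paper; the frequency, step size and initial conditions are illustrative assumptions.

      # Minimal comparison of the Euler and Heun (improved Euler) methods on a
      # single-degree-of-freedom vibration equation x'' = -w**2 * x, rewritten as a
      # first-order system; frequency, step size and initial state are assumptions.
      import numpy as np

      w = 2.0 * np.pi                 # natural frequency (rad/s)
      x0, v0 = 1.0, 0.0               # initial displacement and velocity
      dt, t_end = 0.001, 2.0

      def f(state):
          x, v = state
          return np.array([v, -w**2 * x])

      def euler_step(s, h):
          return s + h * f(s)

      def heun_step(s, h):
          k1 = f(s)
          k2 = f(s + h * k1)
          return s + 0.5 * h * (k1 + k2)

      def integrate(step, state):
          for _ in range(int(round(t_end / dt))):
              state = step(state, dt)
          return state

      exact = x0 * np.cos(w * t_end)
      for name, step in [("Euler", euler_step), ("Heun", heun_step)]:
          x_num = integrate(step, np.array([x0, v0]))[0]
          print(f"{name:5s}: x({t_end}) = {x_num:+.6f}   exact = {exact:+.6f}")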

  16. Clinical outcomes of using lasers for peri-implantitis surface detoxification: a systematic review and meta-analysis.

    PubMed

    Mailoa, James; Lin, Guo-Hao; Chan, Hsun-Liang; MacEachern, Mark; Wang, Hom-Lay

    2014-09-01

    The aim of this systematic review is to compare the clinical outcomes of lasers with other commonly applied detoxification methods for treating peri-implantitis. An electronic search of four databases and a hand search of peer-reviewed journals for relevant articles were conducted. Comparative human clinical trials and case series with ≥ 6 months of follow-up in ≥ 10 patients with peri-implantitis treated with lasers were included. Additionally, animal studies applying lasers for treating peri-implantitis were also included. The included studies had to report probing depth (PD) reduction after the therapy. Seven human prospective clinical trials and two animal studies were included. In four and three human studies, lasers were accompanied with surgical and non-surgical treatments, respectively. The meta-analyses showed an overall weighted mean difference of 0.00 mm (95% confidence interval = -0.18 to 0.19 mm) PD reduction between the laser and conventional treatment groups (P = 0.98) for non-surgical intervention. In animal studies, laser-treated rough-surface implants had a higher percentage of bone-to-implant contact than smooth-surface implants. In a short-term follow-up, lasers resulted in similar PD reduction when compared with conventional implant surface decontamination methods.

  17. Computer-Assisted, Programmed Text, and Lecture Modes of Instruction in Three Medical Training Courses: Comparative Evaluation. Final Report.

    ERIC Educational Resources Information Center

    Deignan, Gerard M.; And Others

    This report contains a comparative analysis of the differential effectiveness of computer-assisted instruction (CAI), programmed instructional text (PIT), and lecture methods of instruction in three medical courses--Medical Laboratory, Radiology, and Dental. The summative evaluation includes (1) multiple regression analyses conducted to predict…

  18. Exercise and Cardiometabolic Risk Factors in Graduate Students: A Longitudinal, Observational Study

    ERIC Educational Resources Information Center

    Racette, Susan B.; Inman, Cindi L.; Clark, B. Ruth; Royer, Nathaniel K.; Steger-May, Karen; Deusinger, Susan S.

    2014-01-01

    Objective: To evaluate cardiometabolic risk of students longitudinally and compare them with age-matched national samples. Participants: Participants are 134 graduate students enrolled between August 2005 and May 2010. Methods: Students were assessed at the beginning and end of their 3-year curriculum. Comparative samples included 966 National…

  19. Measuring Person-Centered Care: A Critical Comparative Review of Published Tools

    ERIC Educational Resources Information Center

    Edvardsson, David; Innes, Anthea

    2010-01-01

    Purpose of the study: To present a critical comparative review of published tools measuring the person-centeredness of care for older people and people with dementia. Design and Methods: Included tools were identified by searches of PubMed, Cinahl, the Bradford Dementia Group database, and authors' files. The terms "Person-centered,"…

  20. The Effect of Computer Games on Students' Critical Thinking Disposition and Educational Achievement

    ERIC Educational Resources Information Center

    Seifi, Mohammad; Derikvandi, Zahra; Moosavipour, Saeed; Khodabandelou, Rouhollah

    2015-01-01

    The main aim of this research was to investigate the effect of computer games on students' critical thinking disposition and educational achievement. The research method was descriptive, and its type was causal-comparative. The sample included 270 female high school students in Andimeshk town selected by a multistage cluster method. Ricketts…

  1. Three Interaction Patterns on Asynchronous Online Discussion Behaviours: A Methodological Comparison

    ERIC Educational Resources Information Center

    Jo, I.; Park, Y.; Lee, H.

    2017-01-01

    An asynchronous online discussion (AOD) is one format of instructional methods that facilitate student-centered learning. In the wealth of AOD research, this study evaluated how students' behavior on AOD influences their academic outcomes. This case study compared the differential analytic methods including web log mining, social network analysis…

  2. An Empirical Review of Research Methodologies and Methods in Creativity Studies (2003-2012)

    ERIC Educational Resources Information Center

    Long, Haiying

    2014-01-01

    Based on the data collected from 5 prestigious creativity journals, research methodologies and methods of 612 empirical studies on creativity, published between 2003 and 2012, were reviewed and compared to those in gifted education. Major findings included: (a) Creativity research was predominantly quantitative and psychometrics and experiment…

  3. A Comparison of Conventional Linear Regression Methods and Neural Networks for Forecasting Educational Spending.

    ERIC Educational Resources Information Center

    Baker, Bruce D.; Richards, Craig E.

    1999-01-01

    Applies neural network methods for forecasting 1991-95 per-pupil expenditures in U.S. public elementary and secondary schools. Forecasting models included the National Center for Education Statistics' multivariate regression model and three neural architectures. Regarding prediction accuracy, neural network results were comparable or superior to…

  4. A Comparison of Isotonic, Isokinetic, and Plyometric Training Methods for Vertical Jump Improvement.

    ERIC Educational Resources Information Center

    Miller, Christine D.

    This annotated bibliography documents three training methods used to develop vertical jumping ability and power: isotonic, isokinetic, and plyometric training. Research findings on all three forms of training are summarized and compared. A synthesis of conclusions drawn from the annotated writings is presented. The report includes a glossary of…

  5. Impacts of Contextual and Explicit Instruction on Preservice Elementary Teachers' Understandings of the Nature of Science

    ERIC Educational Resources Information Center

    Bell, Randy L.; Matkins, Juanita Jo; Gansneder, Bruce M.

    2011-01-01

    This mixed-methods investigation compared the relative impacts of instructional approach and context of nature of science instruction on preservice elementary teachers' understandings. The sample consisted of 75 preservice teachers enrolled in four sections of an elementary science methods course. Independent variables included instructional…

  6. COMPARISON OF MEMBRANE FILTER, MULTIPLE-FERMENTATION-TUBE, AND PRESENCE-ABSENCE TECHNIQUES FOR DETECTING TOTAL COLIFORMS IN SMALL COMMUNITY WATER SYSTEMS

    EPA Science Inventory

    Methods for detecting total coliform bacteria in drinking water were compared using 1483 different drinking water samples from 15 small community water systems in Vermont and New Hampshire. The methods included the membrane filter (MF) technique, a ten tube fermentation tube tech...

  7. New generation all-silica based optical elements for high power laser systems

    NASA Astrophysics Data System (ADS)

    Tolenis, T.; Grinevičiūtė, L.; Melninkaitis, A.; Selskis, A.; Buzelis, R.; Mažulė, L.; Drazdys, R.

    2017-08-01

    Laser resistance of optical elements is one of the major topics in photonics. Various routes have been taken to improve optical coatings, including, but not limited to, materials engineering and optimisation of the electric field distribution in multilayers. Over decades of research it was found that high band-gap materials, such as silica, are highly resistant to laser light. Unfortunately, only anti-reflection coatings made entirely of silica have been demonstrated to date. A novel materials-engineering route is presented that is capable of producing high-reflection optical elements using only SiO2 and the GLancing Angle Deposition (GLAD) method. The technique involves depositing a columnar structure and tailoring the refractive index of the silica material throughout the coating thickness. Numerous analyses indicate the superior properties of GLAD coatings when compared with standard methods for Bragg mirror production. Several groups of optical components are presented, including anti-reflection coatings and Bragg mirrors. Structural and optical characterisation of the coatings has been performed and compared with standard methods. All results indicate the potential of new-generation coatings for high power laser systems.

  8. Paracoccidioidomycosis: Current Perspectives from Brazil

    PubMed Central

    Mendes, Rinaldo Poncio; Cavalcante, Ricardo de Souza; Marques, Sílvio Alencar; Marques, Mariângela Esther Alencar; Venturini, James; Sylvestre, Tatiane Fernanda; Paniago, Anamaria Mello Miranda; Pereira, Ana Carla; da Silva, Julhiany de Fátima; Fabro, Alexandre Todorovic; Bosco, Sandra de Moraes Gimenes; Bagagli, Eduardo; Hahn, Rosane Christine; Levorato, Adriele Dandara

    2017-01-01

    Background: This review article summarizes and updates the knowledge on paracoccidioidomycosis, covering P. lutzii and the cryptic species of P. brasiliensis and their geographical distribution in Latin America, which explains the difficulties observed in serological diagnosis. Objectives: Emphasis has been placed on some genetic factors as predisposing conditions for paracoccidioidomycosis. Veterinary aspects were addressed, showing the wide distribution of infection among animals. Cell-mediated immunity was better characterized, incorporating recent findings. Methods: Serological methods for diagnosis were also compared for their parameters of accuracy, including the analysis of relapse. Results: Clinical forms have been better classified in order to include the pictures observed less frequently. Conclusion: Itraconazole and the trimethoprim-sulfamethoxazole combination were compared regarding efficacy, effectiveness and safety, demonstrating that the azole should be the first choice in the treatment of paracoccidioidomycosis. PMID:29204222

  9. Behavior Therapy for Tourette Syndrome: A Systematic Review and Meta-analysis.

    PubMed

    Wile, Daryl J; Pringsheim, Tamara M

    2013-08-01

    When tics caused by Tourette Syndrome cause meaningful impairment for patients, a comprehensive treatment approach includes education of patients, peers, and family, treatment of comorbid behavioral disorders if present, and consideration of behavior therapy and pharmacotherapy for tics themselves. This systematic review and meta-analysis demonstrates that behavior therapies based on Habit Reversal Therapy, including the Comprehensive Behavioral Intervention for Tics are effective in reducing tic severity when compared with supportive psychotherapy. When these behavior therapies are unavailable, Exposure with Response Prevention may also be effective. Both face-to-face and telehealth delivery methods for behavior therapy improve tic severity, and broader distribution of behavior therapy through increased training or telehealth methods is encouraged. High-quality randomized trials comparing behavior therapies for tics with pharmacotherapy are needed.

  10. [Comparative analysis between diatom nitric acid digestion method and plankton 16S rDNA PCR method].

    PubMed

    Han, Jun-ge; Wang, Cheng-bao; Li, Xing-biao; Fan, Yan-yan; Feng, Xiang-ping

    2013-10-01

    To compare and explore the application value of the diatom nitric acid digestion method and the plankton 16S rDNA PCR method for drowning identification. Forty drowning cases from 2010 to 2011 were collected from the Department of Forensic Medicine of Wenzhou Medical University. Samples including lung, kidney, liver and field water from each case were tested with the diatom nitric acid digestion method and the plankton 16S rDNA PCR method, respectively. The diatom nitric acid digestion method and the plankton 16S rDNA PCR method required 20 g and 2 g of each organ, and 15 mL and 1.5 mL of field water, respectively. The inspection time and detection rate were compared between the two methods. The diatom nitric acid digestion method mainly detected two groups of diatoms, Centricae and Pennatae, while the plankton 16S rDNA PCR method amplified a 162 bp band. The average inspection time per case was (95.30 +/- 2.78) min for the diatom nitric acid digestion method, less than the (325.33 +/- 14.18) min of the plankton 16S rDNA PCR method (P < 0.05). The detection rates of the two methods for field water and lung were both 100%. For liver and kidney, the detection rate of the plankton 16S rDNA PCR method was 80% for both, higher than the 40% and 30% of the diatom nitric acid digestion method (P < 0.05), respectively. The laboratory testing method needs to be selected appropriately according to the specific circumstances of the forensic appraisal of drowning. Compared with the diatom nitric acid digestion method, the plankton 16S rDNA PCR method has practical value, with advantages including smaller sample quantities, richer information and higher specificity.

  11. Analysis of baseline, average, and longitudinally measured blood pressure data using linear mixed models.

    PubMed

    Hossain, Ahmed; Beyene, Joseph

    2014-01-01

    This article compares baseline, average, and longitudinal data analysis methods for identifying genetic variants in a genome-wide association study using the Genetic Analysis Workshop 18 data. We apply methods that include (a) linear mixed models with baseline measures, (b) random intercept linear mixed models with mean measures outcome, and (c) random intercept linear mixed models with longitudinal measurements. In the linear mixed models, covariates are included as fixed effects, whereas relatedness among individuals is incorporated as the variance-covariance structure of the random effect for the individuals. The overall strategy of applying linear mixed models to decorrelate the data is based on Aulchenko et al.'s GRAMMAR. By analyzing systolic and diastolic blood pressure, which are used separately as outcomes, we compare the 3 methods in identifying a known genetic variant that is associated with blood pressure from chromosome 3 and simulated phenotype data. We also analyze the real phenotype data to illustrate the methods. We conclude that the linear mixed model with longitudinal measurements of diastolic blood pressure is the most accurate at identifying the known single-nucleotide polymorphism among the methods, but linear mixed models with baseline measures perform best with systolic blood pressure as the outcome.
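
    A hedged Python sketch of analysis (a) above, a random-intercept linear mixed model fitted to baseline measures with statsmodels; the file and column names are hypothetical stand-ins for the GAW18 data, and a per-pedigree random intercept is used as a simplification of the kinship-based variance-covariance structure described in the abstract.

      # Hedged sketch of analysis (a): random-intercept linear mixed model on
      # baseline measures.  The file name and the columns "sbp", "snp_dosage",
      # "age", "sex" and "pedigree_id" are hypothetical stand-ins for the GAW18
      # data; a per-pedigree random intercept approximates relatedness.
      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("gaw18_baseline.csv")          # hypothetical input file

      model = smf.mixedlm("sbp ~ snp_dosage + age + sex", data=df,
                          groups=df["pedigree_id"])
      result = model.fit()
      print(result.summary())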

  12. Sociological analysis and comparative education

    NASA Astrophysics Data System (ADS)

    Woock, Roger R.

    1981-12-01

    It is argued that comparative education is essentially a derivative field of study, in that it borrows theories and methods from academic disciplines. After a brief humanistic phase, in which history and philosophy were central for comparative education, sociology became an important source. In the mid-50's and 60's, sociology in the United States was characterised by Structural Functionalism as a theory, and Social Survey as a dominant methodology. Both were incorporated into the development of comparative education. Increasingly in the 70's, and certainly today, the new developments in sociology are characterised by an attack on Positivism, which is seen as the philosophical position underlying both functionalism and survey methods. New or re-discovered theories with their attendant methodologies included Marxism, Phenomenological Sociology, Critical Theory, and Historical Social Science. The current relationship between comparative education and social science is one of uncertainty, but since social science is seen to be returning to its European roots, the hope is held out for the development of an integrated social theory and method which will provide a much stronger basis for developments in comparative education.

  13. Local Intrinsic Dimension Estimation by Generalized Linear Modeling.

    PubMed

    Hino, Hideitsu; Fujiki, Jun; Akaho, Shotaro; Murata, Noboru

    2017-07-01

    We propose a method for intrinsic dimension estimation. By fitting the power of distance from an inspection point and the number of samples included inside a ball with a radius equal to the distance, to a regression model, we estimate the goodness of fit. Then, by using the maximum likelihood method, we estimate the local intrinsic dimension around the inspection point. The proposed method is shown to be comparable to conventional methods in global intrinsic dimension estimation experiments. Furthermore, we experimentally show that the proposed method outperforms a conventional local dimension estimation method.
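
    A simplified Python sketch of the idea behind the estimator: around an inspection point, the number of samples within radius r grows roughly like r raised to the local dimension, so the log-log slope estimates that dimension. This least-squares version is a stand-in for the authors' generalized-linear-model and maximum-likelihood formulation, with invented data.

      # Simplified sketch: around an inspection point the number of samples inside a
      # ball of radius r grows roughly like r**d, so the slope of log(count) versus
      # log(r) estimates the local dimension d.  A least-squares stand-in for the
      # authors' GLM / maximum-likelihood estimator, on simulated data.
      import numpy as np

      rng = np.random.default_rng(0)
      data = rng.normal(size=(2000, 3)) @ rng.normal(size=(3, 10))  # 3-D manifold in 10-D space

      def local_dimension(data, point, radii):
          dists = np.linalg.norm(data - point, axis=1)
          counts = np.array([(dists <= r).sum() for r in radii])
          slope, _ = np.polyfit(np.log(radii), np.log(counts), 1)
          return slope

      point = data[0]
      radii = np.quantile(np.linalg.norm(data - point, axis=1), [0.02, 0.05, 0.1, 0.2])
      print(f"estimated local dimension ~ {local_dimension(data, point, radii):.2f}")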

  14. Random forests for classification in ecology

    USGS Publications Warehouse

    Cutler, D.R.; Edwards, T.C.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J.

    2007-01-01

    Classification procedures are some of the most widely used statistical methods in ecology. Random forests (RF) is a new and powerful statistical classifier that is well established in other disciplines but is relatively unknown in ecology. Advantages of RF compared to other statistical classifiers include (1) very high classification accuracy; (2) a novel method of determining variable importance; (3) ability to model complex interactions among predictor variables; (4) flexibility to perform several types of statistical data analysis, including regression, classification, survival analysis, and unsupervised learning; and (5) an algorithm for imputing missing values. We compared the accuracies of RF and four other commonly used statistical classifiers using data on invasive plant species presence in Lava Beds National Monument, California, USA, rare lichen species presence in the Pacific Northwest, USA, and nest sites for cavity nesting birds in the Uinta Mountains, Utah, USA. We observed high classification accuracy in all applications as measured by cross-validation and, in the case of the lichen data, by independent test data, when comparing RF to other common classification methods. We also observed that the variables that RF identified as most important for classifying invasive plant species coincided with expectations based on the literature. ?? 2007 by the Ecological Society of America.
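
    A minimal scikit-learn sketch of the workflow described above: fit a random forest, report cross-validated and out-of-bag accuracy, and inspect variable importances. A bundled example dataset stands in for the species-presence data, which are not reproduced here.

      # Random forest classification with cross-validated accuracy, out-of-bag
      # accuracy, and variable importances; a bundled scikit-learn dataset stands
      # in for the ecological presence/absence data.
      from sklearn.datasets import load_breast_cancer
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      X, y = load_breast_cancer(return_X_y=True)

      rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
      print("cross-validated accuracy:", cross_val_score(rf, X, y, cv=5).mean())

      rf.fit(X, y)
      print("out-of-bag accuracy:", rf.oob_score_)

      # Variable importance, one of the advantages of RF noted in the abstract.
      top = sorted(enumerate(rf.feature_importances_), key=lambda t: -t[1])[:5]
      print("top predictors (index, importance):", top)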

  15. Multiconfiguration pair-density functional theory for doublet excitation energies and excited state geometries: the excited states of CN.

    PubMed

    Bao, Jie J; Gagliardi, Laura; Truhlar, Donald G

    2017-11-15

    Multiconfiguration pair-density functional theory (MC-PDFT) is a post multiconfiguration self-consistent field (MCSCF) method with similar performance to complete active space second-order perturbation theory (CASPT2) but with greater computational efficiency. Cyano radical (CN) is a molecule whose spectrum is well established from experiments and whose excitation energies have been used as a testing ground for theoretical methods to treat excited states of open-shell systems, which are harder and much less studied than excitation energies of closed-shell singlets. In the present work, we studied the adiabatic excitation energies of CN with MC-PDFT. Then we compared this multireference (MR) method to some single-reference (SR) methods, including time-dependent density functional theory (TDDFT) and completely renormalized equation-of-motion coupled-cluster theory with singles, doubles and noniterative triples [CR-EOM-CCSD(T)]; we also compared to some other MR methods, including configuration interaction singles and doubles (MR-CISD) and multistate CASPT2 (MS-CASPT2). Through a comparison between SR and MR methods, we achieved a better appreciation of the need to use MR methods to accurately describe higher excited states, and we found that among the MR methods, MC-PDFT stands out for its accuracy for the first four states out of the five doublet states studied in this paper; this shows its efficiency for calculating doublet excited states.

  16. Comparing and combining biomarkers as principal surrogates for time-to-event clinical endpoints.

    PubMed

    Gabriel, Erin E; Sachs, Michael C; Gilbert, Peter B

    2015-02-10

    Principal surrogate endpoints are useful as targets for phase I and II trials. In many recent trials, multiple post-randomization biomarkers are measured. However, few statistical methods exist for comparison of or combination of biomarkers as principal surrogates, and none of these methods to our knowledge utilize time-to-event clinical endpoint information. We propose a Weibull model extension of the semi-parametric estimated maximum likelihood method that allows for the inclusion of multiple biomarkers in the same risk model as multivariate candidate principal surrogates. We propose several methods for comparing candidate principal surrogates and evaluating multivariate principal surrogates. These include the time-dependent and surrogate-dependent true and false positive fraction, the time-dependent and the integrated standardized total gain, and the cumulative distribution function of the risk difference. We illustrate the operating characteristics of our proposed methods in simulations and outline how these statistics can be used to evaluate and compare candidate principal surrogates. We use these methods to investigate candidate surrogates in the Diabetes Control and Complications Trial. Copyright © 2014 John Wiley & Sons, Ltd.

  17. System and method for diagnosing EGR performance using NOx sensor

    DOEpatents

    Mazur, Christopher John

    2003-12-23

    A method and system for diagnosing a condition of an EGR valve used in an engine system. The EGR valve controls the portion of exhaust gases produced by such engine system and fed back to an intake of such engine system. The engine system includes a NOx sensor for measuring NOx in such exhaust. The method includes: determining a time rate of change in NOx measured by the NOx sensor; comparing the determined time rate of change in the measured NOx with a predetermined expected time rate of change in measured NOx; and determining the condition of the EGR valve as a function of such comparison. The method also includes: determining from NOx measured by the NOx sensor and engine operating conditions indications of instances when samples of such measured NOx are greater than an expected maximum NOx level for such engine condition and less than an expected minimum NOx level for such engine condition; and determining the condition of the EGR valve as a function of a statistical analysis of such indications. The method includes determining whether the NOx sensor is faulty and wherein the EGR condition determining includes determining whether the NOx sensor is faulty.
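
    A hedged Python sketch of the two diagnostic ideas in the claim: compare the measured time rate of change of NOx against an expected rate, and count samples outside the expected NOx band for the operating condition. The signal, thresholds and decision limits are illustrative, not calibrated values from the patent.

      # Illustrative EGR diagnostic check: (1) rate-of-change comparison and
      # (2) out-of-band sample counting.  All numbers below are assumptions.
      import numpy as np

      dt = 0.1                                    # sampling interval, s
      nox = np.array([210, 212, 216, 224, 238, 255, 273, 290], dtype=float)  # measured NOx, ppm

      measured_rate = np.gradient(nox, dt)        # time rate of change of measured NOx
      expected_rate = 40.0                        # expected ppm/s for this transient (assumed)
      rate_fault = np.abs(measured_rate).max() > 2.0 * expected_rate

      expected_min, expected_max = 180.0, 260.0   # expected NOx band for this operating point (assumed)
      out_of_band = np.mean((nox < expected_min) | (nox > expected_max))
      band_fault = out_of_band > 0.25             # simple statistical criterion (assumed)

      print("EGR valve suspect:", bool(rate_fault or band_fault))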

  18. Linnorm: improved statistical analysis for single cell RNA-seq expression data

    PubMed Central

    Yip, Shun H.; Wang, Panwen; Kocher, Jean-Pierre A.; Sham, Pak Chung

    2017-01-01

    Linnorm is a novel normalization and transformation method for the analysis of single cell RNA sequencing (scRNA-seq) data. Linnorm is developed to remove technical noise and simultaneously preserve biological variation in scRNA-seq data, such that existing statistical methods can be improved. Using real scRNA-seq data, we compared Linnorm with existing normalization methods, including NODES, SAMstrt, SCnorm, scran, DESeq and TMM. Linnorm shows advantages in speed, technical noise removal and preservation of cell heterogeneity, which can improve existing methods in the discovery of novel subtypes, pseudo-temporal ordering of cells, clustering analysis, etc. Linnorm also performs better than existing DEG analysis methods, including BASiCS, NODES, SAMstrt, Seurat and DESeq2, in false positive rate control and accuracy. PMID:28981748

  19. Inspection system calibration methods

    DOEpatents

    Deason, Vance A.; Telschow, Kenneth L.

    2004-12-28

    An inspection system calibration method includes producing two sideband signals of a first wavefront; interfering the two sideband signals in a photorefractive material, producing an output signal therefrom having a frequency and a magnitude; and producing a phase modulated operational signal having a frequency different from the output signal frequency, a magnitude, and a phase modulation amplitude. The method includes determining a ratio of the operational signal magnitude to the output signal magnitude, determining a ratio of a 1st order Bessel function of the operational signal phase modulation amplitude to a 0th order Bessel function of the operational signal phase modulation amplitude, and comparing the magnitude ratio to the Bessel function ratio.
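
    A minimal numerical sketch of the final comparison step using SciPy's Bessel functions: the ratio of the operational-signal magnitude to the output-signal magnitude is compared with J1(A)/J0(A) evaluated at the phase-modulation amplitude A. The magnitudes and amplitude below are placeholder assumptions, not values from the patent.

      # Comparison of the measured magnitude ratio with the Bessel function ratio
      # J1(A)/J0(A); all three input values are placeholder assumptions.
      from scipy.special import j0, j1

      phase_mod_amplitude = 0.8        # A, radians (assumed)
      operational_magnitude = 0.52     # magnitude of the phase-modulated operational signal (assumed)
      output_magnitude = 1.10          # magnitude of the photorefractive output signal (assumed)

      magnitude_ratio = operational_magnitude / output_magnitude
      bessel_ratio = j1(phase_mod_amplitude) / j0(phase_mod_amplitude)

      print(f"magnitude ratio = {magnitude_ratio:.3f}")
      print(f"J1(A)/J0(A)     = {bessel_ratio:.3f}")
      print(f"difference      = {abs(magnitude_ratio - bessel_ratio):.3f}")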

  20. Development of new methodologies for evaluating the energy performance of new commercial buildings

    NASA Astrophysics Data System (ADS)

    Song, Suwon

    The concept of Measurement and Verification (M&V) of a new building continues to become more important because efficient design alone is often not sufficient to deliver an efficient building. Simulation models that are calibrated to measured data can be used to evaluate the energy performance of new buildings if they are compared to energy baselines such as similar buildings, energy codes, and design standards. Unfortunately, there is a lack of detailed M&V methods and analysis methods to measure energy savings from new buildings that would have hypothetical energy baselines. Therefore, this study developed and demonstrated several new methodologies for evaluating the energy performance of new commercial buildings using a case-study building in Austin, Texas. First, three new M&V methods were developed to enhance the previous generic M&V framework for new buildings, including: (1) The development of a method to synthesize weather-normalized cooling energy use from a correlation of Motor Control Center (MCC) electricity use when chilled water use is unavailable, (2) The development of an improved method to analyze measured solar transmittance against incidence angle for sample glazing using different solar sensor types, including Eppley PSP and Li-Cor sensors, and (3) The development of an improved method to analyze chiller efficiency and operation at part-load conditions. Second, three new calibration methods were developed and analyzed, including: (1) A new percentile analysis added to the previous signature method for use with a DOE-2 calibration, (2) A new analysis to account for undocumented exhaust air in DOE-2 calibration, and (3) An analysis of the impact of synthesized direct normal solar radiation using the Erbs correlation on DOE-2 simulation. Third, an analysis of the actual energy savings compared to three different energy baselines was performed, including: (1) Energy Use Index (EUI) comparisons with sub-metered data, (2) New comparisons against Standards 90.1-1989 and 90.1-2001, and (3) A new evaluation of the performance of selected Energy Conservation Design Measures (ECDMs). Finally, potential energy savings were also simulated from selected improvements, including: minimum supply air flow, undocumented exhaust air, and daylighting.

  1. Capturing User Reading Behaviors for Personalized Document Summarization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Songhua; Jiang, Hao; Lau, Francis

    2011-01-01

    We propose a new personalized document summarization method that observes a user's personal reading preferences. These preferences are inferred from the user's reading behaviors, including facial expressions, gaze positions, and reading durations that were captured during the user's past reading activities. We compare the performance of our algorithm with that of a few peer algorithms and software packages. The results of our comparative study show that our algorithm produces superior personalized document summaries compared with all the other methods, in that the summaries it generates better satisfy a user's personal preferences.

  2. Payment methods for outpatient care facilities.

    PubMed

    Yuan, Beibei; He, Li; Meng, Qingyue; Jia, Liying

    2017-03-03

    Outpatient care facilities provide a variety of basic healthcare services to individuals who do not require hospitalisation or institutionalisation, and are usually the patient's first contact. The provision of outpatient care contributes to immediate and large gains in health status, and a large portion of total health expenditure goes to outpatient healthcare services. Payment method is one of the most important incentive methods applied by purchasers to guide the performance of outpatient care providers. To assess the impact of different payment methods on the performance of outpatient care facilities and to analyse the differences in impact of payment methods in different settings. We searched the Cochrane Central Register of Controlled Trials (CENTRAL), 2016, Issue 3, part of the Cochrane Library (searched 8 March 2016); MEDLINE, OvidSP (searched 8 March 2016); Embase, OvidSP (searched 24 April 2014); PubMed (NCBI) (searched 8 March 2016); Dissertations and Theses Database, ProQuest (searched 8 March 2016); Conference Proceedings Citation Index (ISI Web of Science) (searched 8 March 2016); IDEAS (searched 8 March 2016); EconLit, ProQuest (searched 8 March 2016); POPLINE, K4Health (searched 8 March 2016); China National Knowledge Infrastructure (searched 8 March 2016); Chinese Medicine Premier (searched 8 March 2016); OpenGrey (searched 8 March 2016); ClinicalTrials.gov, US National Institutes of Health (NIH) (searched 8 March 2016); World Health Organization (WHO) International Clinical Trials Registry Platform (ICTRP) (searched 8 March 2016); and the website of the World Bank (searched 8 March 2016).In addition, we searched the reference lists of included studies and carried out a citation search for the included studies via ISI Web of Science to find other potentially relevant studies. We also contacted authors of the main included studies regarding any further published or unpublished work. Randomised trials, non-randomised trials, controlled before-after studies, interrupted time series, and repeated measures studies that compared different payment methods for outpatient health facilities. We defined outpatient care facilities in this review as facilities that provide health services to individuals who do not require hospitalisation or institutionalisation. We only included methods used to transfer funds from the purchaser of healthcare services to health facilities (including groups of individual professionals). These include global budgets, line-item budgets, capitation, fee-for-service (fixed and unconstrained), pay for performance, and mixed payment. The primary outcomes were service provision outcomes, patient outcomes, healthcare provider outcomes, costs for providers, and any adverse effects. At least two review authors independently extracted data and assessed the risk of bias. We conducted a structured synthesis. We first categorised the comparisons and outcomes and then described the effects of different types of payment methods on different categories of outcomes. We used a fixed-effect model for meta-analysis within a study if a study included more than one indicator in the same category of outcomes. We used a random-effects model for meta-analysis across studies. If the data for meta-analysis were not available in some studies, we calculated the median and interquartile range. We reported the risk ratio (RR) for dichotomous outcomes and the relative change for continuous outcomes. 
We included 21 studies from Afghanistan, Burundi, China, Democratic Republic of Congo, Rwanda, Tanzania, the United Kingdom, and the United States, covering health facilities that provide primary health care and mental health care. There were three kinds of payment comparisons. 1) Pay for performance (P4P) combined with some existing payment method (capitation or different kinds of input-based payment) compared to the existing payment method. We included 18 studies in this comparison; however, we did not include five studies in the effects analysis due to high risk of bias. From the 13 studies, we found that the extra P4P incentives probably slightly improved the health professionals' use of some tests and treatments (adjusted RR median = 1.095, range 1.01 to 1.17; moderate-certainty evidence), and probably led to little or no difference in adherence to quality assurance criteria (adjusted percentage change median = -1.345%, range -8.49% to 5.8%; moderate-certainty evidence). We also found that P4P incentives may have led to little or no difference in patients' utilisation of health services (adjusted RR median = 1.01, range 0.96 to 1.15; low-certainty evidence) and may have led to little or no difference in the control of blood pressure or cholesterol (adjusted RR = 1.01, range 0.98 to 1.04; low-certainty evidence). 2) Capitation combined with P4P compared to fee-for-service (FFS). One study found that compared with FFS, a capitated budget combined with payment based on providers' performance on antibiotic prescriptions and patient satisfaction probably slightly reduced antibiotic prescriptions in primary health facilities (adjusted RR 0.84, 95% confidence interval 0.74 to 0.96; moderate-certainty evidence). 3) Capitation compared to FFS. Two studies compared capitation to FFS in mental health centres in the United States. Based on these studies, the effects of capitation compared to FFS on the utilisation and costs of services were uncertain (very low-certainty evidence). Our review found that if policymakers intend to apply P4P incentives to pay health facilities providing outpatient services, this intervention will probably lead to a slight improvement in health professionals' use of tests or treatments, particularly for chronic diseases. However, it may lead to little or no improvement in patients' utilisation of health services or health outcomes. When considering using P4P to improve the performance of health facilities, policymakers should carefully consider each component of their P4P design, including the choice of performance measures, the performance target, payment frequency, whether there will be additional funding, whether the payment level is sufficient to change the behaviours of health providers, and whether the payment to facilities will be allocated to individual professionals. Unfortunately, the studies included in this review did not help to inform those considerations. Well-designed comparisons of different payment methods for outpatient health facilities in low- and middle-income countries and studies directly comparing different designs (e.g. different payment levels) of the same payment method (e.g. P4P or FFS) are needed.

  3. Parametric Methods for Dynamic 11C-Phenytoin PET Studies.

    PubMed

    Mansor, Syahir; Yaqub, Maqsood; Boellaard, Ronald; Froklage, Femke E; de Vries, Anke; Bakker, Esther D M; Voskuyl, Rob A; Eriksson, Jonas; Schwarte, Lothar A; Verbeek, Joost; Windhorst, Albert D; Lammertsma, Adriaan A

    2017-03-01

    In this study, the performance of various methods for generating quantitative parametric images of dynamic 11C-phenytoin PET studies was evaluated. Methods: Double-baseline 60-min dynamic 11C-phenytoin PET studies, including online arterial sampling, were acquired for 6 healthy subjects. Parametric images were generated using Logan plot analysis, a basis function method, and spectral analysis. Parametric distribution volume (VT) and influx rate (K1) were compared with those obtained from nonlinear regression analysis of time-activity curves. In addition, global and regional test-retest (TRT) variability was determined for parametric K1 and VT values. Results: Biases in VT observed with all parametric methods were less than 5%. For K1, spectral analysis showed a negative bias of 16%. The mean TRT variabilities of VT and K1 were less than 10% for all methods. Shortening the scan duration to 45 min provided similar VT and K1 with comparable TRT performance compared with 60-min data. Conclusion: Among the various parametric methods tested, the basis function method provided parametric VT and K1 values with the least bias compared with nonlinear regression data and showed TRT variabilities lower than 5%, also for smaller volume-of-interest sizes (i.e., higher noise levels) and shorter scan duration. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
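
    A hedged Python sketch of Logan graphical analysis, one of the parametric methods named above: after a late time t*, the plot of normalized integrated tissue activity against normalized integrated plasma activity becomes linear, and its slope estimates the distribution volume VT. The tissue and plasma curves below are synthetic one-tissue-compartment data, not 11C-phenytoin measurements.

      # Logan graphical analysis on synthetic one-tissue-compartment data:
      # integral(C_T)/C_T(t) vs integral(C_p)/C_T(t) is fitted after t*, and the
      # slope estimates VT.  Curves and rate constants are illustrative assumptions.
      import numpy as np

      t = np.linspace(0.5, 60.0, 120)                                # minutes
      cp = 10.0 * np.exp(-0.15 * t) + 0.8 * np.exp(-0.01 * t)        # plasma input (synthetic)
      k1, k2 = 0.1, 0.05                                             # true VT = k1/k2 = 2
      ct = k1 * np.array([np.trapz(cp[:i+1] * np.exp(-k2 * (t[i] - t[:i+1])), t[:i+1])
                          for i in range(len(t))])

      int_ct = np.array([np.trapz(ct[:i+1], t[:i+1]) for i in range(len(t))])
      int_cp = np.array([np.trapz(cp[:i+1], t[:i+1]) for i in range(len(t))])

      late = t > 20.0                                                # linear portion only
      x = int_cp[late] / ct[late]
      y = int_ct[late] / ct[late]
      vt_logan = np.polyfit(x, y, 1)[0]
      print(f"Logan VT estimate: {vt_logan:.2f} (true VT = {k1 / k2:.2f})")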

  4. Antenatal breastfeeding education for increasing breastfeeding duration

    PubMed Central

    Lumbiganon, Pisake; Martis, Ruth; Laopaiboon, Malinee; Festin, Mario R; Ho, Jacqueline J; Hakimi, Mohammad

    2014-01-01

    Background Breastfeeding (BF) is well recognised as the best food for infants. The impact of antenatal BF education on the duration of BF has not been evaluated. Objectives To evaluate the effectiveness of antenatal BF education for increasing BF initiation and duration. Search methods We searched the Cochrane Pregnancy and Childbirth Group’s Trials Register (21 April 2010), CENTRAL (The Cochrane Library 2010, Issue 2), MEDLINE (1966 to April 2010) and SCOPUS (January 1985 to April 2010). We contacted experts and searched reference lists of retrieved articles. We updated the search of the Pregnancy and Childbirth Group’s Trials Register on 28 September 2011 and added the results to the awaiting classification section of the review. Selection criteria All identified published, unpublished and ongoing randomised controlled trials (RCTs) assessing the effect of formal antenatal BF education or comparing two different methods of formal antenatal BF education, on duration of BF. We excluded RCTs that also included intrapartum or postpartum BF education. Data collection and analysis We assessed all potential studies identified as a result of the search strategy. Two review authors extracted data from each included study using the agreed form and assessed risk of bias. We resolved discrepancies through discussion. Main results We included 17 studies with 7131 women in the review and 14 studies involving 6932 women contributed data to the analyses. We did not do any meta-analysis because there was only one study for each comparison. Five studies compared a single method of BF education with routine care. Peer counselling significantly increased BF initiation. Three studies compared one form of BF education versus another. No intervention was significantly more effective than another intervention in increasing initiation or duration of BF. Seven studies compared multiple methods versus a single method of BF education. Combined BF educational interventions were not significantly better than a single intervention in initiating or increasing BF duration. However, in one trial a combined BF education significantly reduced nipple pain and trauma. One study compared different combinations of interventions. There was a marginally significant increase in exclusive BF at six months in women receiving a booklet plus video plus lactation consultation (LC) compared with the booklet plus video only. Two studies compared multiple methods of BF education versus routine care. The combination of BF booklet plus video plus LC was significantly better than routine care for exclusive BF at three months. Authors’ conclusions Because there were significant methodological limitations and the observed effect sizes were small, it is not appropriate to recommend any antenatal BF education. There is an urgent need to conduct RCTs with adequate power to evaluate the effectiveness of antenatal BF education. PMID:22071830

  5. Identification of discriminant proteins through antibody profiling, methods and apparatus for identifying an individual

    DOEpatents

    Apel, William A.; Thompson, Vicki S; Lacey, Jeffrey A.; Gentillon, Cynthia A.

    2016-08-09

    A method for determining a plurality of proteins for discriminating and positively identifying an individual based from a biological sample. The method may include profiling a biological sample from a plurality of individuals against a protein array including a plurality of proteins. The protein array may include proteins attached to a support in a preselected pattern such that locations of the proteins are known. The biological sample may be contacted with the protein array such that a portion of antibodies in the biological sample reacts with and binds to the proteins forming immune complexes. A statistical analysis method, such as discriminant analysis, may be performed to determine discriminating proteins for distinguishing individuals. Proteins of interest may be used to form a protein array. Such a protein array may be used, for example, to compare a forensic sample from an unknown source with a sample from a known source.

  6. Identification of discriminant proteins through antibody profiling, methods and apparatus for identifying an individual

    DOEpatents

    Thompson, Vicki S; Lacey, Jeffrey A; Gentillon, Cynthia A; Apel, William A

    2015-03-03

    A method for determining a plurality of proteins for discriminating and positively identifying an individual based from a biological sample. The method may include profiling a biological sample from a plurality of individuals against a protein array including a plurality of proteins. The protein array may include proteins attached to a support in a preselected pattern such that locations of the proteins are known. The biological sample may be contacted with the protein array such that a portion of antibodies in the biological sample reacts with and binds to the proteins forming immune complexes. A statistical analysis method, such as discriminant analysis, may be performed to determine discriminating proteins for distinguishing individuals. Proteins of interest may be used to form a protein array. Such a protein array may be used, for example, to compare a forensic sample from an unknown source with a sample from a known source.
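
    A hedged scikit-learn sketch of the statistical step described in the two patent records above: select the protein features whose antibody-binding intensities best discriminate among individuals, fit a discriminant model, and classify an unknown sample. The intensity matrix is simulated, and the feature count and selection rule are illustrative choices, not the patented procedure.

      # Feature selection plus linear discriminant analysis on simulated
      # antibody-binding intensities; individuals, replicates and the number of
      # retained proteins are illustrative assumptions.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(1)
      n_individuals, n_replicates, n_proteins = 5, 8, 200
      signatures = rng.normal(size=(n_individuals, n_proteins))
      X = np.vstack([sig + 0.3 * rng.normal(size=(n_replicates, n_proteins)) for sig in signatures])
      y = np.repeat(np.arange(n_individuals), n_replicates)

      # Keep the 20 most discriminating proteins, then classify with LDA.
      clf = make_pipeline(SelectKBest(f_classif, k=20), LinearDiscriminantAnalysis())
      clf.fit(X, y)

      unknown = signatures[3] + 0.3 * rng.normal(size=n_proteins)   # "forensic" sample
      print("predicted individual:", clf.predict(unknown.reshape(1, -1))[0])  # expected: 3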

  7. SU-E-J-221: A Novel Expansion Method for MRI Based Target Delineation in Prostate Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruiz, B; East Carolina University, Greenville, NC; Feng, Y

    Purpose: To compare a novel bladder/rectum carveout expansion method on MRI delineated prostate to standard CT and expansion based methods for maintaining prostate coverage while providing superior bladder and rectal sparing. Methods: Ten prostate cases were planned to include four trials: MRI vs CT delineated prostate/proximal seminal vesicles, and each image modality compared to both standard expansions (8mm 3D expansion and 5mm posterior, i.e. ∼8mm) and carveout method expansions (5mm 3D expansion, 4mm posterior for GTV-CTV excluding expansion into bladder/rectum followed by additional 5mm 3D expansion to PTV, i.e. ∼1cm). All trials were planned to total dose 7920 cGy via IMRT. Evaluation and comparison was made using the following criteria: QUANTEC constraints for bladder/rectum including analysis of low dose regions, changes in PTV volume, total control points, and maximum hot spot. Results: ∼8mm MRI expansion consistently produced the most optimal plan with lowest total control points and best bladder/rectum sparing. However, this scheme had the smallest prostate (average 22.9% reduction) and subsequent PTV volume, consistent with prior literature. ∼1cm MRI had an average PTV volume comparable to ∼8mm CT at 3.79% difference. Bladder QUANTEC constraints were on average less for the ∼1cm MRI as compared to the ∼8mm CT and observed as statistically significant with 2.64% reduction in V65. Rectal constraints appeared to follow the same trend. Case-by-case analysis showed variation in rectal V30 with MRI delineated prostate being most favorable regardless of expansion type. ∼1cm MRI and ∼8mm CT had comparable plan quality. Conclusion: MRI delineated prostate with standard expansions had the smallest PTV leading to margins that may be too tight. Bladder/rectum carveout expansion method on MRI delineated prostate was found to be superior to standard CT based methods in terms of bladder and rectal sparing while maintaining prostate coverage. Continued investigation is warranted for further validation.

  8. A method for estimating vertical distribution of the SAGE II opaque cloud frequency

    NASA Technical Reports Server (NTRS)

    Wang, Pi-Huan; Mccormick, M. Patrick; Minnis, Patrick; Kent, Geoffrey S.; Yue, Glenn K.; Skeens, Kristi M.

    1995-01-01

    A method is developed to infer the vertical distribution of the occurrence frequency of clouds that are opaque to the Stratospheric Aerosol and Gas Experiment (SAGE) II instrument. An application of the method to the 1986 SAGE II observations is included in this paper. The 1986 SAGE II results are compared with the 1952-1981 cloud climatology of Warren et al. (1986, 1988).

  9. Evaluation of equipment and methods to map lost circulation zones in geothermal wells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, W.J.; Leon, P.A.; Pittard, G.

    A study and evaluation of methods to locate, characterize, and quantify lost circulation zones are described. Twenty-five methods of mapping and quantifying lost circulation zones were evaluated, including electrical, acoustical, mechanical, radioactive, and optical systems. Each tool studied is described. The structured, numerical evaluation plan used as the basis for comparing the 25 tools is presented, along with the resulting ranking of the tools.

  10. Comparison of a gel microcolumn assay with the conventional tube test for red blood cell alloantibody titration.

    PubMed

    Finck, Rachel; Lui-Deguzman, Carrie; Teng, Shih-Mao; Davis, Rebecca; Yuan, Shan

    2013-04-01

    Titration is a semiquantitative method used to estimate red blood cell (RBC) alloantibody reactivity. The conventional tube test (CTT) technique is the traditional method for performing titration studies. The gel microcolumn assay (GMA) is also a sensitive method to detect RBC alloantibodies. The aim of this study was to compare a GMA with the CTT technique in the performance of Rh and K alloantibody titration. Patient serum samples that contained an RBC alloantibody with a singular specificity were identified by routine blood bank workflow. Parallel titration studies were performed on these samples by both the CTT method and a GMA (ID-Micro Typing System anti-IgG gel card, Micro Typing Systems, Inc., an Ortho-Clinical Diagnostics Company). Forty-eight samples were included: 11 anti-D, 5 anti-c, 13 anti-E, 1 anti-C, 3 anti-e, and 15 anti-K. Overall, the two methods generated identical results in 21 of 48 samples. For 42 samples (87.5%), the two methods generated results that were within one serial dilution, and for the remaining six samples, results were within two dilutions. GMA systems may perform comparably to the CTT in titrating alloantibodies to Rh and Kell antigens. © 2012 American Association of Blood Banks.
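    As a quick illustration of how agreement is expressed above, the difference between two titers can be converted into serial (two-fold) dilution steps; the titer values below are hypothetical, not taken from the study.

        import math

        titer_ctt, titer_gma = 32, 64   # hypothetical reciprocal titers from the two methods
        steps = abs(math.log2(titer_gma) - math.log2(titer_ctt))
        print(f"difference = {steps:.0f} dilution step(s)")   # here: within one serial dilution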

  11. Comparing four non-invasive methods to determine the ventilatory anaerobic threshold during cardiopulmonary exercise testing in children with congenital heart or lung disease.

    PubMed

    Visschers, Naomi C A; Hulzebos, Erik H; van Brussel, Marco; Takken, Tim

    2015-11-01

    The ventilatory anaerobic threshold (VAT) is an important measure for assessing aerobic fitness in patients with cardiopulmonary disease. Several methods exist to determine the VAT; however, there is no consensus on which of these methods is the most accurate. To compare four different non-invasive methods (the V-slope, ventilatory equivalent [VentEq], RER = 1, and PET-O2 methods) for the determination of the VAT via respiratory gas exchange analysis during a cardiopulmonary exercise test (CPET). A secondary objective is to determine the interobserver reliability of the VAT. CPET data of 30 children diagnosed with either cystic fibrosis (CF; N = 15) or with a surgically corrected dextro-transposition of the great arteries (asoTGA; N = 15) were included. No significant differences were found between conditions or among testers. The RER = 1 method differed the most from the other methods, showing significantly higher results in all six variables. The PET-O2 method differed significantly on five of six and four of six exercise variables with the V-slope method and the VentEq method, respectively. The V-slope and the VentEq method differed significantly on one of six exercise variables. Ten of thirteen ICCs that were >0.80 had a 95% CI > 0.70. The RER = 1 method and the V-slope method had the highest number of significant ICCs and 95% CIs. The V-slope method, the ventilatory equivalent method and the PET-O2 method are comparable and reliable methods to determine the VAT during CPET in children with CF or asoTGA. © 2014 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.
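    A rough sketch of one of the compared criteria, the RER = 1 method, under the simple assumption that the VAT is taken as the first point in a ramp test where VCO2/VO2 reaches 1; the gas-exchange values are simulated, not from the study.

        import numpy as np

        vo2 = np.linspace(0.5, 3.0, 200)                            # L/min along a ramp test
        vco2 = 0.85 * vo2 + 0.9 * np.maximum(vo2 - 1.8, 0.0) ** 2   # rises faster above ~1.8 L/min
        rer = vco2 / vo2

        idx = np.argmax(rer >= 1.0)                                 # first index where RER reaches 1
        print(f"VAT by the RER = 1 criterion at VO2 ~ {vo2[idx]:.2f} L/min")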

  12. How does tele-learning compare with other forms of education delivery? A systematic review of tele-learning educational outcomes for health professionals.

    PubMed

    Tomlinson, Jo; Shaw, Tim; Munro, Ana; Johnson, Ros; Madden, D Lynne; Phillips, Rosemary; McGregor, Deborah

    2013-11-01

    Telecommunication technologies, including audio and videoconferencing facilities, afford geographically dispersed health professionals the opportunity to connect and collaborate with others. Recognised for enabling tele-consultations and tele-collaborations between teams of health care professionals and their patients, these technologies are also well suited to the delivery of distance learning programs, known as tele-learning. To determine whether tele-learning delivery methods achieve equivalent learning outcomes when compared with traditional face-to-face education delivery methods. A systematic literature review was commissioned by the NSW Ministry of Health to identify results relevant to programs applying tele-learning delivery methods in the provision of education to health professionals. The review found few studies that rigorously compared tele-learning with traditional formats. There was some evidence, however, to support the premise that tele-learning models achieve comparable learning outcomes and that participants are generally satisfied with and accepting of this delivery method. The review illustrated that tele-learning technologies not only enable distance learning opportunities, but achieve comparable learning outcomes to traditional face-to-face models. More rigorous evidence is required to strengthen these findings and should be the focus of future tele-learning research.

  13. Cost effectiveness of on- and off-field conservation practices designed to reduce nitrogen in downstream water

    USDA-ARS?s Scientific Manuscript database

    The objective of this analysis is to estimate and compare the cost-effectiveness of on- and off-field approaches to reducing nitrogen loadings. On-field practices include improving the timing, rate, and method of nitrogen application. Off-field practices include restoring wetlands and establishing v...

  14. Hormonal and intrauterine methods for contraception for women aged 25 years and younger.

    PubMed

    Krashin, Jamie; Tang, Jennifer H; Mody, Sheila; Lopez, Laureen M

    2015-08-17

    Women between the ages of 15 and 24 years have high rates of unintended pregnancy; over half of women in this age group want to avoid pregnancy. However, women under age 25 years have higher typical contraceptive failure rates within the first 12 months of use than older women. High discontinuation rates may also be a problem in this population. Concern that adolescents and young women will not find hormonal or intrauterine contraceptives acceptable or effective might deter healthcare providers from recommending these contraceptive methods. To compare the contraceptive failure (pregnancy) rates and to examine the continuation rates for hormonal and intrauterine contraception among young women aged 25 years and younger. We searched until 4 August 2015 for randomized controlled trials (RCTs) that compared hormonal or intrauterine methods of contraception in women aged 25 years and younger. Computerized databases included the Cochrane Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE, POPLINE, CINAHL, and LILACS. We also searched for current trials via ClinicalTrials.gov and the International Clinical Trials Registry Platform (ICTRP). We considered RCTs in any language that reported the contraceptive failure rates for hormonal or intrauterine contraceptive methods, when compared with another contraceptive method, for women aged 25 years and younger. The other contraceptive method could have been another intrauterine contraceptive, another hormonal contraceptive or different dose of the same method, or a non-hormonal contraceptive. Treatment duration must have been at least three months. Eligible trials had to include the primary outcome of contraceptive failure rate (pregnancy). The secondary outcome was contraceptive continuation rate. One author conducted the primary data extraction and entered the information into Review Manager. Another author performed an independent data extraction and verified the initial entry. For dichotomous outcomes, we computed the Mantel-Haenszel odds ratio (OR) with 95% confidence interval (CI). Because of disparate interventions and outcome measures, we did not conduct a meta-analysis. Five trials met the inclusion criteria. The studies included a total of 1503 women, with a mean of 301 participants. The trials compared the following contraceptives: combined oral contraceptive (COC) versus transdermal contraceptive patch, vaginal contraceptive ring, or levonorgestrel intrauterine system 20 µg/day (LNG-IUS 20); LNG-IUS 12 µg/day (LNG-IUS 12) versus LNG-IUS 16 µg/day (LNG-IUS 16); and LNG-IUS 20 versus the copper T380A intrauterine device (IUD). In the trials comparing two different types of methods, the study arms did not differ significantly for contraceptive efficacy or continuation. The sample sizes were small for two of those studies. The only significant outcome was that a COC group had a higher proportion of women who discontinued for 'other personal reasons' compared with the group assigned to the LNG-IUS 20 (OR 0.27, 95% CI 0.09 to 0.85), which may have little clinical relevance. The trial comparing LNG-IUS 12 versus LNG-IUS 16 showed similar efficacy over one and three years. In three trials that examined different LNG-IUS, continuation was at least 75% at 6 to 36 months. We considered the overall quality of evidence to be moderate to low. Limitations were due to trial design or limited reporting. Different doses in the LNG-IUS did not appear to influence efficacy over three years. In another study, continuation of the LNG-IUS appeared at least as high as that for the COC. The current evidence was insufficient to compare efficacy and continuation rates for hormonal and intrauterine contraceptive methods in women aged 25 years and younger.

  15. An evaluation of unsupervised and supervised learning algorithms for clustering landscape types in the United States

    USGS Publications Warehouse

    Wendel, Jochen; Buttenfield, Barbara P.; Stanislawski, Larry V.

    2016-01-01

    Knowledge of landscape type can inform cartographic generalization of hydrographic features, because landscape characteristics provide an important geographic context that affects variation in channel geometry, flow pattern, and network configuration. Landscape types are characterized by expansive spatial gradients, lacking abrupt changes between adjacent classes; and as having a limited number of outliers that might confound classification. The US Geological Survey (USGS) is exploring methods to automate generalization of features in the National Hydrography Data set (NHD), to associate specific sequences of processing operations and parameters with specific landscape characteristics, thus obviating manual selection of a unique processing strategy for every NHD watershed unit. A chronology of methods to delineate physiographic regions for the United States is described, including a recent maximum likelihood classification based on seven input variables. This research compares unsupervised and supervised algorithms applied to these seven input variables, to evaluate and possibly refine the recent classification. Evaluation metrics for unsupervised methods include the Davies–Bouldin index, the Silhouette index, and the Dunn index as well as quantization and topographic error metrics. Cross validation and misclassification rate analysis are used to evaluate supervised classification methods. The paper reports the comparative analysis and its impact on the selection of landscape regions. The compared solutions show problems in areas of high landscape diversity. There is some indication that additional input variables, additional classes, or more sophisticated methods can refine the existing classification.
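    Two of the cluster-validity indices named above are available in scikit-learn; the sketch below evaluates them on placeholder data standing in for the seven input variables, and is illustrative only.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.metrics import silhouette_score, davies_bouldin_score

        rng = np.random.default_rng(42)
        X = rng.normal(size=(500, 7))    # placeholder for the seven physiographic input variables

        labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

        print("Silhouette index:     ", silhouette_score(X, labels))      # higher is better
        print("Davies-Bouldin index: ", davies_bouldin_score(X, labels))  # lower is better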

  16. Evidence-based risk communication: a systematic review.

    PubMed

    Zipkin, Daniella A; Umscheid, Craig A; Keating, Nancy L; Allen, Elizabeth; Aung, KoKo; Beyth, Rebecca; Kaatz, Scott; Mann, Devin M; Sussman, Jeremy B; Korenstein, Deborah; Schardt, Connie; Nagi, Avishek; Sloane, Richard; Feldstein, David A

    2014-08-19

    Effective communication of risks and benefits to patients is critical for shared decision making. To review the comparative effectiveness of methods of communicating probabilistic information to patients that maximize their cognitive and behavioral outcomes. PubMed (1966 to March 2014) and CINAHL, EMBASE, and the Cochrane Central Register of Controlled Trials (1966 to December 2011) using several keywords and structured terms. Prospective or cross-sectional studies that recruited patients or healthy volunteers and compared any method of communicating probabilistic information with another method. Two independent reviewers extracted study characteristics and assessed risk of bias. Eighty-four articles, representing 91 unique studies, evaluated various methods of numerical and visual risk display across several risk scenarios and with diverse outcome measures. Studies showed that visual aids (icon arrays and bar graphs) improved patients' understanding and satisfaction. Presentations including absolute risk reductions were better than those including relative risk reductions for maximizing accuracy and seemed less likely than presentations with relative risk reductions to influence decisions to accept therapy. The presentation of numbers needed to treat reduced understanding. Comparative effects of presentations of frequencies (such as 1 in 5) versus event rates (percentages, such as 20%) were inconclusive. Most studies were small and highly variable in terms of setting, context, and methods of administering interventions. Visual aids and absolute risk formats can improve patients' understanding of probabilistic information, whereas numbers needed to treat can lessen their understanding. Due to study heterogeneity, the superiority of any single method for conveying probabilistic information is not established, but there are several good options to help clinicians communicate with patients. None.
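    A worked example of the risk formats compared above, using hypothetical event rates: absolute risk reduction (ARR), relative risk reduction (RRR), and number needed to treat (NNT) describe the same effect but can be perceived very differently.

        baseline_risk = 0.20   # hypothetical event rate without treatment
        treated_risk = 0.15    # hypothetical event rate with treatment

        arr = baseline_risk - treated_risk   # absolute risk reduction -> 0.05
        rrr = arr / baseline_risk            # relative risk reduction -> 0.25
        nnt = 1 / arr                        # number needed to treat  -> 20

        print(f"ARR = {arr:.0%}, RRR = {rrr:.0%}, NNT = {nnt:.0f}")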

  17. Effectiveness of feature and classifier algorithms in character recognition systems

    NASA Astrophysics Data System (ADS)

    Wilson, Charles L.

    1993-04-01

    At the first Census Optical Character Recognition Systems Conference, NIST generated accuracy data for a large number of character recognition systems. Most systems were tested on the recognition of isolated digits and upper and lower case alphabetic characters. The recognition experiments were performed on sample sizes of 58,000 digits and 12,000 upper and lower case alphabetic characters. The algorithms used by the 26 conference participants included rule-based methods, image-based methods, statistical methods, and neural networks. The neural network methods included Multi-Layer Perceptrons, Learned Vector Quantization, Neocognitrons, and cascaded neural networks. In this paper, 11 different systems are compared using correlations between the answers of different systems, comparing the decrease in error rate as a function of confidence of recognition, and comparing the writer dependence of recognition. This comparison shows that methods that used different algorithms for feature extraction and recognition performed with very high levels of correlation. This is true for neural network systems, hybrid systems, and statistically based systems, and leads to the conclusion that neural networks have not yet demonstrated a clear superiority to more conventional statistical methods. Comparison of these results with the models of Vapnik (for estimation problems), MacKay (for Bayesian statistical models), Moody (for effective parameterization), and Boltzmann models (for information content) demonstrates that as the limits of training data variance are approached, all classifier systems have similar statistical properties. The limiting condition can only be approached for sufficiently rich feature sets because the accuracy limit is controlled by the available information content of the training set, which must pass through the feature extraction process prior to classification.

  18. Review of Artificial Abrasion Test Methods for PV Module Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David C.; Muller, Matt T.; Simpson, Lin J.

    This review is intended to identify the method or methods--and the basic details of those methods--that might be used to develop an artificial abrasion test. Methods used in the PV literature were compared with their closest implementation in existing standards. Also, meetings of the International PV Quality Assurance Task Force Task Group 12-3 (TG12-3, which is concerned with coated glass) were used to identify established test methods. Feedback from the group, which included many of the authors from the PV literature, included insights not explored within the literature itself. The combined experience and examples from the literature are intended to provide an assessment of the present industry practices and an informed path forward. Recommendations toward artificial abrasion test methods are then identified based on the experiences in the literature and feedback from the PV community. The review here is strictly focused on abrasion. Assessment methods, including optical performance (e.g., transmittance or reflectance), surface energy, and verification of chemical composition were not examined. Methods of artificially soiling PV modules or other specimens were not examined. The weathering of artificially or naturally soiled specimens (which may ultimately include combined temperature and humidity, thermal cycling, and ultraviolet light) was also not examined. A sense of the purpose or application of an abrasion test method within the PV industry should, however, be evident from the literature.

  19. Determination and discrimination of biodiesel fuels by gas chromatographic and chemometric methods

    NASA Astrophysics Data System (ADS)

    Milina, R.; Mustafa, Z.; Bojilov, D.; Dagnon, S.; Moskovkina, M.

    2016-03-01

    A pattern recognition method (PRM) was applied to gas chromatographic (GC) data for the fatty acid methyl ester (FAME) composition of commercial and laboratory-synthesized biodiesel fuels from vegetable oils including sunflower, rapeseed, corn, and palm oils. Two GC quantitative methods for calculating individual FAMEs were compared: area percent and internal standard. Both methods were applied to the analysis of two certified reference materials. The statistical processing of the obtained results demonstrates the accuracy and precision of the two methods and allows them to be compared. For further chemometric investigations of biodiesel fuels by their FAME profiles, either of those methods can be used. PRM results for the FAME profiles of samples from different vegetable oils show successful recognition of biodiesels according to the feedstock. The information obtained can be used for selecting feedstock to produce biodiesels with certain properties, for assessing their interchangeability, and for addressing fuel spillage and remediation in the environment.

  20. Hypothesis Testing Using Factor Score Regression

    PubMed Central

    Devlieger, Ines; Mayer, Axel; Rosseel, Yves

    2015-01-01

    In this article, an overview is given of four methods to perform factor score regression (FSR), namely regression FSR, Bartlett FSR, the bias avoiding method of Skrondal and Laake, and the bias correcting method of Croon. The bias correcting method is extended to include a reliable standard error. The four methods are compared with each other and with structural equation modeling (SEM) by using analytic calculations and two Monte Carlo simulation studies to examine their finite sample characteristics. Several performance criteria are used, such as the bias using the unstandardized and standardized parameterization, efficiency, mean square error, standard error bias, type I error rate, and power. The results show that the bias correcting method, with the newly developed standard error, is the only suitable alternative for SEM. While it has a higher standard error bias than SEM, it has a comparable bias, efficiency, mean square error, power, and type I error rate. PMID:29795886
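    A minimal two-step sketch of factor score regression in general, not the authors' implementation and without the Croon correction: factor scores are estimated first and then used as regressors, which is the step whose bias the article studies. The one-factor model and data below are simulated assumptions.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(1)
        eta = rng.normal(size=(300, 1))                                           # latent predictor
        X = eta @ np.array([[0.8, 0.7, 0.6]]) + 0.5 * rng.normal(size=(300, 3))   # three indicators
        y = 0.4 * eta[:, 0] + rng.normal(scale=0.5, size=300)                     # outcome

        scores = FactorAnalysis(n_components=1, random_state=0).fit_transform(X)  # step 1: factor scores
        slope = LinearRegression().fit(scores, y).coef_[0]                        # step 2: regression
        print("FSR slope estimate (true structural slope in the simulation is 0.4):", round(slope, 3))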

  1. Comparative In Vitro Efficacy of Doripenem and Imipenem Against Multi-Drug Resistant Pseudomonas aeruginosa.

    PubMed

    Wali, Nadia; Mirza, Irfan Ali

    2016-04-01

    To compare the in vitro efficacy of doripenem and imipenem against multi-drug resistant (MDR) Pseudomonas aeruginosa from various clinical specimens. Descriptive cross-sectional study. Department of Microbiology, Armed Forces Institute of Pathology, Rawalpindi, from November 2012 to November 2013. MDR Pseudomonas aeruginosa isolates from various clinical samples were included in the study. Susceptibility testing of Pseudomonas aeruginosa against doripenem and imipenem was performed by the E-test strip and agar dilution methods. The results were interpreted as recommended by Clinical and Laboratory Standards Institute (CLSI) guidelines. The largest number of Pseudomonas aeruginosa isolates came from pure pus and pus swabs. Doripenem was found to be more effective in vitro than imipenem against MDR Pseudomonas aeruginosa with both the E-test strip and agar dilution methods. Overall, p-values of 0.014 and 0.037 were observed when susceptibility patterns of doripenem and imipenem were evaluated with the E-test strip and agar dilution methods. The in vitro efficacy of doripenem was found to be better than that of imipenem against MDR Pseudomonas aeruginosa when tested by both the E-test and agar dilution methods.

  2. Comparison of transect sampling and object-oriented image classification methods of urbanizing catchments

    NASA Astrophysics Data System (ADS)

    Yang, Y.; Tenenbaum, D. E.

    2009-12-01

    The process of urbanization has major effects on both human and natural systems. In order to monitor these changes and better understand how urban ecological systems work, urban spatial structure and its variation first need to be quantified at a fine scale. Because land-use and land-cover (LULC) in urbanizing areas is highly heterogeneous, the classification of urbanizing environments is among the most challenging tasks in remote sensing. Although a pixel-based method is a common way to perform classification, the results are not good enough for many research objectives that require more accurate classification data at fine scales. Transect sampling and object-oriented classification methods are more appropriate for urbanizing areas. Tenenbaum used a transect sampling method, implemented with a computer-based facility within a widely available commercial GIS, in the Glyndon Catchment and the Upper Baismans Run Catchment, Baltimore, Maryland. It was a two-tiered classification system, including a primary level (7 classes) and a secondary level (37 categories). Statistical information on LULC was collected. W. Zhou applied an object-oriented method at the parcel level in Gwynn’s Falls Watershed, which includes the two previously mentioned catchments, and six classes were extracted. The two urbanizing catchments are located in greater Baltimore, Maryland, and drain into Chesapeake Bay. In this research, the two methods are compared for six classes (woody, herbaceous, water, ground, pavement, and structure). The comparison uses the segments from the transect method to extract LULC information from the results of the object-oriented method. Classification results were compared in order to evaluate the difference between the two methods. The overall proportions of LULC classes from the two studies show that structures are overestimated in the object-oriented method. For the other five classes, the results from the two methods are similar, except for a difference in the proportions of the woody class. The segment-to-segment comparison shows that the resolution of the light detection and ranging (LIDAR) data used in the object-oriented method does affect the accuracy of the classification. Shadows of trees and structures remain a significant problem in the object-oriented method. For classes that make up a small proportion of the catchments, such as water, neither method was capable of detecting them.

  3. Transforming Elementary Science Teacher Education by Bridging Formal and Informal Science Education in an Innovative Science Methods Course

    NASA Astrophysics Data System (ADS)

    Riedinger, Kelly; Marbach-Ad, Gili; Randy McGinnis, J.; Hestness, Emily; Pease, Rebecca

    2011-02-01

    We investigated curricular and pedagogical innovations in an undergraduate science methods course for elementary education majors at the University of Maryland. The goals of the innovative elementary science methods course included improving students' attitudes toward and views of science and science teaching, modeling innovative science teaching methods, and encouraging students to continue in teacher education. We redesigned the elementary science methods course to include aspects of informal science education. The informal science education course features included informal science educator guest speakers, a live animal demonstration, and a virtual field trip. We compared data from a treatment course (n = 72) and a comparison course (n = 26). Data collection included researchers' observations, instructors' reflections, and teacher candidates' feedback. Teacher candidate feedback involved interviews and results on a reliable and valid Attitudes and Beliefs about the Nature of and the Teaching of Science instrument. We used complementary methods to analyze the data collected. A key finding of the study was that while benefits were found in both types of courses, the difference in results underscores the need to identify the primary purpose for innovation as a vital component of consideration.

  4. Application of multiattribute decision-making methods for the determination of relative significance factor of impact categories.

    PubMed

    Noh, Jaesung; Lee, Kun Mo

    2003-05-01

    A relative significance factor (f(i)) of an impact category is the external weight of the impact category. The objective of this study is to propose a systematic and easy-to-use method for the determination of f(i). Multiattribute decision-making (MADM) methods, including the analytical hierarchy process (AHP), the rank-order centroid method, and the fuzzy method, were evaluated for this purpose. The results and practical aspects of using the three methods are compared. Each method shows the same trend, with minor differences in the value of f(i). Thus, all three methods can be applied to the determination of f(i). The rank-order centroid method reduces the number of pairwise comparisons by placing the alternatives in order, although it has an inherent weakness relative to the fuzzy method in expressing the degree of vagueness associated with assigning weights to criteria and alternatives. The rank-order centroid method is considered a practical method for the determination of f(i) because it is simpler to use than the AHP and the fuzzy method.
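    The rank-order centroid weights mentioned above are straightforward to compute: for n ranked criteria, the i-th weight is w_i = (1/n) * sum_{k=i}^{n} 1/k. The short sketch below simply evaluates that standard formula and is not taken from the paper.

        def roc_weights(n):
            """Rank-order centroid weights for n criteria, most important first."""
            return [sum(1.0 / k for k in range(i, n + 1)) / n for i in range(1, n + 1)]

        print([round(w, 3) for w in roc_weights(4)])   # four impact categories -> [0.521, 0.271, 0.146, 0.062]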

  5. Comparative evaluation of two quantitative test methods for determining the efficacy of liquid sporicides and sterilants on a hard surface: a precollaborative study.

    PubMed

    Tomasino, Stephen F; Hamilton, Martin A

    2007-01-01

    Two quantitative carrier-based test methods for determining the efficacy of liquid sporicides and sterilants on a hard surface, the Standard Quantitative Carrier Test Method-ASTM E 2111-00 and an adaptation of a quantitative micro-method as reported by Sagripanti and Bonifacino, were compared in this study. The methods were selected based on their desirable characteristics (e.g., well-developed protocol, previous use with spores, fully quantitative, and use of readily available equipment) for testing liquid sporicides and sterilants on a hard surface. In this paper, the Sagripanti-Bonifacino procedure is referred to as the Three Step Method (TSM). AOAC Official Method 966.04 was included in this study as a reference method. Three laboratories participated in the evaluation. Three chemical treatments were tested: (1) 3000 ppm sodium hypochlorite with pH adjusted to 7.0, (2) a hydrogen peroxide/peroxyacetic acid product, and (3) 3000 ppm sodium hypochlorite with pH unadjusted (pH of approximately 10.0). A fourth treatment, 6000 ppm sodium hypochlorite solution with pH adjusted to 7.0, was included only for Method 966.04 as a positive control (high level of efficacy). The contact time was 10 min for all chemical treatments except the 6000 ppm sodium hypochlorite treatment which was tested at 30 min. Each chemical treatment was tested 3 times using each of the methods. Only 2 of the laboratories performed the AOAC method. Method performance was assessed by the within-laboratory variance, between-laboratory variance, and total variance associated with the log reduction (LR) estimates generated by each quantitative method. The quantitative methods performed similarly, and the LR values generated by each method were not statistically different for the 3 treatments evaluated. Based on feedback from the participating laboratories, compared to the TSM, ASTM E 2111-00 was more resource demanding and required more set-up time. The logistical and resource concerns identified for ASTM E 2111-00 were largely associated with the filtration process and counting bacterial colonies on filters. Thus, the TSM was determined to be the most suitable method.
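    For orientation, the log reduction (LR) reported by such quantitative carrier methods is the log10 drop in viable spores relative to an untreated control; the counts below are invented for illustration.

        import math

        control_cfu = 1.0e6   # hypothetical spores recovered from an untreated control carrier
        treated_cfu = 2.0e2   # hypothetical spores recovered after a 10-min chemical treatment

        log_reduction = math.log10(control_cfu) - math.log10(treated_cfu)
        print(f"LR = {log_reduction:.2f}")   # 3.70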

  6. Validation sampling can reduce bias in health care database studies: an illustration using influenza vaccination effectiveness.

    PubMed

    Nelson, Jennifer Clark; Marsh, Tracey; Lumley, Thomas; Larson, Eric B; Jackson, Lisa A; Jackson, Michael L

    2013-08-01

    Estimates of treatment effectiveness in epidemiologic studies using large observational health care databases may be biased owing to inaccurate or incomplete information on important confounders. Study methods that collect and incorporate more comprehensive confounder data on a validation cohort may reduce confounding bias. We applied two such methods, namely imputation and reweighting, to Group Health administrative data (full sample) supplemented by more detailed confounder data from the Adult Changes in Thought study (validation sample). We used influenza vaccination effectiveness (with an unexposed comparator group) as an example and evaluated each method's ability to reduce bias using the control time period before influenza circulation. Both methods reduced, but did not completely eliminate, the bias compared with traditional effectiveness estimates that do not use the validation sample confounders. Although these results support the use of validation sampling methods to improve the accuracy of comparative effectiveness findings from health care database studies, they also illustrate that the success of such methods depends on many factors, including the ability to measure important confounders in a representative and large enough validation sample, the comparability of the full sample and validation sample, and the accuracy with which the data can be imputed or reweighted using the additional validation sample information. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. Comparison of Creativity and Self-Esteem in Students with Employed and Household Mothers

    ERIC Educational Resources Information Center

    Safara, Maryam; Alkaran, Zeinab Blori; Salmabadi, Mojtaba; Rostami, Najmieh

    2017-01-01

    Objective: The present study was carried out to compare creativity and self-esteem in university students with employed and household mothers in the 2014-2015 academic year. Method: This research is a descriptive study of the causal-comparative type. The statistical population includes all undergraduate students of Azad universities of…

  8. Comparing Methods for Assessing Receptive Language Skills in Minimally Verbal Children and Adolescents with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Plesa Skwerer, Daniela; Jordan, Samantha E.; Brukilacchio, Briana H.; Tager-Flusberg, Helen

    2016-01-01

    This research addresses the challenges of assessing receptive language abilities in minimally verbal children with autism spectrum disorder by comparing several adapted measurement tools: a standardized direct assessment of receptive vocabulary (i.e. Peabody Picture Vocabulary Test-4); caregiver report measures including scores on the Vineland-II…

  9. A Comparative Case Study of Developing Leaders through a Doctoral Program: A Study of One Academic Institution

    ERIC Educational Resources Information Center

    Scanlon, Sheryl Lynne

    2012-01-01

    The purpose of this comparative case study was to determine how one academic institution could address the leadership gap facing organizations today, through a traditional, classroom doctoral program in Organizational Leadership. Data was gathered utilizing mixed methods methodology that included a survey questionnaire, focus group information,…

  10. Translating Knowledge through Blended Learning: A Comparative Analysis of Face-to-Face and Blended Learning Methods

    ERIC Educational Resources Information Center

    Golden, Thomas P.; Karpur, Arun

    2012-01-01

    This study is a comparative analysis of the impact of traditional face-to-face training contrasted with a blended learning approach, as it relates to improving skills, knowledge and attitudes for enhancing practices for achieving improved employment outcomes for individuals with disabilities. The study included two intervention groups: one…

  11. Exploring Intercultural Competence in Teacher Education: A Comparative Study between Science and Foreign Language Teacher Trainers

    ERIC Educational Resources Information Center

    Akpinar, Kadriye Dilek; Ünaldi, Ihsan

    2014-01-01

    This study investigated the intercultural outcomes of short-term study visit programs for Foreign Language and Science teacher trainers. A mixed method including quantitative and qualitative data was used to compare the differences between the two groups' intercultural development in terms of their study field. Fantini's questionnaire was used for…

  12. A Comparative Analysis of Numbers and Biology Content Domains between Turkey and the USA

    ERIC Educational Resources Information Center

    Incikabi, Lutfi; Ozgelen, Sinan; Tjoe, Hartono

    2012-01-01

    This study aimed to compare Mathematics and Science programs focusing on TIMSS content domains of Numbers and Biology that produced the largest achievement gap among students from Turkey and the USA. Specifically, it utilized the content analysis method within Turkish and New York State (NYS) frameworks. The procedures of study included matching…

  13. M. D. Faculty Salaries in Psychiatry and All Clinical Science Departments, 1980-2006

    ERIC Educational Resources Information Center

    Haviland, Mark G.; Dial, Thomas H.; Pincus, Harold Alan

    2009-01-01

    Objective: The authors compare trends in the salaries of physician faculty in academic departments of psychiatry with those of physician faculty in all academic clinical science departments from 1980-2006. Methods: The authors compared trend lines for psychiatry and all faculty by academic rank, including those for department chairs, by graphing…

  14. School Counselors' Job Satisfaction: A Comparative Study of Preschool and Primary-School Counselors in Turkey

    ERIC Educational Resources Information Center

    Nas (Dalçiçek), Esref; Sak, Ramazan; Sahin Sak, Ikbal Tuba

    2017-01-01

    This mixed-methods research compared job satisfaction among counselors working in pre-schools and primary-schools. Its quantitative phase included 223 counselors, 70 of whom also participated in the qualitative phase. A demographic information form, job-satisfaction scale and a semi-structured interview protocol were used to collect data.…

  15. Comparing Coral Reef Survey Methods. Unesco Reports in Marine Science No. 21 Report of a Regional Unesco/UNEP Workshop on Coral Reef Survey Management and Assessment Methods in Asia and the Pacific (Phuket, Thailand, December 13-17, 1982).

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific, and Cultural Organization, Paris (France). Div. of Marine Sciences.

    This report includes nine papers prepared for a workshop on coral reef survey management and assessment methods in Asia and the Pacific. The papers are: "A Contrast in Methodologies between Surveying and Testing" (Charles Birkeland); "Coral Reef Survey Methods in the Andaman Sea" (Hansa Chansang); "A Review of Coral Reef…

  16. Method for reduction of selected ion intensities in confined ion beams

    DOEpatents

    Eiden, Gregory C.; Barinaga, Charles J.; Koppenaal, David W.

    1998-01-01

    A method for producing an ion beam having an increased proportion of analyte ions compared to carrier gas ions is disclosed. Specifically, the method has the step of addition of a charge transfer gas to the carrier analyte combination that accepts charge from the carrier gas ions yet minimally accepts charge from the analyte ions thereby selectively neutralizing the carrier gas ions. Also disclosed is the method as employed in various analytical instruments including an inductively coupled plasma mass spectrometer.

  17. Method for reduction of selected ion intensities in confined ion beams

    DOEpatents

    Eiden, G.C.; Barinaga, C.J.; Koppenaal, D.W.

    1998-06-16

    A method for producing an ion beam having an increased proportion of analyte ions compared to carrier gas ions is disclosed. Specifically, the method has the step of addition of a charge transfer gas to the carrier analyte combination that accepts charge from the carrier gas ions yet minimally accepts charge from the analyte ions thereby selectively neutralizing the carrier gas ions. Also disclosed is the method as employed in various analytical instruments including an inductively coupled plasma mass spectrometer. 7 figs.

  18. Using an Evaluability Assessment To Select Methods for Evaluating State Technology Development Programs: The Case of the Georgia Research Alliance.

    ERIC Educational Resources Information Center

    Youtie, Jan; Bozeman, Barry; Shapira, Philip

    1999-01-01

    Describes an evaluability assessment of the Georgia Research Alliance (GRA), a technology development program. Presents the steps involved in conducting an evaluability assessment, including development of an understanding of the program and its stakeholders. Analyzes and compares different methods by which the GRA could be evaluated. (SLD)

  19. Isospin Breaking Corrections to the HVP with Domain Wall Fermions

    NASA Astrophysics Data System (ADS)

    Boyle, Peter; Guelpers, Vera; Harrison, James; Juettner, Andreas; Lehner, Christoph; Portelli, Antonin; Sachrajda, Christopher

    2018-03-01

    We present results for the QED and strong isospin breaking corrections to the hadronic vacuum polarization using Nf = 2 + 1 Domain Wall fermions. QED is included in an electro-quenched setup using two different methods, a stochastic and a perturbative approach. Results and statistical errors from both methods are directly compared with each other.

  20. Validating Alternative Modes of Scoring for Coloured Progressive Matrices.

    ERIC Educational Resources Information Center

    Razel, Micha; Eylon, Bat-Sheva

    Conventional scoring of the Coloured Progressive Matrices (CPM) was compared with three methods of multiple weight scoring. The methods include: (1) theoretical weighting in which the weights were based on a theory of cognitive processing; (2) judged weighting in which the weights were given by a group of nine adult expert judges; and (3)…

  1. The exact solution of the monoenergetic transport equation for critical cylinders

    NASA Technical Reports Server (NTRS)

    Westfall, R. M.; Metcalf, D. R.

    1972-01-01

    An analytic solution for the critical, monoenergetic, bare, infinite cylinder is presented. The solution is obtained by modifying a previous development based on a neutron density transform and Case's singular eigenfunction method. Numerical results for critical radii and the neutron density as a function of position are included and compared with the results of other methods.

  2. Field efficiency and bias of snag inventory methods

    Treesearch

    Robert S. Kenning; Mark J. Ducey; John C. Brissette; Jeffery H. Gove

    2005-01-01

    Snags and cavity trees are important components of forests, but can be difficult to inventory precisely and are not always included in inventories because of limited resources. We tested the application of N-tree distance sampling as a time-saving snag sampling method and compared N-tree distance sampling to fixed-area sampling and modified horizontal line sampling in...

  3. Clustering Methods; Part IV of Scientific Report No. ISR-18, Information Storage and Retrieval...

    ERIC Educational Resources Information Center

    Cornell Univ., Ithaca, NY. Dept. of Computer Science.

    Two papers are included as Part Four of this report on Salton's Magical Automatic Retriever of Texts (SMART) project. The first paper, "A Controlled Single Pass Classification Algorithm with Application to Multilevel Clustering" by D. B. Johnson and J. M. Laferente, presents a single pass clustering method which compares favorably…

  4. AN EXPERIMENTAL STUDY OF THREE METHODS OF TRAINING INDUSTRIAL EXECUTIVES IN READING IMPROVEMENT (PH.D. THESIS).

    ERIC Educational Resources Information Center

    JONES, DAN H.

    TO COMPARE THREE METHODS OF TRAINING FOR READING IMPROVEMENT, 56 EXECUTIVES OF ONE CORPORATION WERE DIVIDED INTO FOUR GROUPS, EQUATED ACCORDING TO READING RATE, READING COMPREHENSION, READING INDEX, MENTAL ALERTNESS SCORES, AGE, AND VOCABULARY. GROUP A WAS TRAINED WITH THE AID OF ALL AVAILABLE COMMERCIAL EQUIPMENT INCLUDING THE HARVARD FILMS, THE…

  5. Method and apparatus for extraction of low-frequency artifacts from brain waves for alertness detection

    DOEpatents

    Clapp, Ned E.; Hively, Lee M.

    1997-01-01

    Methods and apparatus automatically detect alertness in humans by monitoring and analyzing brain wave signals. Steps include: acquiring the brain wave (EEG or MEG) data from the subject, digitizing the data, separating artifact data from raw data, and comparing trends in f-data to alertness indicators, providing notification of inadequate alertness.

  6. N-person differential games. Part 2: The penalty method

    NASA Technical Reports Server (NTRS)

    Chen, G.; Mills, W. H.; Zheng, Q.; Shaw, W. H.

    1983-01-01

    The equilibrium strategy for N-person differential games can be found by studying a min-max problem subject to differential systems constraints. The differential constraints are penalized and finite elements are used to compute numerical solutions. Convergence proof and error estimates are given. Numerical results are also included and compared with those obtained by the dual method.

  7. Relation between financial market structure and the real economy: comparison between clustering methods.

    PubMed

    Musmeci, Nicoló; Aste, Tomaso; Di Matteo, T

    2015-01-01

    We quantify the amount of information filtered by different hierarchical clustering methods on correlations between stock returns comparing the clustering structure with the underlying industrial activity classification. We apply, for the first time to financial data, a novel hierarchical clustering approach, the Directed Bubble Hierarchical Tree and we compare it with other methods including the Linkage and k-medoids. By taking the industrial sector classification of stocks as a benchmark partition, we evaluate how the different methods retrieve this classification. The results show that the Directed Bubble Hierarchical Tree can outperform other methods, being able to retrieve more information with fewer clusters. Moreover, we show that the economic information is hidden at different levels of the hierarchical structures depending on the clustering method. The dynamical analysis on a rolling window also reveals that the different methods show different degrees of sensitivity to events affecting financial markets, like crises. These results can be of interest for all the applications of clustering methods to portfolio optimization and risk hedging [corrected].
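    A hedged sketch of this kind of comparison using a standard linkage method rather than the Directed Bubble Hierarchical Tree (no packaged implementation of the latter is assumed here): cluster stocks from a correlation-derived distance matrix and score the partition against a benchmark sector labelling. Returns and sector labels are simulated.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform
        from sklearn.metrics import adjusted_rand_score

        rng = np.random.default_rng(7)
        returns = rng.normal(size=(250, 30))         # 250 days x 30 hypothetical stocks
        sectors = np.repeat(np.arange(6), 5)         # hypothetical benchmark sector labels

        corr = np.corrcoef(returns, rowvar=False)
        dist = np.sqrt(np.clip(2.0 * (1.0 - corr), 0.0, None))   # common correlation-to-distance map

        Z = linkage(squareform(dist, checks=False), method="average")
        labels = fcluster(Z, t=6, criterion="maxclust")
        print("adjusted Rand index vs. sector partition:", adjusted_rand_score(sectors, labels))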

  8. Computerized test versus personal interview as admission methods for graduate nursing studies: A retrospective cohort study.

    PubMed

    Hazut, Koren; Romem, Pnina; Malkin, Smadar; Livshiz-Riven, Ilana

    2016-12-01

    The purpose of this study was to compare the predictive validity, economic efficiency, and faculty staff satisfaction of a computerized test versus a personal interview as admission methods for graduate nursing studies. A mixed method study was designed, including cross-sectional and retrospective cohorts, interviews, and cost analysis. One hundred and thirty-four students in the Master of Nursing program participated. The success of students in required core courses was similar in both admission method groups. The personal interview method was found to be a significant predictor of success, with cognitive variables the only significant contributors to the model. Higher satisfaction levels were reported with the computerized test compared with the personal interview method. The cost of the personal interview method, in annual hourly work, was 2.28 times higher than the computerized test. These findings may promote discussion regarding the cost benefit of the personal interview as an admission method for advanced academic studies in healthcare professions. © 2016 John Wiley & Sons Australia, Ltd.

  9. Comparative Validation of the Determination of Sofosbuvir in Pharmaceuticals by Several Inexpensive Ecofriendly Chromatographic, Electrophoretic, and Spectrophotometric Methods.

    PubMed

    El-Yazbi, Amira F

    2017-07-01

    Sofosbuvir (SOFO) was approved by the U.S. Food and Drug Administration in 2013 for the treatment of hepatitis C virus infection with enhanced antiviral potency compared with earlier analogs. Notwithstanding, all current editions of the pharmacopeias still do not present any analytical methods for the quantification of SOFO. Thus, rapid, simple, and ecofriendly methods for the routine analysis of commercial formulations of SOFO are desirable. In this study, five accurate methods for the determination of SOFO in pharmaceutical tablets were developed and validated. These methods include HPLC, capillary zone electrophoresis, HPTLC, and UV spectrophotometric and derivative spectrometry methods. The proposed methods proved to be rapid, simple, sensitive, selective, and accurate analytical procedures that were suitable for the reliable determination of SOFO in pharmaceutical tablets. An analysis of variance test with P-value > 0.05 confirmed that there were no significant differences between the proposed assays. Thus, any of these methods can be used for the routine analysis of SOFO in commercial tablets.

  10. Relation between Financial Market Structure and the Real Economy: Comparison between Clustering Methods

    PubMed Central

    Musmeci, Nicoló; Aste, Tomaso; Di Matteo, T.

    2015-01-01

    We quantify the amount of information filtered by different hierarchical clustering methods on correlations between stock returns comparing the clustering structure with the underlying industrial activity classification. We apply, for the first time to financial data, a novel hierarchical clustering approach, the Directed Bubble Hierarchical Tree and we compare it with other methods including the Linkage and k-medoids. By taking the industrial sector classification of stocks as a benchmark partition, we evaluate how the different methods retrieve this classification. The results show that the Directed Bubble Hierarchical Tree can outperform other methods, being able to retrieve more information with fewer clusters. Moreover, we show that the economic information is hidden at different levels of the hierarchical structures depending on the clustering method. The dynamical analysis on a rolling window also reveals that the different methods show different degrees of sensitivity to events affecting financial markets, like crises. These results can be of interest for all the applications of clustering methods to portfolio optimization and risk hedging. PMID:25786703

  11. Galaxy two-point covariance matrix estimation for next generation surveys

    NASA Astrophysics Data System (ADS)

    Howlett, Cullan; Percival, Will J.

    2017-12-01

    We perform a detailed analysis of the covariance matrix of the spherically averaged galaxy power spectrum and present a new, practical method for estimating this within an arbitrary survey without the need for running mock galaxy simulations that cover the full survey volume. The method uses theoretical arguments to modify the covariance matrix measured from a set of small-volume cubic galaxy simulations, which are computationally cheap to produce compared to larger simulations and match the measured small-scale galaxy clustering more accurately than is possible using theoretical modelling. We include prescriptions to analytically account for the window function of the survey, which convolves the measured covariance matrix in a non-trivial way. We also present a new method to include the effects of super-sample covariance and modes outside the small simulation volume which requires no additional simulations and still allows us to scale the covariance matrix. As validation, we compare the covariance matrix estimated using our new method to that from a brute-force calculation using 500 simulations originally created for analysis of the Sloan Digital Sky Survey Main Galaxy Sample. We find excellent agreement on all scales of interest for large-scale structure analysis, including those dominated by the effects of the survey window, and on scales where theoretical models of the clustering normally break down, but the new method produces a covariance matrix with significantly better signal-to-noise ratio. Although only formally correct in real space, we also discuss how our method can be extended to incorporate the effects of redshift space distortions.

  12. Timely disclosure of progress in long-term cancer survival: the boomerang method substantially improved estimates in a comparative study.

    PubMed

    Brenner, Hermann; Jansen, Lina

    2016-02-01

    Monitoring cancer survival is a key task of cancer registries, but timely disclosure of progress in long-term survival remains a challenge. We introduce and evaluate a novel method, denoted "boomerang method," for deriving more up-to-date estimates of long-term survival. We applied three established methods (cohort, complete, and period analysis) and the boomerang method to derive up-to-date 10-year relative survival of patients diagnosed with common solid cancers and hematological malignancies in the United States. Using the Surveillance, Epidemiology and End Results 9 database, we compared the most up-to-date age-specific estimates that might have been obtained with the database including patients diagnosed up to 2001 with 10-year survival later observed for patients diagnosed in 1997-2001. For cancers with little or no increase in survival over time, the various estimates of 10-year relative survival potentially available by the end of 2001 were generally rather similar. For malignancies with strongly increasing survival over time, including breast and prostate cancer and all hematological malignancies, the boomerang method provided estimates that were closest to later observed 10-year relative survival in 23 of the 34 groups assessed. The boomerang method can substantially improve up-to-dateness of long-term cancer survival estimates in times of ongoing improvement in prognosis. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. The Impact of the Implementation of Edge Detection Methods on the Accuracy of Automatic Voltage Reading

    NASA Astrophysics Data System (ADS)

    Sidor, Kamil; Szlachta, Anna

    2017-04-01

    The article presents the impact of the edge detection method used in image analysis on the accuracy of reading the measured value. To enable automatic reading of the value indicated by an analog meter, a standard webcam and the LabVIEW programme were applied. NI Vision Development tools were used, and the Hough transform was used to detect the indicator. The programme output was compared across several edge detection methods: the Prewitt operator, the Roberts cross, the Sobel operator, and the Canny edge detector. The image analysis of the analog meter indicator was carried out with each of these methods, and the results were compared with each other and presented.
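    A hedged sketch of the same kind of comparison using OpenCV rather than the LabVIEW/NI Vision toolchain described in the paper; the input image name is a hypothetical placeholder.

        import cv2
        import numpy as np

        img = cv2.imread("meter_frame.png", cv2.IMREAD_GRAYSCALE)   # hypothetical analog-meter image

        # Sobel: combine horizontal and vertical gradients into an edge-magnitude image.
        gx = cv2.Sobel(img, cv2.CV_32F, 1, 0, ksize=3)
        gy = cv2.Sobel(img, cv2.CV_32F, 0, 1, ksize=3)
        sobel_edges = cv2.convertScaleAbs(cv2.magnitude(gx, gy))

        # Canny: hysteresis-thresholded edge map.
        canny_edges = cv2.Canny(img, 50, 150)

        # Prewitt has no dedicated OpenCV call, so apply its kernel with filter2D.
        prewitt_kernel = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]], dtype=np.float32)
        prewitt_edges = cv2.convertScaleAbs(cv2.filter2D(img.astype(np.float32), -1, prewitt_kernel))

        for name, edges in [("sobel", sobel_edges), ("canny", canny_edges), ("prewitt_x", prewitt_edges)]:
            cv2.imwrite(f"edges_{name}.png", edges)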

  14. An overview of methods for comparative effectiveness research.

    PubMed

    Meyer, Anne-Marie; Wheeler, Stephanie B; Weinberger, Morris; Chen, Ronald C; Carpenter, William R

    2014-01-01

    Comparative effectiveness research (CER) is a broad category of outcomes research encompassing many different methods employed by researchers and clinicians from numerous disciplines. The goal of cancer-focused CER is to generate new knowledge to assist cancer stakeholders in making informed decisions that will improve health care and outcomes of both individuals and populations. There are numerous CER methods that may be used to examine specific questions, including randomized controlled trials, observational studies, systematic literature reviews, and decision sciences modeling. Each has its strengths and weaknesses. To both inform and serve as a reference for readers of this issue of Seminars in Radiation Oncology as well as the broader oncology community, we describe CER and several of the more commonly used approaches and analytical methods. © 2013 Published by Elsevier Inc.

  15. Kaplan-Meier Survival Analysis Overestimates the Risk of Revision Arthroplasty: A Meta-analysis.

    PubMed

    Lacny, Sarah; Wilson, Todd; Clement, Fiona; Roberts, Derek J; Faris, Peter D; Ghali, William A; Marshall, Deborah A

    2015-11-01

    Although Kaplan-Meier survival analysis is commonly used to estimate the cumulative incidence of revision after joint arthroplasty, it theoretically overestimates the risk of revision in the presence of competing risks (such as death). Because the magnitude of overestimation is not well documented, the potential associated impact on clinical and policy decision-making remains unknown. We performed a meta-analysis to answer the following questions: (1) To what extent does the Kaplan-Meier method overestimate the cumulative incidence of revision after joint replacement compared with alternative competing-risks methods? (2) Is the extent of overestimation influenced by followup time or rate of competing risks? We searched Ovid MEDLINE, EMBASE, BIOSIS Previews, and Web of Science (1946, 1980, 1980, and 1899, respectively, to October 26, 2013) and included article bibliographies for studies comparing estimated cumulative incidence of revision after hip or knee arthroplasty obtained using both Kaplan-Meier and competing-risks methods. We excluded conference abstracts, unpublished studies, or studies using simulated data sets. Two reviewers independently extracted data and evaluated the quality of reporting of the included studies. Among 1160 abstracts identified, six studies were included in our meta-analysis. The principal reason for the steep attrition (1160 to six) was that the initial search was for studies in any clinical area that compared the cumulative incidence estimated using the Kaplan-Meier versus competing-risks methods for any event (not just the cumulative incidence of hip or knee revision); we did this to minimize the likelihood of missing any relevant studies. We calculated risk ratios (RRs) comparing the cumulative incidence estimated using the Kaplan-Meier method with the competing-risks method for each study and used DerSimonian and Laird random effects models to pool these RRs. Heterogeneity was explored using stratified meta-analyses and metaregression. The pooled cumulative incidence of revision after hip or knee arthroplasty obtained using the Kaplan-Meier method was 1.55 times higher (95% confidence interval, 1.43-1.68; p < 0.001) than that obtained using the competing-risks method. Longer followup times and higher proportions of competing risks were not associated with increases in the amount of overestimation of revision risk by the Kaplan-Meier method (all p > 0.10). This may be due to the small number of studies that met the inclusion criteria and conservative variance approximation. The Kaplan-Meier method overestimates risk of revision after hip or knee arthroplasty in populations where competing risks (such as death) might preclude the occurrence of the event of interest (revision). Competing-risks methods should be used to more accurately estimate the cumulative incidence of revision when the goal is to plan healthcare services and resource allocation for revisions.
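    A small simulation, assuming the Python lifelines package, of the effect quantified above: with a competing event present, the complement of the Kaplan-Meier curve exceeds the Aalen-Johansen cumulative incidence of the event of interest. The data are simulated, not drawn from the included studies.

        import numpy as np
        from lifelines import KaplanMeierFitter, AalenJohansenFitter

        rng = np.random.default_rng(5)
        t_revision = rng.exponential(15.0, size=2000)   # latent time to revision (years)
        t_death = rng.exponential(10.0, size=2000)      # latent time to death (competing risk)

        time = np.minimum(t_revision, t_death)
        event = np.where(t_revision < t_death, 1, 2)    # 1 = revision observed, 2 = death first

        kmf = KaplanMeierFitter().fit(time, event_observed=(event == 1))
        ajf = AalenJohansenFitter().fit(time, event, event_of_interest=1)

        km_complement = 1.0 - kmf.survival_function_.iloc[-1, 0]
        aj_cif = ajf.cumulative_density_.iloc[-1, 0]
        print(f"KM complement: {km_complement:.3f}   Aalen-Johansen CIF: {aj_cif:.3f}")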

  16. The experiments and analysis of several selective video encryption methods

    NASA Astrophysics Data System (ADS)

    Zhang, Yue; Yang, Cheng; Wang, Lei

    2013-07-01

    This paper presents four methods for selective video encryption based on MPEG-2 video compression, operating on the slices, the I-frames, the motion vectors, and the DCT coefficients. We use the AES encryption method in simulation experiments for the four methods on the VS2010 platform, and compare the visual effect and the per-frame processing speed after the video is encrypted. The encryption depth can be arbitrarily selected and is designed using the double limit counting method, so the accuracy can be increased.
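    As a rough illustration of the selective-encryption idea (not the authors' implementation), the sketch below encrypts only chosen byte ranges of a bitstream with AES in CTR mode, leaving the rest untouched. The bitstream and the I-frame offsets are hypothetical placeholders; a real implementation would parse MPEG-2 start codes to locate slices, I-frames, motion vectors, or DCT coefficients.

    ```python
    import os
    from cryptography.hazmat.backends import default_backend
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def encrypt_ranges(data: bytes, ranges, key: bytes, nonce: bytes) -> bytes:
        """Encrypt only the selected byte ranges with AES-CTR (length-preserving),
        leaving the remaining bytes of the bitstream untouched."""
        out = bytearray(data)
        enc = Cipher(algorithms.AES(key), modes.CTR(nonce), backend=default_backend()).encryptor()
        for start, end in ranges:
            out[start:end] = enc.update(bytes(out[start:end]))
        enc.finalize()
        return bytes(out)

    # Hypothetical usage: the offsets would come from parsing the MPEG-2 stream.
    key, nonce = os.urandom(16), os.urandom(16)
    bitstream = os.urandom(4096)              # placeholder for an elementary stream
    iframe_ranges = [(100, 2100)]             # hypothetical I-frame payload offsets
    protected = encrypt_ranges(bitstream, iframe_ranges, key, nonce)
    ```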

  17. Assessment Methods of Groundwater Overdraft Area and Its Application

    NASA Astrophysics Data System (ADS)

    Dong, Yanan; Xing, Liting; Zhang, Xinhui; Cao, Qianqian; Lan, Xiaoxun

    2018-05-01

    Groundwater is an important source of water, and long-term heavy demand has left it over-exploited. Over-exploitation causes many environmental and geological problems. This paper explores the concept of the over-exploitation area, summarizes its natural and social attributes, and expounds its evaluation methods, including single-factor evaluation, multi-factor system analysis, and numerical methods. The different methods are also compared and analyzed. Taking Northern Weifang as an example, the paper then demonstrates the practicality of the appraisal methods.

  18. Dynamic baseline detection method for power data network service

    NASA Astrophysics Data System (ADS)

    Chen, Wei

    2017-08-01

    This paper proposes a dynamic baseline traffic detection method for the power data network based on historical traffic data. The method uses Cisco's NetFlow acquisition tool to collect the original historical traffic data from network elements at fixed intervals, working with three dimensions of information: the communication port, time, and traffic (number of bytes or number of packets). By filtering, removing deviating values, calculating the dynamic baseline value, and comparing the actual value with the baseline value, the method can detect whether the current network traffic is abnormal.
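    A minimal sketch of this style of baseline comparison is given below, assuming the historical samples have already been grouped by port and time slot. The 2-sigma filtering rule and the example byte counts are illustrative assumptions, not details taken from the paper.

    ```python
    import numpy as np

    def dynamic_baseline(history, k=2.0):
        """Baseline from historical samples of the same port/time slot, after
        discarding values more than k standard deviations from the mean."""
        x = np.asarray(history, dtype=float)
        kept = x[np.abs(x - x.mean()) <= k * x.std()]
        return kept.mean(), kept.std()

    def is_abnormal(current, history, k=2.0):
        """Flag current traffic that deviates from the baseline by more than k sigma."""
        baseline, spread = dynamic_baseline(history, k)
        return abs(current - baseline) > k * max(spread, 1e-9)

    # Illustrative per-port byte counts for the same 5-minute slot on previous days.
    history = [1.2e6, 1.1e6, 1.3e6, 1.25e6, 9.0e6, 1.15e6]
    print(is_abnormal(4.5e6, history))   # True: far above the filtered baseline
    ```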

  19. Method for determining formation quality factor from well log data and its application to seismic reservoir characterization

    DOEpatents

    Walls, Joel; Taner, M. Turhan; Dvorkin, Jack

    2006-08-08

    A method for seismic characterization of subsurface Earth formations includes determining at least one of compressional velocity and shear velocity, and determining reservoir parameters of subsurface Earth formations, at least including density, from data obtained from a wellbore penetrating the formations. A quality factor for the subsurface formations is calculated from the velocity, the density and the water saturation. A synthetic seismogram is calculated from the calculated quality factor and from the velocity and density. The synthetic seismogram is compared to a seismic survey made in the vicinity of the wellbore. At least one parameter is adjusted. The synthetic seismogram is recalculated using the adjusted parameter, and the adjusting, recalculating and comparing are repeated until a difference between the synthetic seismogram and the seismic survey falls below a selected threshold.
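    The adjust-recalculate-compare loop described above can be sketched as follows. Here `forward_model` is a stand-in for the full synthetic-seismogram calculation (which would depend on velocity, density, and the computed quality factor), and the simple trial-step update is an assumption made for illustration, not the patented procedure.

    ```python
    import numpy as np

    def misfit(synthetic, observed):
        """Root-mean-square difference between synthetic and observed traces."""
        return float(np.sqrt(np.mean((np.asarray(synthetic) - np.asarray(observed)) ** 2)))

    def invert_parameter(observed, forward_model, p0, step=0.05, tol=1e-3, max_iter=200):
        """Adjust one model parameter until the synthetic seismogram produced by
        forward_model(p) matches the observed survey to within tol."""
        p = p0
        best = misfit(forward_model(p), observed)
        for _ in range(max_iter):
            if best < tol:
                break
            trials = [p * (1.0 - step), p * (1.0 + step)]
            errors = [misfit(forward_model(q), observed) for q in trials]
            i = int(np.argmin(errors))
            if errors[i] >= best:
                step *= 0.5            # no improvement: refine the trial step
            else:
                p, best = trials[i], errors[i]
        return p, best

    # Toy usage with a stand-in forward model (a damped sinusoid whose decay rate
    # plays the role of the adjustable parameter).
    t = np.linspace(0.0, 1.0, 500)
    observed = np.exp(-3.0 * t) * np.sin(40.0 * t)
    model = lambda q: np.exp(-q * t) * np.sin(40.0 * t)
    print(invert_parameter(observed, model, p0=1.0))
    ```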

  20. Microbiological methods for the water recovery systems test, revision 1.1

    NASA Technical Reports Server (NTRS)

    Rhoads, Tim; Kilgore, M. V., Jr.; Mikell, A. T., Jr.

    1990-01-01

    Current microbiological parameters specified to verify microbiological quality of Space Station Freedom water quality include the enumeration of total bacteria, anaerobes, aerobes, yeasts and molds, enteric bacteria, gram positives, gram negatives, and E. coli. In addition, other parameters have been identified as necessary to support the Water Recovery Test activities to be conducted at the NASA/MSFC later this year. These other parameters include aerotolerant eutrophic mesophiles, legionellae, and an additional method for heterotrophic bacteria. If inter-laboratory data are to be compared to evaluate quality, analytical methods must be eliminated as a variable. Therefore, each participating laboratory must utilize the same analytical methods and procedures. Without this standardization, data can be neither compared nor validated between laboratories. Multiple laboratory participation represents a conservative approach to insure quality and completeness of data. Invariably, sample loss will occur in transport and analyses. Natural variance is a reality on any test of this magnitude and is further enhanced because biological entities, capable of growth and death, are specific parameters of interest. The large variation due to the participation of human test subjects has been noted with previous testing. The resultant data might be dismissed as 'out of control' unless intra-laboratory control is included as part of the method or if participating laboratories are not available for verification. The purpose of this document is to provide standardized laboratory procedures for the enumeration of certain microorganisms in water and wastewater specific to the water recovery systems test. The document consists of ten separate cultural methods and one direct count procedure. It is not intended nor is it implied to be a complete microbiological methods manual.

  1. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more.

    PubMed

    Rivas, Elena; Lang, Raymond; Eddy, Sean R

    2012-02-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.

  2. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more

    PubMed Central

    Rivas, Elena; Lang, Raymond; Eddy, Sean R.

    2012-01-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases. PMID:22194308

  3. Method for indexing and retrieving manufacturing-specific digital imagery based on image content

    DOEpatents

    Ferrell, Regina K.; Karnowski, Thomas P.; Tobin, Jr., Kenneth W.

    2004-06-15

    A method for indexing and retrieving manufacturing-specific digital images based on image content comprises three steps. First, at least one feature vector can be extracted from a manufacturing-specific digital image stored in an image database. In particular, each extracted feature vector corresponds to a particular characteristic of the manufacturing-specific digital image, for instance, a digital image modality and overall characteristic, a substrate/background characteristic, and an anomaly/defect characteristic. Notably, the extracting step includes generating a defect mask using a detection process. Second, using an unsupervised clustering method, each extracted feature vector can be indexed in a hierarchical search tree. Third, a manufacturing-specific digital image associated with a feature vector stored in the hierarchical search tree can be retrieved, wherein the manufacturing-specific digital image has image content comparably related to the image content of the query image. More particularly, the retrieving step can include two data reductions, the first performed based upon a query vector extracted from a query image. Subsequently, a user can select relevant images resulting from the first data reduction. From the selection, a prototype vector can be calculated, from which a second-level data reduction can be performed. The second-level data reduction can result in a subset of feature vectors comparable to the prototype vector, and further comparable to the query vector. An additional fourth step can include managing the hierarchical search tree by substituting a vector average for several redundant feature vectors encapsulated by nodes in the hierarchical search tree.
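    The two-level data reduction can be sketched as below, assuming feature vectors have already been extracted. For brevity the sketch uses brute-force Euclidean distances rather than the hierarchical search tree built by unsupervised clustering, and the relevance-feedback step simply averages the user-selected vectors into a prototype.

    ```python
    import numpy as np

    def first_reduction(query_vec, db_vecs, k=50):
        """First-level data reduction: the k database feature vectors nearest the query."""
        d = np.linalg.norm(db_vecs - query_vec, axis=1)
        return np.argsort(d)[:k]

    def second_reduction(relevant_idx, query_vec, db_vecs, k=10):
        """Second-level reduction: average the user-selected relevant vectors into a
        prototype, then keep vectors comparable to both the prototype and the query."""
        prototype = db_vecs[relevant_idx].mean(axis=0)
        score = (np.linalg.norm(db_vecs - prototype, axis=1)
                 + np.linalg.norm(db_vecs - query_vec, axis=1))
        return np.argsort(score)[:k]

    # Toy usage with random placeholder feature vectors.
    rng = np.random.default_rng(0)
    db = rng.normal(size=(1000, 64))
    query = rng.normal(size=64)
    candidates = first_reduction(query, db)
    refined = second_reduction(candidates[:5], query, db)  # pretend the user marked 5 hits relevant
    ```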

  4. Method of determining a content of a nuclear waste container

    DOEpatents

    Bernardi, Richard T.; Entwistle, David

    2003-04-22

    A method and apparatus are provided for identifying contents of a nuclear waste container. The method includes the steps of forming an image of the contents of the container using digital radiography, visually comparing contents of the image with expected contents of the container and performing computer tomography on the container when the visual inspection reveals an inconsistency between the contents of the image and the expected contents of the container.

  5. Fast adaptive composite grid methods on distributed parallel architectures

    NASA Technical Reports Server (NTRS)

    Lemke, Max; Quinlan, Daniel

    1992-01-01

    The fast adaptive composite (FAC) grid method is compared with the asynchronous fast adaptive composite (AFAC) method under a variety of conditions, including vectorization and parallelization. Results are given for distributed memory multiprocessor architectures (SUPRENUM, Intel iPSC/2 and iPSC/860). It is shown that the good performance of AFAC and its superiority over FAC in a parallel environment is a property of the algorithm and not dependent on peculiarities of any machine.

  6. A new clocking method for a charge coupled device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Umezu, Rika; Kitamoto, Shunji, E-mail: kitamoto@rikkyo.ac.jp; Murakami, Hiroshi

    2014-07-15

    We propose and demonstrate a new clocking method for a charge-coupled device (CCD). When a CCD is used as a photon counting detector of X-rays, its weak point is a limitation of its counting rate, because a high counting rate makes non-negligible pile-up of photons. In astronomical usage, this pile-up is especially severe for an observation of a bright point-like object. One typical idea to reduce the pile-up is a parallel sum (P-sum) mode. This mode completely loses one-dimensional information. Our new clocking method, panning mode, provides complementary properties between the normal mode and the P-sum mode. We performed a simple simulation in order to investigate a pile-up probability and compared the simulated result with the actual obtained event rates. Using this simulation and the experimental results, we compared the pile-up tolerance of various clocking modes including our new method and also compared their other characteristics.
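    A toy Monte Carlo estimate of pile-up probability, in the spirit of (but far simpler than) the simulation mentioned above, might look like this. The pixel count, frame count, and rates are arbitrary assumptions used only to show how pile-up grows with counting rate.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def pileup_fraction(rate_per_pixel, n_pixels=1024, n_frames=5000):
        """Fraction of frames in which at least one pixel collects two or more photons,
        with photon arrivals modelled as independent Poisson counts per pixel per frame."""
        counts = rng.poisson(rate_per_pixel, size=(n_frames, n_pixels))
        return float(np.mean((counts >= 2).any(axis=1)))

    for rate in (1e-4, 1e-3, 1e-2):
        print(f"mean {rate} photons/pixel/frame -> piled-up frame fraction {pileup_fraction(rate):.4f}")
    ```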

  7. Pressure measurements in a low-density nozzle plume for code verification

    NASA Technical Reports Server (NTRS)

    Penko, Paul F.; Boyd, Iain D.; Meissner, Dana L.; Dewitt, Kenneth J.

    1991-01-01

    Measurements of Pitot pressure were made in the exit plane and plume of a low-density, nitrogen nozzle flow. Two numerical computer codes were used to analyze the flow, including one based on continuum theory using the explicit MacCormack method, and the other on kinetic theory using the method of direct-simulation Monte Carlo (DSMC). The continuum analysis was carried to the nozzle exit plane and the results were compared to the measurements. The DSMC analysis was extended into the plume of the nozzle flow and the results were compared with measurements at the exit plane and axial stations 12, 24 and 36 mm into the near-field plume. Two experimental apparatus were used that differed in design and gave slightly different profiles of pressure measurements. The DSMC method compared well with the measurements from each apparatus at all axial stations and provided a more accurate prediction of the flow than the continuum method, verifying the validity of DSMC for such calculations.

  8. Costs and Efficiency of Online and Offline Recruitment Methods: A Web-Based Cohort Study

    PubMed Central

    Riis, Anders H; Hatch, Elizabeth E; Wise, Lauren A; Nielsen, Marie G; Rothman, Kenneth J; Toft Sørensen, Henrik; Mikkelsen, Ellen M

    2017-01-01

    Background The Internet is widely used to conduct research studies on health issues. Many different methods are used to recruit participants for such studies, but little is known about how various recruitment methods compare in terms of efficiency and costs. Objective The aim of our study was to compare online and offline recruitment methods for Internet-based studies in terms of efficiency (number of recruited participants) and costs per participant. Methods We employed several online and offline recruitment methods to enroll 18- to 45-year-old women in an Internet-based Danish prospective cohort study on fertility. Offline methods included press releases, posters, and flyers. Online methods comprised advertisements placed on five different websites, including Facebook and Netdoktor.dk. We defined seven categories of mutually exclusive recruitment methods and used electronic tracking via unique Uniform Resource Locator (URL) and self-reported data to identify the recruitment method for each participant. For each method, we calculated the average cost per participant and efficiency, that is, the total number of recruited participants. Results We recruited 8252 study participants. Of these, 534 were excluded as they could not be assigned to a specific recruitment method. The final study population included 7724 participants, of whom 803 (10.4%) were recruited by offline methods, 3985 (51.6%) by online methods, 2382 (30.8%) by online methods not initiated by us, and 554 (7.2%) by other methods. Overall, the average cost per participant was €6.22 for online methods initiated by us versus €9.06 for offline methods. Costs per participant ranged from €2.74 to €105.53 for online methods and from €0 to €67.50 for offline methods. Lowest average costs per participant were for those recruited from Netdoktor.dk (€2.99) and from Facebook (€3.44). Conclusions In our Internet-based cohort study, online recruitment methods were superior to offline methods in terms of efficiency (total number of participants enrolled). The average cost per recruited participant was also lower for online than for offline methods, although costs varied greatly among both online and offline recruitment methods. We observed a decrease in the efficiency of some online recruitment methods over time, suggesting that it may be optimal to adopt multiple online methods. PMID:28249833

  9. Antimicrobial susceptibility testing of Mycobacterium tuberculosis complex for first and second line drugs by broth dilution in a microtiter plate format.

    PubMed

    Hall, Leslie; Jude, Kurt P; Clark, Shirley L; Wengenack, Nancy L

    2011-06-24

    The rapid detection of antimicrobial resistance is important in the effort to control the increase in resistant Mycobacterium tuberculosis (Mtb). Antimicrobial susceptibility testing (AST) of Mtb has traditionally been performed by the agar method of proportion or by macrobroth testing on an instrument such as the BACTEC (Becton Dickinson, Sparks, MD), VersaTREK (TREK Diagnostics, Cleveland, OH) or BacT/ALERT (bioMérieux, Hazelwood, MO). The agar proportion method, while considered the "gold" standard of AST, is labor intensive and requires calculation of resistance by performing colony counts on drug-containing agar as compared to drug-free agar. If there is ≥1% growth on the drug-containing medium as compared to drug-free medium, the organism is considered resistant to that drug. The macrobroth methods require instrumentation and test break point ("critical") drug concentrations for the first line drugs (isoniazid, ethambutol, rifampin, and pyrazinamide). The method described here is commercially available in a 96 well microtiter plate format [MYCOTB (TREK Diagnostics)] and contains increasing concentrations of 12 antimicrobials used for treatment of tuberculosis including both first (isoniazid, rifampin, ethambutol) and second line drugs (amikacin, cycloserine, ethionamide, kanamycin, moxifloxacin, ofloxacin, para-aminosalicylic acid, rifabutin, and streptomycin). Pyrazinamide, a first line drug, is not included in the microtiter plate due to its need for acidic test conditions. Advantages of the microtiter system include both ease of set up and faster turn around time (14 days) compared with traditional agar proportion (21 days). In addition, the plate can be set up from inoculum prepared using either broth or solid medium. Since the microtiter plate format is new and since Mtb presents unique safety challenges in the laboratory, this protocol will describe how to safely setup, incubate and read the microtiter plate.

  10. Flow-gated radial phase-contrast imaging in the presence of weak flow.

    PubMed

    Peng, Hsu-Hsia; Huang, Teng-Yi; Wang, Fu-Nien; Chung, Hsiao-Wen

    2013-01-01

    To implement a flow-gating method to acquire phase-contrast (PC) images of carotid arteries without use of an electrocardiography (ECG) signal to synchronize the acquisition of imaging data with pulsatile arterial flow. The flow-gating method was realized through radial scanning and sophisticated post-processing methods including downsampling, complex difference, and correlation analysis to improve the evaluation of flow-gating times in radial phase-contrast scans. Quantitatively comparable results (R = 0.92-0.96, n = 9) of flow-related parameters, including mean velocity, mean flow rate, and flow volume, with conventional ECG-gated imaging demonstrated that the proposed method is highly feasible. The radial flow-gating PC imaging method is applicable in carotid arteries. The proposed flow-gating method can potentially avoid the setting up of ECG-related equipment for brain imaging. This technique has potential use in patients with arrhythmia or weak ECG signals.

  11. Comparison of some optimal control methods for the design of turbine blades

    NASA Technical Reports Server (NTRS)

    Desilva, B. M. E.; Grant, G. N. C.

    1977-01-01

    This paper attempts a comparative study of some numerical methods for the optimal control design of turbine blades whose vibration characteristics are approximated by Timoshenko beam idealizations with shear and incorporating simple boundary conditions. The blade was synthesized using the following methods: (1) conjugate gradient minimization of the system Hamiltonian in function space incorporating penalty function transformations, (2) projection operator methods in a function space which includes the frequencies of vibration and the control function, (3) epsilon-technique penalty function transformation resulting in a highly nonlinear programming problem, (4) finite difference discretization of the state equations again resulting in a nonlinear program, (5) second variation methods with complex state differential equations to include damping effects resulting in systems of inhomogeneous matrix Riccati equations some of which are stiff, (6) quasi-linear methods based on iterative linearization of the state and adjoint equations. The paper includes a discussion of some substantial computational difficulties encountered in the implementation of these techniques together with a resume of work presently in progress using a differential dynamic programming approach.

  12. Constrained CVT meshes and a comparison of triangular mesh generators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Hoa; Burkardt, John; Gunzburger, Max

    2009-01-01

    Mesh generation in regions in Euclidean space is a central task in computational science, and especially for commonly used numerical methods for the solution of partial differential equations, e.g., finite element and finite volume methods. We focus on the uniform Delaunay triangulation of planar regions and, in particular, on how one selects the positions of the vertices of the triangulation. We discuss a recently developed method, based on the centroidal Voronoi tessellation (CVT) concept, for effecting such triangulations and present two algorithms, including one new one, for CVT-based grid generation. We also compare several methods, including CVT-based methods, for triangulating planar domains. To this end, we define several quantitative measures of the quality of uniform grids. We then generate triangulations of several planar regions, including some having complexities that are representative of what one may encounter in practice. We subject the resulting grids to visual and quantitative comparisons and conclude that all the methods considered produce high-quality uniform grids and that the CVT-based grids are at least as good as any of the others.

  13. Linnorm: improved statistical analysis for single cell RNA-seq expression data.

    PubMed

    Yip, Shun H; Wang, Panwen; Kocher, Jean-Pierre A; Sham, Pak Chung; Wang, Junwen

    2017-12-15

    Linnorm is a novel normalization and transformation method for the analysis of single cell RNA sequencing (scRNA-seq) data. Linnorm is developed to remove technical noises and simultaneously preserve biological variations in scRNA-seq data, such that existing statistical methods can be improved. Using real scRNA-seq data, we compared Linnorm with existing normalization methods, including NODES, SAMstrt, SCnorm, scran, DESeq and TMM. Linnorm shows advantages in speed, technical noise removal and preservation of cell heterogeneity, which can improve existing methods in the discovery of novel subtypes, pseudo-temporal ordering of cells, clustering analysis, etc. Linnorm also performs better than existing DEG analysis methods, including BASiCS, NODES, SAMstrt, Seurat and DESeq2, in false positive rate control and accuracy. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. Frequency of data extraction errors and methods to increase data extraction quality: a methodological review.

    PubMed

    Mathes, Tim; Klaßen, Pauline; Pieper, Dawid

    2017-11-28

    Our objective was to assess the frequency of data extraction errors and its potential impact on results in systematic reviews. Furthermore, we evaluated the effect of different extraction methods, reviewer characteristics and reviewer training on error rates and results. We performed a systematic review of methodological literature in PubMed, Cochrane methodological registry, and by manual searches (12/2016). Studies were selected by two reviewers independently. Data were extracted in standardized tables by one reviewer and verified by a second. The analysis included six studies; four studies on extraction error frequency, one study comparing different reviewer extraction methods and two studies comparing different reviewer characteristics. We did not find a study on reviewer training. There was a high rate of extraction errors (up to 50%). Errors often had an influence on effect estimates. Different data extraction methods and reviewer characteristics had moderate effect on extraction error rates and effect estimates. The evidence base for established standards of data extraction seems weak despite the high prevalence of extraction errors. More comparative studies are needed to get deeper insights into the influence of different extraction methods.

  15. Rapid comparison of properties on protein surface

    PubMed Central

    Sael, Lee; La, David; Li, Bin; Rustamov, Raif; Kihara, Daisuke

    2008-01-01

    The mapping of physicochemical characteristics onto the surface of a protein provides crucial insights into its function and evolution. This information can be further used in the characterization and identification of similarities within protein surface regions. We propose a novel method which quantitatively compares global and local properties on the protein surface. We have tested the method on comparison of electrostatic potentials and hydrophobicity. The method is based on 3D Zernike descriptors, which provides a compact representation of a given property defined on a protein surface. Compactness and rotational invariance of this descriptor enable fast comparison suitable for database searches. The usefulness of this method is exemplified by studying several protein families including globins, thermophilic and mesophilic proteins, and active sites of TIM β/α barrel proteins. In all the cases studied, the descriptor is able to cluster proteins into functionally relevant groups. The proposed approach can also be easily extended to other surface properties. This protein surface-based approach will add a new way of viewing and comparing proteins to conventional methods, which compare proteins in terms of their primary sequence or tertiary structure. PMID:18618695
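    Once rotation-invariant descriptors have been computed, comparing and clustering surfaces reduces to distances between vectors, as in the sketch below. The descriptor values here are random placeholders (computing real 3D Zernike descriptors from a surface mesh is not shown), and the choice of average-linkage clustering is an assumption for illustration.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.cluster.hierarchy import linkage, fcluster

    # Random placeholders for precomputed 3D Zernike descriptors of a surface property
    # (e.g. 121 invariants of the electrostatic potential per protein).
    descriptors = np.random.default_rng(1).normal(size=(8, 121))

    # Because the descriptors are rotation invariant, a plain Euclidean distance
    # between descriptor vectors already compares the surface properties.
    distances = pdist(descriptors, metric="euclidean")

    # Group the proteins into (here, three) putative functional clusters.
    labels = fcluster(linkage(distances, method="average"), t=3, criterion="maxclust")
    print(labels)
    ```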

  16. Rapid comparison of properties on protein surface.

    PubMed

    Sael, Lee; La, David; Li, Bin; Rustamov, Raif; Kihara, Daisuke

    2008-10-01

    The mapping of physicochemical characteristics onto the surface of a protein provides crucial insights into its function and evolution. This information can be further used in the characterization and identification of similarities within protein surface regions. We propose a novel method which quantitatively compares global and local properties on the protein surface. We have tested the method on comparison of electrostatic potentials and hydrophobicity. The method is based on 3D Zernike descriptors, which provides a compact representation of a given property defined on a protein surface. Compactness and rotational invariance of this descriptor enable fast comparison suitable for database searches. The usefulness of this method is exemplified by studying several protein families including globins, thermophilic and mesophilic proteins, and active sites of TIM beta/alpha barrel proteins. In all the cases studied, the descriptor is able to cluster proteins into functionally relevant groups. The proposed approach can also be easily extended to other surface properties. This protein surface-based approach will add a new way of viewing and comparing proteins to conventional methods, which compare proteins in terms of their primary sequence or tertiary structure.

  17. Temporal similarity perfusion mapping: A standardized and model-free method for detecting perfusion deficits in stroke

    PubMed Central

    Song, Sunbin; Luby, Marie; Edwardson, Matthew A.; Brown, Tyler; Shah, Shreyansh; Cox, Robert W.; Saad, Ziad S.; Reynolds, Richard C.; Glen, Daniel R.; Cohen, Leonardo G.; Latour, Lawrence L.

    2017-01-01

    Introduction Interpretation of the extent of perfusion deficits in stroke MRI is highly dependent on the method used for analyzing the perfusion-weighted signal intensity time-series after gadolinium injection. In this study, we introduce a new model-free standardized method of temporal similarity perfusion (TSP) mapping for perfusion deficit detection and test its ability and reliability in acute ischemia. Materials and methods Forty patients with an ischemic stroke or transient ischemic attack were included. Two blinded readers compared real-time generated interactive maps and automatically generated TSP maps to traditional TTP/MTT maps for presence of perfusion deficits. Lesion volumes were compared for volumetric inter-rater reliability, spatial concordance between perfusion deficits and healthy tissue and contrast-to-noise ratio (CNR). Results Perfusion deficits were correctly detected in all patients with acute ischemia. Inter-rater reliability was higher for TSP when compared to TTP/MTT maps and there was a high similarity between the lesion volumes depicted on TSP and TTP/MTT (r(18) = 0.73). The Pearson's correlation between lesions calculated on TSP and traditional maps was high (r(18) = 0.73, p<0.0003), however the effective CNR was greater for TSP compared to TTP (352.3 vs 283.5, t(19) = 2.6, p<0.03.) and MTT (228.3, t(19) = 2.8, p<0.03). Discussion TSP maps provide a reliable and robust model-free method for accurate perfusion deficit detection and improve lesion delineation compared to traditional methods. This simple method is also computationally faster and more easily automated than model-based methods. This method can potentially improve the speed and accuracy in perfusion deficit detection for acute stroke treatment and clinical trial inclusion decision-making. PMID:28973000

  18. Theoretical study of the electric dipole moment function of the ClO molecule

    NASA Technical Reports Server (NTRS)

    Pettersson, L. G. M.; Langhoff, S. R.; Chong, D. P.

    1986-01-01

    The potential energy function and electric dipole moment function (EDMF) are computed for ClO X 2Pi using several different techniques to include electron correlation. The EDMF is used to compute Einstein coefficients, vibrational lifetimes, and dipole moments in higher vibrational levels. The band strength of the 1-0 fundamental transition is computed to be 12 ± 2 cm^-2 atm^-1, compared with the value determined from infrared heterodyne spectroscopy. The theoretical methods used include SCF, CASSCF, multireference singles plus doubles configuration interaction (MRCI) and contracted CI, coupled pair functional (CPF), and a modified version of the CPF method. The results obtained using the different methods are critically compared.

  19. Practical guide to understanding Comparative Effectiveness Research (CER).

    PubMed

    Neely, J Gail; Sharon, Jeffrey D; Graboyes, Evan M; Paniello, Randal C; Nussenbaum, Brian; Grindler, David J; Dassopoulos, Themistocles

    2013-12-01

    "Comparative effectiveness research" (CER) is not a new concept; however, recently it has been popularized as a method to develop scientifically sound actionable data by which patients, physicians, payers, and policymakers may make informed health care decisions. Fundamental to CER is that the comparative data are derived from large diverse populations of patients assembled from point-of-care general primary care practices and that measured outcomes include patient value judgments. The challenge is to obtain scientifically valid data to be acted upon by decision-making stakeholders with potentially quite diversely different agenda. The process requires very thoughtful research designs modulated by complex statistical and analytic methods. This article is composed of a guiding narrative with an extensive set of tables outlining many of the details required in performing and understanding CER. It ends with short discussions of three example papers, limitations of the method, and how a practicing physician may view such reports.

  20. PET Timing Performance Measurement Method Using NEMA NEC Phantom

    NASA Astrophysics Data System (ADS)

    Wang, Gin-Chung; Li, Xiaoli; Niu, Xiaofeng; Du, Huini; Balakrishnan, Karthik; Ye, Hongwei; Burr, Kent

    2016-06-01

    When comparing the performance of time-of-flight whole-body PET scanners, timing resolution is one important benchmark. Timing performance is heavily influenced by detector and electronics design. Even for the same scanner design, measured timing resolution is a function of many factors including the activity concentration, geometry and positioning of the radioactive source. Due to lack of measurement standards, the timing resolutions reported in the literature may not be directly comparable and may not describe the timing performance under clinically relevant conditions. In this work we introduce a method which makes use of the data acquired during the standard NEMA Noise-Equivalent-Count-Rate (NECR) measurements, and compare it to several other timing resolution measurement methods. The use of the NEMA NEC phantom, with well-defined dimensions and radioactivity distribution, is attractive because it has been widely accepted in the industry and allows for the characterization of timing resolution across a more relevant range of conditions.

  1. Comparison of 3D quantitative structure-activity relationship methods: Analysis of the in vitro antimalarial activity of 154 artemisinin analogues by hypothetical active-site lattice and comparative molecular field analysis

    NASA Astrophysics Data System (ADS)

    Woolfrey, John R.; Avery, Mitchell A.; Doweyko, Arthur M.

    1998-03-01

    Two three-dimensional quantitative structure-activity relationship (3D-QSAR) methods, comparative molecular field analysis (CoMFA) and hypothetical active site lattice (HASL), were compared with respect to the analysis of a training set of 154 artemisinin analogues. Five models were created, including a complete HASL and two trimmed versions, as well as two CoMFA models (leave-one-out standard CoMFA and the guided-region selection protocol). Similar r2 and q2 values were obtained by each method, although some striking differences existed between CoMFA contour maps and the HASL output. Each of the four predictive models exhibited a similar ability to predict the activity of a test set of 23 artemisinin analogues, although some differences were noted as to which compounds were described well by either model.

  2. Technology-based vs. traditional instruction. A comparison of two methods for teaching the skill of performing a 12-lead ECG.

    PubMed

    Jeffries, Pamela R; Woolf, Shirley; Linde, Beverly

    2003-01-01

    The purpose of this study was to compare the effectiveness of an interactive, multimedia CD-ROM with traditional methods of teaching the skill of performing a 12-lead ECG. A randomized pre/posttest experimental design was used. Seventy-seven baccalaureate nursing students in a required, senior-level critical-care course at a large midwestern university were recruited for the study. Two teaching methods were compared. The traditional method included a self-study module, a brief lecture and demonstration by an instructor, and hands-on experience using a plastic manikin and a real 12-lead ECG machine in the learning laboratory. The second method covered the same content using an interactive, multimedia CD-ROM embedded with virtual reality and supplemented with a self-study module. There were no significant (p < .05) baseline differences in pretest scores between the two groups and no significant differences by group in cognitive gains, student satisfaction with their learning method, or perception of self-efficacy in performing the skill. Overall results indicated that both groups were satisfied with their instructional method and were similar in their ability to demonstrate the skill correctly on a live, simulated patient. This evaluation study is a beginning step to assess new and potentially more cost-effective teaching methods and their effects on student learning outcomes and behaviors, including the transfer of skill acquisition via a computer simulation to a real patient.

  3. Long-Term Follow-Up on the Donor Foot After Thumb Reconstruction Using Big Toe Wrap-Around Flap in Two Different Operation Methods.

    PubMed

    Ma, Zhi-Guo; Guo, Yong-Jun; Yan, Hou-Jun; Li, Qi-Ming; Ma, Bin

    2017-02-01

    The function of the donor foot has been affected after using big toe wrap-around flap for thumb reconstruction. A modified operation method has been developed to reduce the adverse effect on the donor foot. The current study compared the long-term effect of the classic and the modified operation methods on the donor foot. Gait analysis was carried out, including how the patient walked, the walking speed and walking distance, and how the patient jumped and ran. Plantar pressure was measured while the patient was standing and moving. A total of 45 patients who received the 2 different operation methods were included. The follow-up time was 4-10 years with a mean of 6.5 years. Various degrees of complications occurred for the 21 patients who received the classic operation method. For these patients, plantar pressure of the donor foot was obviously different comparing with the healthy unaffected foot while the patient was standing or walking. For the 24 patients who received the modified operation method, no obvious complications were observed and the plantar pressure of the donor foot and the healthy unaffected foot was similar while the patient was standing or walking. In conclusion, both the classic and the modified operation methods have affected the function of the donor foot after using the big toe wrap-around flap for thumb reconstruction. However, the donor foot was less affected when the modified operation method was used.

  4. A Review on Human Activity Recognition Using Vision-Based Method.

    PubMed

    Zhang, Shugang; Wei, Zhiqiang; Nie, Jie; Huang, Lei; Wang, Shuang; Li, Zhen

    2017-01-01

    Human activity recognition (HAR) aims to recognize activities from a series of observations on the actions of subjects and the environmental conditions. The vision-based HAR research is the basis of many applications including video surveillance, health care, and human-computer interaction (HCI). This review highlights the advances of state-of-the-art activity recognition approaches, especially for the activity representation and classification methods. For the representation methods, we sort out a chronological research trajectory from global representations to local representations, and recent depth-based representations. For the classification methods, we conform to the categorization of template-based methods, discriminative models, and generative models and review several prevalent methods. Next, representative and available datasets are introduced. Aiming to provide an overview of those methods and a convenient way of comparing them, we classify existing literatures with a detailed taxonomy including representation and classification methods, as well as the datasets they used. Finally, we investigate the directions for future research.

  5. A Review on Human Activity Recognition Using Vision-Based Method

    PubMed Central

    Nie, Jie

    2017-01-01

    Human activity recognition (HAR) aims to recognize activities from a series of observations on the actions of subjects and the environmental conditions. The vision-based HAR research is the basis of many applications including video surveillance, health care, and human-computer interaction (HCI). This review highlights the advances of state-of-the-art activity recognition approaches, especially for the activity representation and classification methods. For the representation methods, we sort out a chronological research trajectory from global representations to local representations, and recent depth-based representations. For the classification methods, we conform to the categorization of template-based methods, discriminative models, and generative models and review several prevalent methods. Next, representative and available datasets are introduced. Aiming to provide an overview of those methods and a convenient way of comparing them, we classify existing literatures with a detailed taxonomy including representation and classification methods, as well as the datasets they used. Finally, we investigate the directions for future research. PMID:29065585

  6. Self-balanced modulation and magnetic rebalancing method for parallel multilevel inverters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Hui; Shi, Yanjun

    A self-balanced modulation method and a closed-loop magnetic flux rebalancing control method for parallel multilevel inverters. The combination of the two methods provides for balancing of the magnetic flux of the inter-cell transformers (ICTs) of the parallel multilevel inverters without deteriorating the quality of the output voltage. In various embodiments a parallel multi-level inverter modulator is provided, including a multi-channel comparator to generate a multiplexed digitized ideal waveform for a parallel multi-level inverter, and a finite state machine (FSM) module coupled to the parallel multi-channel comparator, the FSM module to receive the multiplexed digitized ideal waveform and to generate a pulse width modulated gate-drive signal for each switching device of the parallel multi-level inverter. The system and method provide for optimization of the output voltage spectrum without influencing the magnetic balancing.

  7. Development of sampling method and chromatographic analysis of volatile organic compounds emitted from human skin.

    PubMed

    Grabowska-Polanowska, Beata; Miarka, Przemysław; Skowron, Monika; Sułowicz, Joanna; Wojtyna, Katarzyna; Moskal, Karolina; Śliwka, Ireneusz

    2017-10-01

    Studies on volatile organic compounds emitted from the skin are of interest to chemists, biologists, and physicians due to their role in the development of different scientific areas, including medical diagnostics, forensic medicine, and perfume design. This paper presents two sampling methods applied to skin odor collection: the first uses a bag of cellulose film, the second uses cellulose sachets filled with active carbon. Volatile organic compounds were adsorbed on a carbon sorbent, removed via thermal desorption, and analyzed using a gas chromatograph with a mass spectrometer. The first sampling method allowed identification of more compounds (52) compared to the second one (30). Quantitative analyses for acetone, butanal, pentanal, and hexanal were done. The skin odor sampling method using a bag of cellulose film allowed the identification of many more compounds when compared with the method using a sachet filled with active carbon.

  8. Numerical simulation for the air entrainment of aerated flow with an improved multiphase SPH model

    NASA Astrophysics Data System (ADS)

    Wan, Hang; Li, Ran; Pu, Xunchi; Zhang, Hongwei; Feng, Jingjie

    2017-11-01

    Aerated flow is a complex hydraulic phenomenon that exists widely in the field of environmental hydraulics. It is generally characterised by large deformation and violent fragmentation of the free surface. Compared to Euler methods (the volume of fluid (VOF) method or the rigid-lid hypothesis method), the existing single-phase Smoothed Particle Hydrodynamics (SPH) method has performed well for solving particle motion. A lack of research on interphase interaction and air concentration, however, has limited the application of the SPH model. In our study, an improved multiphase SPH model is presented to simulate aerated flows. A drag force is included in the momentum equation to ensure the accuracy of the air particle slip velocity. Furthermore, a calculation method for air concentration is developed to analyse the air entrainment characteristics. Two case studies were used to simulate the hydraulic and air entrainment characteristics, and the simulation results agree well with the experimental results.

  9. Aircraft Engine Gas Path Diagnostic Methods: Public Benchmarking Results

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Borguet, Sebastien; Leonard, Olivier; Zhang, Xiaodong (Frank)

    2013-01-01

    Recent technology reviews have identified the need for objective assessments of aircraft engine health management (EHM) technologies. To help address this issue, a gas path diagnostic benchmark problem has been created and made publicly available. This software tool, referred to as the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), has been constructed based on feedback provided by the aircraft EHM community. It provides a standard benchmark problem enabling users to develop, evaluate and compare diagnostic methods. This paper will present an overview of ProDiMES along with a description of four gas path diagnostic methods developed and applied to the problem. These methods, which include analytical and empirical diagnostic techniques, will be described and associated blind-test-case metric results will be presented and compared. Lessons learned along with recommendations for improving the public benchmarking processes will also be presented and discussed.

  10. Behind the Final Grade in Hybrid v. Traditional Courses: Comparing Student Performance by Assessment Type, Core Competency, and Course Objective

    ERIC Educational Resources Information Center

    Bain, Lisa Z.

    2012-01-01

    There are many different delivery methods used by institutions of higher education. These include traditional, hybrid, and online course offerings. The comparisons of these typically use final grade as the measure of student performance. This research study looks behind the final grade and compares student performance by assessment type, core…

  11. A comparison and update of direct kinematic-kinetic models of leg stiffness in human running.

    PubMed

    Liew, Bernard X W; Morris, Susan; Masters, Ashleigh; Netto, Kevin

    2017-11-07

    Direct kinematic-kinetic modelling currently represents the "Gold-standard" in leg stiffness quantification during three-dimensional (3D) motion capture experiments. However, the medial-lateral components of ground reaction force and leg length have been neglected in current leg stiffness formulations. It is unknown if accounting for all 3D would alter healthy biologic estimates of leg stiffness, compared to present direct modelling methods. This study compared running leg stiffness derived from a new method (multiplanar method) which includes all three Cartesian axes, against current methods which either only include the vertical axis (line method) or only the plane of progression (uniplanar method). Twenty healthy female runners performed shod overground running at 5.0 m/s. Three-dimensional motion capture and synchronised in-ground force plates were used to track the change in length of the leg vector (hip joint centre to centre of pressure) and resultant projected ground reaction force. Leg stiffness was expressed as dimensionless units, as a percentage of an individual's bodyweight divided by standing leg length (BW/LL). Leg stiffness using the line method was larger than the uniplanar method by 15.6%BW/LL (P < .001), and multiplanar method by 24.2%BW/LL (P < .001). Leg stiffness from the uniplanar method was larger than the multiplanar method by 8.5%BW/LL (6.5 kN/m) (P < .001). The inclusion of medial-lateral components significantly increased leg deformation magnitude, accounting for the reduction in leg stiffness estimate with the multiplanar method. Given that limb movements typically occur in 3D, the new multiplanar method provides the most complete accounting of all force and length components in leg stiffness calculation. Copyright © 2017 Elsevier Ltd. All rights reserved.
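    A possible reading of the multiplanar calculation is sketched below. Using the leg length at initial contact as the reference length and projecting the ground reaction force onto the instantaneous leg axis are assumptions made for illustration, not necessarily the authors' exact formulation.

    ```python
    import numpy as np

    def leg_stiffness_multiplanar(hip_xyz, cop_xyz, grf_xyz, bodyweight_n, leg_length_m):
        """Leg stiffness including all three Cartesian axes, expressed in %BW/LL.
        hip_xyz, cop_xyz and grf_xyz are (n_samples, 3) arrays over the stance phase."""
        leg_vec = hip_xyz - cop_xyz                           # hip joint centre to centre of pressure
        leg_len = np.linalg.norm(leg_vec, axis=1)             # 3D leg length, incl. medial-lateral part
        unit_leg = leg_vec / leg_len[:, None]
        axial_force = np.sum(grf_xyz * unit_leg, axis=1)      # GRF projected onto the leg axis
        compression = max(leg_len[0] - leg_len.min(), 1e-9)   # assumed reference: initial contact
        stiffness = axial_force.max() / compression           # N/m
        return 100.0 * stiffness * leg_length_m / bodyweight_n  # dimensionless %BW/LL
    ```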

  12. 120: THE CLINICAL EFFECTIVENESS AND COST-EFFECTIVENESS OF FRACTIONAL CO2 LASER IN ACNE SCARS AND SKIN REJUVENATION: A SYSTEMATIC REVIEW AND ECONOMIC EVALUATION

    PubMed Central

    Yaaghoobian, Barmak; Sadeghi-Ghyassi, Fatemeh; Hajebrahimi, Sakineh

    2017-01-01

    Background and aims Skin rejuvenation is one of the cosmetic interventions in high demand in Iran. The fractional CO2 laser is a high-power ablative laser with a variety of uses in medicine, including the treatment of acne scars and rejuvenation. The aim of this study was to evaluate the safety, efficacy, and cost-effectiveness of the fractional CO2 laser in comparison with other methods of rejuvenation and acne scar treatment. Methods A systematic database search including Medline (via OVID and PubMed), EMBASE, CINAHL, Cochrane Library, CRD, SCOPUS, and Web of Science was conducted. After screening the search results, selected publications were appraised with CASP and the Cochrane Collaboration's tool for assessing risk of bias, and eligible studies were included in the systematic review. In the economic evaluation, all costs and benefits were analyzed from the perspective of the Iran Ministry of Health. Results From 2667 publications, two randomized controlled trials were eligible and included in the study. The effectiveness and complications of the fractional CO2 laser were comparable with Er:YAG, but the fractional CO2 laser was 14.7% (P=0.01) more effective than the Q-switched Nd:YAG laser. The cost-effectiveness of this method was the same as that of other alternative lasers. Conclusions The fractional CO2 laser is an effective and safe method for treating several kinds of skin conditions. Nevertheless, there was not sufficient evidence to support its advantage. This device has an equal or lower price in comparison to competing technologies, except for the non-fractional ablative CO2 laser, which has the same or lower price and comparable effects.

  13. Comparative study of solar optics for paraboloidal concentrators

    NASA Technical Reports Server (NTRS)

    Wen, L.; Poon, P.; Carley, W.; Huang, L.

    1979-01-01

    Different analytical methods for computing the flux distribution on the focal plane of a paraboloidal solar concentrator are reviewed. An analytical solution in algebraic form is also derived for an idealized model. The effects resulting from using different assumptions in the definition of optical parameters used in these methodologies are compared and discussed in detail. These parameters include solar irradiance distribution (limb darkening and circumsolar), reflector surface specular spreading, surface slope error, and concentrator pointing inaccuracy. The type of computational method selected for use depends on the maturity of the design and the data available at the time the analysis is made.

  14. Comparative study of minutiae selection algorithms for ISO fingerprint templates

    NASA Astrophysics Data System (ADS)

    Vibert, B.; Charrier, C.; Le Bars, J.-M.; Rosenberger, C.

    2015-03-01

    We address the selection of fingerprint minutiae given a fingerprint ISO template. Minutiae selection plays a very important role when a secure element (i.e. a smart-card) is used. Because of the limited computation and memory capability, the number of minutiae of a stored reference in the secure element is limited. We propose in this paper a comparative study of 6 minutiae selection methods, including 2 methods from the literature and 1 used as a reference (No Selection). Experimental results on 3 fingerprint databases from the Fingerprint Verification Competition show their relative efficiency in terms of performance and computation time.
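    As one plausible selection heuristic (not necessarily among the six evaluated in the paper), a truncated reference can keep the minutiae closest to the template barycentre, as sketched below. The (x, y, angle, quality) layout is assumed to have been decoded from an ISO/IEC 19794-2 record beforehand.

    ```python
    import numpy as np

    def select_minutiae(minutiae, n_keep=20):
        """Keep the n_keep minutiae closest to the template barycentre.
        `minutiae` is an (m, 4) array of (x, y, angle, quality) records."""
        pts = minutiae[:, :2]
        order = np.argsort(np.linalg.norm(pts - pts.mean(axis=0), axis=1))
        return minutiae[order[:n_keep]]

    # Toy usage with random placeholder minutiae.
    template = np.random.default_rng(0).uniform(0, 256, size=(45, 4))
    reference = select_minutiae(template, n_keep=20)
    ```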

  15. Text Summarization Model based on Maximum Coverage Problem and its Variant

    NASA Astrophysics Data System (ADS)

    Takamura, Hiroya; Okumura, Manabu

    We discuss text summarization in terms of the maximum coverage problem and its variant. To solve the optimization problem, we applied several decoding algorithms, including ones never before used in this summarization formulation, such as a greedy algorithm with a performance guarantee, a randomized algorithm, and a branch-and-bound method. We conducted comparative experiments. On the basis of the experimental results, we also augmented the summarization model so that it takes into account the relevance to the document cluster. Through experiments, we showed that the augmented model is at least comparable to the best-performing method of DUC'04.
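    A sketch of the greedy decoding strategy for a budgeted maximum-coverage objective is given below. Representing concepts as word sets and measuring cost in characters are simplifying assumptions (the strict approximation guarantee also requires comparing the greedy result against the best single sentence, which is omitted here).

    ```python
    def greedy_summary(sentences, concept_sets, length_limit):
        """Budgeted maximum-coverage summarisation by greedy selection: repeatedly pick
        the sentence with the best ratio of newly covered concepts to character cost."""
        covered, chosen, used = set(), [], 0
        remaining = set(range(len(sentences)))

        def gain(i):
            return len(concept_sets[i] - covered) / max(len(sentences[i]), 1)

        while remaining:
            best = max(remaining, key=gain)
            if gain(best) == 0:
                break
            if used + len(sentences[best]) <= length_limit:
                chosen.append(best)
                covered |= concept_sets[best]
                used += len(sentences[best])
            remaining.discard(best)
        return [sentences[i] for i in sorted(chosen)]

    # Toy usage: concepts are the word sets of each sentence, the budget is in characters.
    docs = ["the cat sat on the mat", "dogs chase cats", "the mat was red"]
    print(greedy_summary(docs, [set(s.split()) for s in docs], length_limit=40))
    ```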

  16. Titan Density Reconstruction Using Radiometric and Cassini Attitude Control Flight Data

    NASA Technical Reports Server (NTRS)

    Andrade, Luis G., Jr.; Burk, Thomas A.

    2015-01-01

    This paper compares three different methods of Titan atmospheric density reconstruction for the Titan 87 Cassini flyby. T87 was a unique flyby that provided independent Doppler radiometric measurements on the ground throughout the flyby including at Titan closest approach. At the same time, the onboard accelerometer provided an independent estimate of atmospheric drag force and density during the flyby. These results are compared with the normal method of reconstructing atmospheric density using thruster on-time and angular momentum accumulation. Differences between the estimates are analyzed and a possible explanation for the differences is evaluated.
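    The accelerometer-based reconstruction rests on the standard drag relation, which can be sketched as below. The spacecraft mass, drag coefficient, reference area, and flyby speed in the example are round illustrative numbers, not Cassini or T87 values.

    ```python
    def density_from_drag(drag_accel, speed, drag_coeff, ref_area_m2, mass_kg):
        """Atmospheric density from measured drag deceleration:
        a_drag = 0.5 * rho * v^2 * Cd * A / m  =>  rho = 2 * m * a_drag / (Cd * A * v^2)."""
        return 2.0 * mass_kg * drag_accel / (drag_coeff * ref_area_m2 * speed ** 2)

    # Round illustrative numbers (not Cassini/T87 values); result is in kg/m^3.
    print(density_from_drag(drag_accel=2.0e-4, speed=6.0e3, drag_coeff=2.1,
                            ref_area_m2=20.0, mass_kg=2500.0))
    ```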

  17. Comparison and feasibility of North American methods for harvesting small trees and residues for energy

    Treesearch

    Bruce R. Hartsough; Bryce J. Stokes

    1990-01-01

    A database of North American harvesting systems was developed. Parameters for each system included site, material and product characteristics, equipment mix and production rate. Onto-truck and delivered costs per green tonne, and breakeven oil prices were developed using standard costing methods. Systems costs were compared over the ranges of piece size, volume per...

  18. Method and apparatus for extraction of low-frequency artifacts from brain waves for alertness detection

    DOEpatents

    Clapp, N.E.; Hively, L.M.

    1997-05-06

    Methods and apparatus automatically detect alertness in humans by monitoring and analyzing brain wave signals. Steps include: acquiring the brain wave (EEG or MEG) data from the subject, digitizing the data, separating artifact data from raw data, and comparing trends in f-data to alertness indicators, providing notification of inadequate alertness. 4 figs.
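    The patent summary does not specify how the low-frequency component is separated, but one standard way to isolate it is a zero-phase low-pass filter, as in the hedged sketch below; the 1 Hz cutoff, filter order, and sampling rate are assumptions.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def split_low_freq(signal, fs_hz, cutoff_hz=1.0, order=4):
        """Separate a digitised brain-wave channel into a low-frequency component
        (zero-phase Butterworth low-pass) and the residual signal."""
        b, a = butter(order, cutoff_hz / (fs_hz / 2.0), btype="low")
        low = filtfilt(b, a, signal)
        return low, signal - low

    # Synthetic example: a slow 0.3 Hz drift plus noise, sampled at an assumed 250 Hz.
    fs = 250.0
    t = np.arange(0.0, 10.0, 1.0 / fs)
    eeg = 0.5 * np.sin(2 * np.pi * 0.3 * t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
    low_freq, residual = split_low_freq(eeg, fs)
    ```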

  19. Double row equivalent for rotator cuff repair: A biomechanical analysis of a new technique.

    PubMed

    Robinson, Sean; Krigbaum, Henry; Kramer, Jon; Purviance, Connor; Parrish, Robin; Donahue, Joseph

    2018-06-01

    There are numerous configurations of double row fixation for rotator cuff tears; however, there is no consensus on the best method. In this study, we evaluated three different double-row configurations, including a new method. Our primary question is whether the new anchor and technique compare in biomechanical strength to standard double row techniques. Eighteen prepared fresh frozen bovine infraspinatus tendons were randomized to one of three groups: the New Double Row Equivalent, the Arthrex Speedbridge, and a transosseous equivalent using standard Stabilynx anchors. Biomechanical testing was performed on humeri sawbones, and ultimate load, strain, yield strength, contact area, contact pressure, and survival plots were evaluated. The new double row equivalent method demonstrated increased survival as well as ultimate strength at 415 N compared to the remaining test groups, as well as contact area and pressure equivalent to standard double row techniques. This new anchor system and technique demonstrated higher survival rates and loads to failure than standard double row techniques. These data provide a new method of rotator cuff fixation which should be further evaluated in the clinical setting. Basic science biomechanical study.

  20. Prosthodontics an “arsenal” in forensic dentistry

    PubMed Central

    Bathala, Lakshmana Rao; Rachuri, Narendra Kumar; Rayapati, Srinivas Rao; Kondaka, Sudheer

    2016-01-01

    After major disasters such as earthquakes, fires, floods, tsunamis, bomb blasts, or terrorist attacks, accurate and early identification of the dead and injured is of utmost importance. Restorations, carious teeth, missing teeth, and/or prostheses are the most useful aids for dental identification. At times, the only identifiable remains are a victim's partial or complete dentures. The central principle of dental identification is that postmortem dental remains can be compared with antemortem dental records, which include study casts, radiographs, etc., to confirm the identity of the victims. Marking/labeling dentures has been considered an important aid in forensic dentistry. Apart from fingerprinting, compared with all other methods, the marking/labeling of dentures is an accurate and rapid way to identify unknown victims. There are no standardized methods to follow, but dental practitioners need to maintain dental records of their patients, which may include documentation of the "marking of dentures." Preparedness is the key to success in mass disaster identification. The aim of this review article is to discuss methods of denture identification, the advantages of denture labeling for rapid identification during major disasters/accidents, and the importance of maintaining patient records. PMID:28123274

  1. An Evaluation of Feature Learning Methods for High Resolution Image Classification

    NASA Astrophysics Data System (ADS)

    Tokarczyk, P.; Montoya, J.; Schindler, K.

    2012-07-01

    Automatic image classification is one of the fundamental problems of remote sensing research. The classification problem is even more challenging in high-resolution images of urban areas, where the objects are small and heterogeneous. Two questions arise, namely which features to extract from the raw sensor data to capture the local radiometry and image structure at each pixel or segment, and which classification method to apply to the feature vectors. While classifiers are nowadays well understood, selecting the right features remains a largely empirical process. Here we concentrate on the features. Several methods are evaluated which allow one to learn suitable features from unlabelled image data by analysing the image statistics. In a comparative study, we evaluate unsupervised feature learning with different linear and non-linear learning methods, including principal component analysis (PCA) and deep belief networks (DBN). We also compare these automatically learned features with popular choices of ad-hoc features including raw intensity values, standard combinations like the NDVI, a few PCA channels, and texture filters. The comparison is done in a unified framework using the same images, the target classes, reference data and a Random Forest classifier.
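
    A minimal sketch of the comparison pipeline described above is given below, assuming scikit-learn and treating PCA-projected pixel patches as the learned features fed to a Random Forest; the patch size and synthetic data are placeholders, not the study's actual remote-sensing setup, and the deep belief network branch is not reproduced.

    ```python
    # Unsupervised PCA feature learning on image patches + Random Forest classification
    # (illustrative sketch with synthetic data standing in for urban imagery).
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    patches = rng.random((1000, 5 * 5))        # 1000 flattened 5x5 patches (placeholder data)
    labels = rng.integers(0, 3, size=1000)     # 3 land-cover classes (placeholder labels)

    pca = PCA(n_components=10).fit(patches)    # learn features from unlabelled image statistics
    features = pca.transform(patches)

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(features, labels)
    print("training accuracy:", clf.score(features, labels))
    ```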

  2. Analysis of Longitudinal Studies With Repeated Outcome Measures: Adjusting for Time-Dependent Confounding Using Conventional Methods.

    PubMed

    Keogh, Ruth H; Daniel, Rhian M; VanderWeele, Tyler J; Vansteelandt, Stijn

    2018-05-01

    Estimation of causal effects of time-varying exposures using longitudinal data is a common problem in epidemiology. When there are time-varying confounders, which may include past outcomes, affected by prior exposure, standard regression methods can lead to bias. Methods such as inverse probability weighted estimation of marginal structural models have been developed to address this problem. However, in this paper we show how standard regression methods can be used, even in the presence of time-dependent confounding, to estimate the total effect of an exposure on a subsequent outcome by controlling appropriately for prior exposures, outcomes, and time-varying covariates. We refer to the resulting estimation approach as sequential conditional mean models (SCMMs), which can be fitted using generalized estimating equations. We outline this approach and describe how including propensity score adjustment is advantageous. We compare the causal effects being estimated using SCMMs and marginal structural models, and we compare the two approaches using simulations. SCMMs enable more precise inferences, with greater robustness against model misspecification via propensity score adjustment, and easily accommodate continuous exposures and interactions. A new test for direct effects of past exposures on a subsequent outcome is described.
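
    A hedged sketch of fitting a sequential conditional mean model with generalized estimating equations follows, assuming the statsmodels GEE interface and illustrative variable names (outcome regressed on current exposure, prior exposure, prior outcome, and a time-varying covariate, clustered by subject); the actual covariate set, propensity score adjustment, and working correlation are study-specific.

    ```python
    # SCMM-style regression via GEE: adjust for prior exposure, prior outcome and a
    # time-varying covariate (illustrative variable names, synthetic data).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n, t = 200, 4
    df = pd.DataFrame({
        "id": np.repeat(np.arange(n), t),
        "exposure": rng.integers(0, 2, n * t),
        "prior_exposure": rng.integers(0, 2, n * t),
        "prior_outcome": rng.normal(size=n * t),
        "covariate": rng.normal(size=n * t),
    })
    df["y"] = 0.5 * df.exposure + 0.3 * df.prior_outcome + rng.normal(size=n * t)

    model = smf.gee("y ~ exposure + prior_exposure + prior_outcome + covariate",
                    groups="id", data=df,
                    cov_struct=sm.cov_struct.Independence(),
                    family=sm.families.Gaussian())
    print(model.fit().summary())
    ```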

  3. Massage, reflexology and other manual methods for pain management in labour.

    PubMed

    Smith, Caroline A; Levett, Kate M; Collins, Carmel T; Jones, Leanne

    2012-02-15

    Many women would like to avoid pharmacological or invasive methods of pain management in labour, and this may contribute towards the popularity of complementary methods of pain management. This review examined currently available evidence supporting the use of manual healing methods including massage and reflexology for pain management in labour. To examine the effects of manual healing methods including massage and reflexology for pain management in labour on maternal and perinatal morbidity. We searched the Cochrane Pregnancy and Childbirth Group's Trials Register (30 June 2011), the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library 2011, Issue 2 of 4), MEDLINE (1966 to 30 June 2011), CINAHL (1980 to 30 June 2011), the Australian and New Zealand Clinical Trial Registry (30 June 2011), Chinese Clinical Trial Register (30 June 2011), Current Controlled Trials (30 June 2011), ClinicalTrials.gov (30 June 2011), ISRCTN Register (30 June 2011), National Centre for Complementary and Alternative Medicine (NCCAM) (30 June 2011) and the WHO International Clinical Trials Registry Platform (30 June 2011). Randomised controlled trials comparing manual healing methods with standard care, no treatment, other non-pharmacological forms of pain management in labour or placebo. Two authors independently assessed trial quality and extracted data. We attempted to contact study authors for additional information. We included six trials, with data reporting on five trials and 326 women in the meta-analysis. We found trials for massage only. Less pain during labour was reported from massage compared with usual care during the first stage of labour (standardised mean difference (SMD) -0.82, 95% confidence interval (CI) -1.17 to -0.47; four trials, 225 women), and labour pain was reduced in one trial of massage compared with music (risk ratio (RR) 0.40, 95% CI 0.18 to 0.89, 101 women). One trial of massage compared with usual care found reduced anxiety during the first stage of labour (MD -16.27, 95% CI -27.03 to -5.51, 60 women). No trial was assessed as being at a low risk of bias for all quality domains. Massage may have a role in reducing pain and improving women's emotional experience of labour. However, there is a need for further research.

  4. DISTANCES TO DARK CLOUDS: COMPARING EXTINCTION DISTANCES TO MASER PARALLAX DISTANCES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, Jonathan B.; Jackson, James M.; Stead, Joseph J.

    We test two different methods of using near-infrared extinction to estimate distances to dark clouds in the first quadrant of the Galaxy using large near-infrared (Two Micron All Sky Survey and UKIRT Infrared Deep Sky Survey) surveys. Very long baseline interferometry parallax measurements of masers around massive young stars provide the most direct and bias-free measurement of the distance to these dark clouds. We compare the extinction distance estimates to these maser parallax distances. We also compare these distances to kinematic distances, including recent re-calibrations of the Galactic rotation curve. The extinction distance methods agree with the maser parallax distances (within the errors) between 66% and 100% of the time (depending on method and input survey) and between 85% and 100% of the time outside of the crowded Galactic center. Although the sample size is small, extinction distance methods reproduce maser parallax distances better than kinematic distances; furthermore, extinction distance methods do not suffer from the kinematic distance ambiguity. This validation gives us confidence that these extinction methods may be extended to additional dark clouds where maser parallaxes are not available.

  5. MR/PET quantification tools: Registration, segmentation, classification, and MR-based attenuation correction

    PubMed Central

    Fei, Baowei; Yang, Xiaofeng; Nye, Jonathon A.; Aarsvold, John N.; Raghunath, Nivedita; Cervo, Morgan; Stark, Rebecca; Meltzer, Carolyn C.; Votaw, John R.

    2012-01-01

    Purpose: Combined MR/PET is a relatively new, hybrid imaging modality. A human MR/PET prototype system consisting of a Siemens 3T Trio MR and brain PET insert was installed and tested at our institution. Its present design does not offer measured attenuation correction (AC) using traditional transmission imaging. This study is the development of quantification tools including MR-based AC for quantification in combined MR/PET for brain imaging. Methods: The developed quantification tools include image registration, segmentation, classification, and MR-based AC. These components were integrated into a single scheme for processing MR/PET data. The segmentation method is multiscale and based on the Radon transform of brain MR images. It was developed to segment the skull on T1-weighted MR images. A modified fuzzy C-means classification scheme was developed to classify brain tissue into gray matter, white matter, and cerebrospinal fluid. Classified tissue is assigned an attenuation coefficient so that AC factors can be generated. PET emission data are then reconstructed using a three-dimensional ordered sets expectation maximization method with the MR-based AC map. Ten subjects had separate MR and PET scans. The PET with [11C]PIB was acquired using a high-resolution research tomography (HRRT) PET. MR-based AC was compared with transmission (TX)-based AC on the HRRT. Seventeen volumes of interest were drawn manually on each subject image to compare the PET activities between the MR-based and TX-based AC methods. Results: For skull segmentation, the overlap ratio between our segmented results and the ground truth is 85.2 ± 2.6%. Attenuation correction results from the ten subjects show that the difference between the MR and TX-based methods was <6.5%. Conclusions: MR-based AC compared favorably with conventional transmission-based AC. Quantitative tools including registration, segmentation, classification, and MR-based AC have been developed for use in combined MR/PET. PMID:23039679

  6. Comparing exposure metrics for classifying ‘dangerous heat’ in heat wave and health warning systems

    PubMed Central

    Zhang, Kai; Rood, Richard B.; Michailidis, George; Oswald, Evan M.; Schwartz, Joel D.; Zanobetti, Antonella; Ebi, Kristie L.; O’Neill, Marie S.

    2012-01-01

    Heat waves have been linked to excess mortality and morbidity, and are projected to increase in frequency and intensity with a warming climate. This study compares exposure metrics to trigger heat wave and health warning systems (HHWS), and introduces a novel multi-level hybrid clustering method to identify potential dangerously hot days. Two-level and three-level hybrid clustering analysis as well as common indices used to trigger HHWS, including spatial synoptic classification (SSC); and 90th, 95th, and 99th percentiles of minimum and relative minimum temperature (using a 10 day reference period), were calculated using a summertime weather dataset in Detroit from 1976 to 2006. The days classified as ‘hot’ with hybrid clustering analysis, SSC, minimum and relative minimum temperature methods differed by method type. SSC tended to include the days with, on average, 2.6 °C lower daily minimum temperature and 5.3 °C lower dew point than days identified by other methods. These metrics were evaluated by comparing their performance in predicting excess daily mortality. The 99th percentile of minimum temperature was generally the most predictive, followed by the three-level hybrid clustering method, the 95th percentile of minimum temperature, SSC and others. Our proposed clustering framework has more flexibility and requires less substantial meteorological prior information than the synoptic classification methods. Comparison of these metrics in predicting excess daily mortality suggests that metrics thought to better characterize physiological heat stress by considering several weather conditions simultaneously may not be the same metrics that are better at predicting heat-related mortality, which has significant implications in HHWSs. PMID:22673187
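
    As a small illustration of the simpler percentile-based triggers compared above, the sketch below flags days whose minimum temperature exceeds the 95th or 99th percentile of the summertime record; the synthetic series and thresholds are placeholders, and the multi-level hybrid clustering and SSC methods are not reproduced here.

    ```python
    # Flag potential 'dangerously hot' days using percentile thresholds of daily minimum
    # temperature (illustrative; the study also used relative thresholds, SSC, and clustering).
    import numpy as np

    rng = np.random.default_rng(2)
    tmin = rng.normal(loc=20.0, scale=4.0, size=31 * 3)   # placeholder summer Tmin series (deg C)

    p95, p99 = np.percentile(tmin, [95, 99])
    hot95 = tmin > p95
    hot99 = tmin > p99
    print(f"95th pct = {p95:.1f} C, days flagged: {hot95.sum()}")
    print(f"99th pct = {p99:.1f} C, days flagged: {hot99.sum()}")
    ```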

  7. Comparison of Classification Methods for Detecting Emotion from Mandarin Speech

    NASA Astrophysics Data System (ADS)

    Pao, Tsang-Long; Chen, Yu-Te; Yeh, Jun-Heng

    It is said that technology arises from humanity. What is humanity? The very definition of humanity is emotion. Emotion is the basis for all human expression and the underlying theme behind everything that is done, said, thought or imagined. If computers can perceive and respond to human emotion, human-computer interaction will become more natural. Several classifiers are adopted for automatically assigning an emotion category, such as anger, happiness or sadness, to a speech utterance. These classifiers were designed independently and tested on various emotional speech corpora, making it difficult to compare and evaluate their performance. In this paper, we first compared several popular classification methods and evaluated their performance by applying them to a Mandarin speech corpus consisting of five basic emotions, including anger, happiness, boredom, sadness and neutral. The extracted feature streams contain MFCC, LPCC, and LPC. The experimental results show that the proposed WD-MKNN classifier achieves an accuracy of 81.4% for the 5-class emotion recognition and outperforms other classification techniques, including KNN, MKNN, DW-KNN, LDA, QDA, GMM, HMM, SVM, and BPNN. Then, to verify the advantage of the proposed method, we compared these classifiers by applying them to another Mandarin expressive speech corpus consisting of two emotions. The experimental results still show that the proposed WD-MKNN outperforms others.
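
    For orientation only, a plain k-nearest-neighbour baseline on MFCC-style feature vectors is sketched below (assuming scikit-learn and synthetic features); the paper's WD-MKNN weighting scheme and the actual Mandarin corpora are not reproduced.

    ```python
    # Baseline KNN emotion classifier on MFCC-like feature vectors (illustrative sketch).
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(3)
    X = rng.normal(size=(500, 13))           # 13-dim MFCC-like vectors (placeholder data)
    y = rng.integers(0, 5, size=500)         # 5 emotion classes (placeholder labels)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    knn = KNeighborsClassifier(n_neighbors=5, weights="distance").fit(X_tr, y_tr)
    print("test accuracy:", knn.score(X_te, y_te))
    ```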

  8. Reinforcement learning algorithms for robotic navigation in dynamic environments.

    PubMed

    Yen, Gary G; Hickey, Travis W

    2004-04-01

    The purpose of this study was to examine improvements to reinforcement learning (RL) algorithms in order to successfully interact within dynamic environments. The scope of the research was that of RL algorithms as applied to robotic navigation. Proposed improvements include: addition of a forgetting mechanism, use of feature based state inputs, and hierarchical structuring of an RL agent. Simulations were performed to evaluate the individual merits and flaws of each proposal, to compare proposed methods to prior established methods, and to compare proposed methods to theoretically optimal solutions. Incorporation of a forgetting mechanism did considerably improve the learning times of RL agents in a dynamic environment. However, direct implementation of a feature-based RL agent did not result in any performance enhancements, as pure feature-based navigation results in a lack of positional awareness, and the inability of the agent to determine the location of the goal state. Inclusion of a hierarchical structure in an RL agent resulted in significantly improved performance, specifically when one layer of the hierarchy included a feature-based agent for obstacle avoidance, and a standard RL agent for global navigation. In summary, the inclusion of a forgetting mechanism, and the use of a hierarchically structured RL agent offer substantially increased performance when compared to traditional RL agents navigating in a dynamic environment.
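
    A minimal sketch of the "forgetting mechanism" idea in tabular Q-learning follows, assuming a simple decay of Q-values toward zero on each step so that stale estimates fade in a changing environment; the decay rate, toy environment, and reward are placeholders, not the study's robot-navigation setup or its hierarchical agent.

    ```python
    # Tabular Q-learning with a forgetting (decay) mechanism for dynamic environments
    # (illustrative sketch on a toy 1-D chain task).
    import numpy as np

    n_states, n_actions = 10, 2
    alpha, gamma, eps, forget = 0.1, 0.95, 0.1, 0.001
    Q = np.zeros((n_states, n_actions))
    rng = np.random.default_rng(4)

    state = 0
    for step in range(5000):
        action = rng.integers(n_actions) if rng.random() < eps else int(Q[state].argmax())
        next_state = min(state + 1, n_states - 1) if action == 1 else max(state - 1, 0)
        reward = 1.0 if next_state == n_states - 1 else 0.0
        # Standard Q-learning update
        Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
        # Forgetting: decay all estimates slightly so outdated knowledge fades
        Q *= (1.0 - forget)
        state = 0 if reward > 0 else next_state

    print("Greedy policy:", Q.argmax(axis=1))
    ```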

  9. Smoking Cessation Awareness and Utilization Among Lesbian, Gay, Bisexual, and Transgender Adults: An Analysis of the 2009–2010 National Adult Tobacco Survey

    PubMed Central

    Lee, Youn Ok; Bennett, Keisa; Goodin, Amie

    2016-01-01

    Introduction: Each year, there are more than 480 000 deaths in the United States attributed to smoking. Lesbian, gay, bisexual and transgender (LGBT) adults are a vulnerable population that smokes at higher rates than heterosexuals. Methods: We used data collected from the National Adult Tobacco Survey 2009–2010, a large, nationally representative study using a randomized, national sample of US landline and cellular telephone listings (N = 118 590). We compared LGBT adults to their heterosexual counterparts with regard to exposure to advertisements promoting smoking cessation, and awareness and use of tobacco treatment services, including quitlines, smoking cessation classes, health professional counseling, nicotine replacement therapy, and medications. Results: Fewer GBT men, compared to heterosexual men, were aware of the quitline. However, LGBT individuals have similar exposure to tobacco cessation advertising, as well as similar awareness of and use of evidence-based cessation methods, compared to heterosexual peers. Conclusions: The similarity of awareness and use of cessation support indicates a need for LGBT-specific efforts to reduce smoking disparities. Potential interventions would include: improving awareness of, access to and acceptability of current cessation methods for LGBT patients, developing tailored cessation interventions, and denormalizing smoking in LGBT community spaces. PMID:26014455

  10. Comparing 3D foot scanning with conventional measurement methods.

    PubMed

    Lee, Yu-Chi; Lin, Gloria; Wang, Mao-Jiun J

    2014-01-01

    Foot dimension information on different user groups is important for footwear design and clinical applications. Foot dimension data collected using different measurement methods presents accuracy problems. This study compared the precision and accuracy of the 3D foot scanning method with conventional foot dimension measurement methods including the digital caliper, ink footprint and digital footprint. Six commonly used foot dimensions, i.e. foot length, ball of foot length, outside ball of foot length, foot breadth diagonal, foot breadth horizontal and heel breadth were measured from 130 males and females using four foot measurement methods. Two-way ANOVA was performed to evaluate the sex and method effect on the measured foot dimensions. In addition, the mean absolute difference values and intra-class correlation coefficients (ICCs) were used for precision and accuracy evaluation. The results were also compared with the ISO 20685 criteria. The participant's sex and the measurement method were found (p < 0.05) to exert significant effects on the measured six foot dimensions. The precision of the 3D scanning measurement method with mean absolute difference values between 0.73 to 1.50 mm showed the best performance among the four measurement methods. The 3D scanning measurements showed better measurement accuracy performance than the other methods (mean absolute difference was 0.6 to 4.3 mm), except for measuring outside ball of foot length and foot breadth horizontal. The ICCs for all six foot dimension measurements among the four measurement methods were within the 0.61 to 0.98 range. Overall, the 3D foot scanner is recommended for collecting foot anthropometric data because it has relatively higher precision, accuracy and robustness. This finding suggests that when comparing foot anthropometric data among different references, it is important to consider the differences caused by the different measurement methods.

  11. An adaptive approach to the dynamic allocation of buffer storage. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Crooke, S. C.

    1970-01-01

    Several strategies for the dynamic allocation of buffer storage are simulated and compared. The basic algorithms investigated, using actual statistics observed in the Univac 1108 EXEC 8 System, include the buddy method and the first-fit method. Modifications are made to the basic methods in an effort to improve and to measure allocation performance. A simulation model of an adaptive strategy is developed which permits interchanging the two different methods, the buddy and the first-fit methods with some modifications. Using an adaptive strategy, each method may be employed in the statistical environment in which its performance is superior to the other method.
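
    As a small illustration of one of the basic strategies compared (first-fit), the sketch below scans a free list and allocates from the first block large enough; the block representation is a simplification, and the buddy method and the adaptive switching logic are not shown.

    ```python
    # First-fit buffer allocation over a simple free list (illustrative sketch).

    def first_fit(free_blocks, request):
        """free_blocks: list of (start, size). Returns (start, updated_free_blocks) or None."""
        for i, (start, size) in enumerate(free_blocks):
            if size >= request:
                updated = list(free_blocks)
                if size == request:
                    updated.pop(i)                                   # exact fit: remove block
                else:
                    updated[i] = (start + request, size - request)   # shrink block
                return start, updated
        return None                                                  # no block large enough

    free = [(0, 32), (100, 64), (300, 128)]
    print(first_fit(free, 48))    # allocates from the 64-unit block at address 100
    ```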

  12. Method and apparatus for detecting timing errors in a system oscillator

    DOEpatents

    Gliebe, Ronald J.; Kramer, William R.

    1993-01-01

    A method of detecting timing errors in a system oscillator for an electronic device, such as a power supply, includes the step of comparing a system oscillator signal with a delayed generated signal and generating a signal representative of the timing error when the system oscillator signal is not identical to the delayed signal. An LED indicates to an operator that a timing error has occurred. A hardware circuit implements the above-identified method.

  13. System and method for authentication

    DOEpatents

    Duerksen, Gary L.; Miller, Seth A.

    2015-12-29

    Described are methods and systems for determining authenticity. For example, the method may include providing an object of authentication, capturing characteristic data from the object of authentication, deriving authentication data from the characteristic data of the object of authentication, and comparing the authentication data with an electronic database comprising reference authentication data to provide an authenticity score for the object of authentication. The reference authentication data may correspond to one or more reference objects of authentication other than the object of authentication.

  14. Beyond Blood Culture and Gram Stain Analysis: A Review of Molecular Techniques for the Early Detection of Bacteremia in Surgical Patients.

    PubMed

    Scerbo, Michelle H; Kaplan, Heidi B; Dua, Anahita; Litwin, Douglas B; Ambrose, Catherine G; Moore, Laura J; Murray, Col Clinton K; Wade, Charles E; Holcomb, John B

    2016-06-01

    Sepsis from bacteremia occurs in 250,000 cases annually in the United States, has a mortality rate as high as 60%, and is associated with a poorer prognosis than localized infection. Because of these high figures, empiric antibiotic administration for patients with systemic inflammatory response syndrome (SIRS) and suspected infection is the second most common indication for antibiotic administration in intensive care units (ICUs). However, overuse of empiric antibiotics contributes to the development of opportunistic infections, antibiotic resistance, and the increase in multi-drug-resistant bacterial strains. The current method of diagnosing and ruling out bacteremia is via blood culture (BC) and Gram stain (GS) analysis. Conventional and molecular methods for diagnosing bacteremia were reviewed and compared. The clinical implications, use, and current clinical trials of polymerase chain reaction (PCR)-based methods to detect bacterial pathogens in the blood stream were detailed. BC/GS has several disadvantages. These include the following: some bacteria do not grow in culture media; others do not stain well on GS; and cultures can require up to 5 days to guide or discontinue antibiotic treatment. PCR-based methods can potentially be applied to detect microbes in human blood samples rapidly, accurately, and directly. Compared with the conventional BC/GS, particular advantages to molecular methods (specifically, PCR-based methods) include faster results, leading to possible improved antibiotic stewardship when bacteremia is not present.

  15. Enhanced regeneration potential of mobilized dental pulp stem cells from immature teeth.

    PubMed

    Nakayama, H; Iohara, K; Hayashi, Y; Okuwa, Y; Kurita, K; Nakashima, M

    2017-07-01

    We have previously demonstrated that dental pulp stem cells (DPSCs) isolated from mature teeth by granulocyte colony-stimulating factor (G-CSF)-induced mobilization method can enhance angiogenesis/vasculogenesis and improve pulp regeneration when compared with colony-derived DPSCs. However, the efficacy of this method in immature teeth with root-formative stage has never been investigated. Therefore, the aim of this study was to examine the stemness, biological characteristics, and regeneration potential in mobilized DPSCs compared with colony-derived DPSCs from immature teeth. Mobilized DPSCs isolated from immature teeth were compared to colony-derived DPSCs using methods including flow cytometry, migration assays, mRNA expression of angiogenic/neurotrophic factor, and induced differentiation assays. They were also compared in trophic effects of the secretome. Regeneration potential was further compared in an ectopic tooth transplantation model. Mobilized DPSCs had higher migration ability and expressed more angiogenic/neurotrophic factors than DPSCs. The mobilized DPSC secretome produced a higher stimulatory effect on migration, immunomodulation, anti-apoptosis, endothelial differentiation, and neurite extension. In addition, vascularization and pulp regeneration potential were higher in mobilized DPSCs than in DPSCs. G-CSF-induced mobilization method enhances regeneration potential of colony-derived DPSCs from immature teeth. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Validation sampling can reduce bias in healthcare database studies: an illustration using influenza vaccination effectiveness

    PubMed Central

    Nelson, Jennifer C.; Marsh, Tracey; Lumley, Thomas; Larson, Eric B.; Jackson, Lisa A.; Jackson, Michael

    2014-01-01

    Objective: Estimates of treatment effectiveness in epidemiologic studies using large observational health care databases may be biased due to inaccurate or incomplete information on important confounders. Study methods that collect and incorporate more comprehensive confounder data on a validation cohort may reduce confounding bias. Study Design and Setting: We applied two such methods, imputation and reweighting, to Group Health administrative data (full sample) supplemented by more detailed confounder data from the Adult Changes in Thought study (validation sample). We used influenza vaccination effectiveness (with an unexposed comparator group) as an example and evaluated each method's ability to reduce bias using the control time period prior to influenza circulation. Results: Both methods reduced, but did not completely eliminate, the bias compared with traditional effectiveness estimates that do not utilize the validation sample confounders. Conclusion: Although these results support the use of validation sampling methods to improve the accuracy of comparative effectiveness findings from healthcare database studies, they also illustrate that the success of such methods depends on many factors, including the ability to measure important confounders in a representative and large enough validation sample, the comparability of the full sample and validation sample, and the accuracy with which data can be imputed or reweighted using the additional validation sample information. PMID:23849144

  17. Studies of superresolution range-Doppler imaging

    NASA Astrophysics Data System (ADS)

    Zhu, Zhaoda; Ye, Zhenru; Wu, Xiaoqing; Yin, Jun; She, Zhishun

    1993-02-01

    This paper presents three superresolution imaging methods, including the linear prediction data extrapolation DFT (LPDEDFT), the dynamic optimization linear least squares (DOLLS), and the Hopfield neural network nonlinear least squares (HNNNLS). Live data of a metalized scale model B-52 aircraft, mounted on a rotating platform in a microwave anechoic chamber, have been processed in this way, as has a flying Boeing-727 aircraft. The imaging results indicate that, compared to the conventional Fourier method, these superresolution approaches can provide either higher resolution for the same effective bandwidth of transmitted signals and total rotation angle, or equal-quality images from a smaller bandwidth and total rotation angle. Moreover, these methods are compared with respect to their resolution capability and computational complexity.

  18. Comparison of conventional therapies for dentin hypersensitivity versus medical hypnosis.

    PubMed

    Eitner, Stephan; Bittner, Christian; Wichmann, Manfred; Nickenig, Hans-Joachim; Sokol, Biljana

    2010-10-01

    This study compared the efficacy of conventional treatments for dentin hypersensitivity (DHS) and hypnotherapy. During a 1-month period at an urban practice in a service area of approximately 22,000 inhabitants, all patients were examined. A total of 102 individuals were included in the evaluation. Values of 186 teeth were analyzed. The comparison of the different treatment methods (desensitizer, fluoridation, and hypnotherapy) did not show significant differences in success rates. However, a noticeable difference was observed in terms of onset and duration of effect. For both desensitizer and hypnotherapy treatments, onset of effect was very rapid. Compared to the other methods studied, hypnotherapy effects had the longest duration. In conclusion, hypnotherapy was as effective as other methods in the treatment of DHS.

  19. Comparison between Greulich-Pyle and Girdany-Golden methods for estimating skeletal age of children in Pakistan.

    PubMed

    Awais, Muhammad; Nadeem, Naila; Husen, Yousuf; Rehman, Abdul; Beg, Madiha; Khattak, Yasir Jamil

    2014-12-01

    To compare Greulich-Pyle (GP) and Girdany-Golden (GG) methods for estimation of Skeletal Age (SA) in children referred to a tertiary care hospital in Karachi, Pakistan. Cross-sectional study. Department of Radiology, The Aga Khan University Hospital, Karachi, Pakistan, from July 2010 to June 2012. Children up to the age of 18 years who had undergone X-ray for the evaluation of trauma were included. Each X-ray was interpreted using both methods by two consultant paediatric radiologists with at least 10 years' experience, who were blinded to the actual Chronologic Age (CA) of children. A total of 283 children were included. No significant difference was noted in mean SA estimated by GP method and mean CA for female children (p=0.695). However, a significant difference was noted between mean CA and mean SA by GG method for females (p=0.011). For males, there was a significant difference between mean CA and mean SA estimated by both GP and GG methods. A stronger correlation was found between CA and SA estimated by GP method (r=0.943 for girls, r=0.915 for boys) as compared to GG method (r=0.909 for girls, r=0.865 for boys). Bland-Altman analysis also revealed that the two methods cannot be used interchangeably. Excellent correlation was seen between the two readers for both GP and GG methods. There was no additional benefit of using GP and GG methods simultaneously over using GP method alone. Moreover, although GP was reliable in estimating SA in girls, it was unable to accurately assess SA in boys. Therefore, it would be ideal to develop indigenous standards of bone age estimation based on a representative sample of healthy native children.

  20. Comparison of the Calculations Results of Heat Exchange Between a Single-Family Building and the Ground Obtained with the Quasi-Stationary and 3-D Transient Models. Part 2: Intermittent and Reduced Heating Mode

    NASA Astrophysics Data System (ADS)

    Staszczuk, Anna

    2017-03-01

    The paper provides comparative results of calculations of heat exchange between the ground and typical residential buildings using simplified (quasi-stationary) and more accurate (transient, three-dimensional) methods. Characteristics such as the building's geometry, basement hollow and the construction of ground-touching assemblies were considered, including intermittent and reduced heating modes. The calculations with simplified methods were conducted in accordance with the currently valid standard PN-EN ISO 13370:2008 (Thermal performance of buildings. Heat transfer via the ground. Calculation methods). Comparative estimates of transient, 3-D heat flow were performed with the computer software WUFI®plus. The differences in heat exchange obtained using the more exact and the simplified methods are specified as a result of the analysis.

  1. Incipient fire detection system

    DOEpatents

    Brooks, Jr., William K.

    1999-01-01

    A method and apparatus for an incipient fire detection system that receives gaseous samples and measures the light absorption spectrum of the mixture of gases evolving from heated combustibles includes a detector for receiving gaseous samples and subjecting the samples to spectroscopy and determining wavelengths of absorption of the gaseous samples. The wavelengths of absorption of the gaseous samples are compared to predetermined absorption wavelengths. A warning signal is generated whenever the wavelengths of absorption of the gaseous samples correspond to the predetermined absorption wavelengths. The method includes receiving gaseous samples, subjecting the samples to light spectroscopy, determining wavelengths of absorption of the gaseous samples, comparing the wavelengths of absorption of the gaseous samples to predetermined absorption wavelengths and generating a warning signal whenever the wavelengths of absorption of the gaseous samples correspond to the predetermined absorption wavelengths. In an alternate embodiment, the apparatus includes a series of channels fluidically connected to a plurality of remote locations. A pump is connected to the channels for drawing gaseous samples into the channels. A detector is connected to the channels for receiving the drawn gaseous samples and subjecting the samples to spectroscopy. The wavelengths of absorption are determined and compared to predetermined absorption wavelengths. A warning signal is generated whenever the wavelengths correspond.

  2. Simultaneous determination of nine saponins from Panax notoginseng using HPLC and pressurized liquid extraction.

    PubMed

    Wan, J B; Lai, C M; Li, S P; Lee, M Y; Kong, L Y; Wang, Y T

    2006-04-11

    An HPLC and pressurized liquid extraction (PLE) method was developed for simultaneous determination of nine saponins, including notoginsenoside R1, ginsenoside Rg1, Re, Rf, Rb1, Rc, Rb2, Rb3 and Rd, in Panax notoginseng. The analysis was performed on a C18 column with water-acetonitrile gradient elution, and the investigated saponins were authenticated by comparing retention times and mass spectra with their reference compounds. Several methods including PLE, ultrasonication, Soxhlet extraction and immersion were used for sample preparation and their extraction efficiency was compared. The results showed that PLE has the highest extraction efficiency and repeatability, which would be valuable for standardizing sample preparation for quality control of Chinese medicines. The developed HPLC and PLE method is an effective approach for the simultaneous quantitative determination of saponins in P. notoginseng, which could be used for quality control of P. notoginseng and its preparations.

  3. Hydration and Cooling Practices Among Farmworkers in Oregon and Washington

    PubMed Central

    Bethel, Jeffrey W.; Spector, June T.; Krenz, Jennifer

    2018-01-01

    Objectives: Although recommendations for preventing occupational heat-related illness among farmworkers include hydration and cooling practices, the extent to which these recommendations are universally practiced is unknown. The objective of this analysis was to compare hydration and cooling practices between farmworkers in Oregon and Washington. Methods: A survey was administered to a purposive sample of Oregon and Washington farmworkers. Data collected included demographics, work history and current work practices, hydration practices, access and use of cooling measures, and headwear and clothing worn. Results: Oregon farmworkers were more likely than those in Washington to consume beverages containing sugar and/or caffeine. Workers in Oregon more frequently reported using various cooling measures compared with workers in Washington. Availability of cooling measures also varied between the two states. Conclusions: These results highlight the large variability between workers in two states regarding access to and use of methods to stay cool while working in the heat. PMID:28402203

  4. Electronic Cigarettes for Smoking Cessation.

    PubMed

    Orellana-Barrios, Menfil A; Payne, Drew; Medrano-Juarez, Rita M; Yang, Shengping; Nugent, Kenneth

    2016-10-01

    The use of electronic cigarettes (e-cigarettes) is increasing, but their use as a smoking-cessation aid is controversial. The reporting of e-cigarette studies on cessation is variable and inconsistent. To date, only 1 randomized clinical trial has included an arm with other cessation methods (nicotine patches). The cessation rates for available clinical trials are difficult to compare given differing follow-up periods and broad ranges (4% at 12 months with non-nicotine e-cigarettes to 68% at 4 weeks with concomitant nicotine e-cigarettes and other cessation methods). The average combined abstinence rate for included prospective studies was 29.1% (combination of 6-18 months' rates). There are few comparable clinical trials and prospective studies related to e-cigarettes use for smoking cessation, despite an increasing number of citations. Larger randomized clinical trials are essential to determine whether e-cigarettes are effective smoking-cessation devices. Copyright © 2016 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.

  5. A comparison of interpolation methods on the basis of data obtained from a bathymetric survey of Lake Vrana, Croatia

    NASA Astrophysics Data System (ADS)

    Šiljeg, A.; Lozić, S.; Šiljeg, S.

    2014-12-01

    The bathymetric survey of Lake Vrana included a wide range of activities that were performed in several different stages, in accordance with the standards set by the International Hydrographic Organization. The survey was conducted using an integrated measuring system which consisted of three main parts: a single-beam sonar Hydrostar 4300, GPS devices Ashtech Promark 500 - base, and a Thales Z-Max - rover. A total of 12 851 points were gathered. In order to find continuous surfaces necessary for analysing the morphology of the bed of Lake Vrana, it was necessary to approximate values in certain areas that were not directly measured, by using an appropriate interpolation method. The main aims of this research were as follows: to compare the efficiency of 16 different interpolation methods, to discover the most appropriate interpolators for the development of a raster model, to calculate the surface area and volume of Lake Vrana, and to compare the differences in calculations between separate raster models. The best deterministic method of interpolation was ROF multi-quadratic, and the best geostatistical, ordinary cokriging. The mean quadratic error in both methods measured less than 0.3 m. The quality of the interpolation methods was analysed in 2 phases. The first phase used only points gathered by bathymetric measurement, while the second phase also included points gathered by photogrammetric restitution. The first bathymetric map of Lake Vrana in Croatia was produced, as well as scenarios of minimum and maximum water levels. The calculation also included the percentage of flooded areas and cadastre plots in the case of a 2 m increase in the water level. The research presented new scientific and methodological data related to the bathymetric features, surface area and volume of Lake Vrana.
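
    As a small illustration of the deterministic end of the interpolation methods compared above, a sketch of inverse distance weighting (one of the simpler alternatives to the multiquadric and cokriging approaches named in the abstract) is given below; the points are synthetic and the power parameter is a common default, not the study's setting.

    ```python
    # Inverse distance weighting (IDW) interpolation of scattered depth soundings
    # (illustrative sketch; the study compared 16 methods including cokriging).
    import numpy as np

    def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
        """Interpolate z at query points as a distance-weighted mean of known points."""
        d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
        w = 1.0 / (d ** power + eps)
        return (w @ z_known) / w.sum(axis=1)

    rng = np.random.default_rng(5)
    pts = rng.random((50, 2)) * 1000.0                                # synthetic sounding locations (m)
    depth = 5.0 + 0.01 * pts[:, 0] + rng.normal(scale=0.2, size=50)   # synthetic depths (m)
    grid = np.array([[250.0, 250.0], [750.0, 750.0]])
    print(idw(pts, depth, grid))
    ```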

  6. Performance analysis of the FDTD method applied to holographic volume gratings: Multi-core CPU versus GPU computing

    NASA Astrophysics Data System (ADS)

    Francés, J.; Bleda, S.; Neipp, C.; Márquez, A.; Pascual, I.; Beléndez, A.

    2013-03-01

    The finite-difference time-domain method (FDTD) allows electromagnetic field distribution analysis as a function of time and space. The method is applied to analyze holographic volume gratings (HVGs) for the near-field distribution at optical wavelengths. Usually, this application requires the simulation of wide areas, which implies more memory and time processing. In this work, we propose a specific implementation of the FDTD method including several add-ons for a precise simulation of optical diffractive elements. Values in the near-field region are computed considering the illumination of the grating by means of a plane wave for different angles of incidence and including absorbing boundaries as well. We compare the results obtained by FDTD with those obtained using a matrix method (MM) applied to diffraction gratings. In addition, we have developed two optimized versions of the algorithm, for both CPU and GPU, in order to analyze the improvement of using the new NVIDIA Fermi GPU architecture versus highly tuned multi-core CPU as a function of the size simulation. In particular, the optimized CPU implementation takes advantage of the arithmetic and data transfer streaming SIMD (single instruction multiple data) extensions (SSE) included explicitly in the code and also of multi-threading by means of OpenMP directives. A good agreement between the results obtained using both FDTD and MM methods is obtained, thus validating our methodology. Moreover, the performance of the GPU is compared to the SSE+OpenMP CPU implementation, and it is quantitatively determined that a highly optimized CPU program can be competitive for a wider range of simulation sizes, whereas GPU computing becomes more powerful for large-scale simulations.
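
    A highly simplified 1-D FDTD update loop is sketched below (NumPy, vacuum, hard source) purely to show the leapfrog field updates that the CPU and GPU implementations parallelize; the paper's 2-D grating geometry, absorbing boundaries, and SSE/OpenMP/CUDA tuning are not reproduced.

    ```python
    # Minimal 1-D FDTD (vacuum, normalized units) showing the leapfrog E/H updates
    # that optimized CPU (SSE/OpenMP) and GPU kernels accelerate (illustrative sketch).
    import numpy as np

    nx, nt = 400, 800
    ez = np.zeros(nx)          # electric field samples
    hy = np.zeros(nx - 1)      # magnetic field samples (staggered grid)

    for n in range(nt):
        hy += 0.5 * (ez[1:] - ez[:-1])                     # update H from the curl of E
        ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])               # update E from the curl of H
        ez[nx // 4] += np.exp(-((n - 60) / 20.0) ** 2)     # Gaussian hard source

    print("peak |Ez| after", nt, "steps:", float(np.abs(ez).max()))
    ```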

  7. Method for controlling exhaust gas heat recovery systems in vehicles

    DOEpatents

    Spohn, Brian L.; Claypole, George M.; Starr, Richard D

    2013-06-11

    A method of operating a vehicle including an engine, a transmission, an exhaust gas heat recovery (EGHR) heat exchanger, and an oil-to-water heat exchanger providing selective heat-exchange communication between the engine and transmission. The method includes controlling a two-way valve, which is configured to be set to one of an engine position and a transmission position. The engine position allows heat-exchange communication between the EGHR heat exchanger and the engine, but does not allow heat-exchange communication between the EGHR heat exchanger and the oil-to-water heat exchanger. The transmission position allows heat-exchange communication between the EGHR heat exchanger, the oil-to-water heat exchanger, and the engine. The method also includes monitoring an ambient air temperature and comparing the monitored ambient air temperature to a predetermined cold ambient temperature. If the monitored ambient air temperature is greater than the predetermined cold ambient temperature, the two-way valve is set to the transmission position.
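
    The control decision described can be summarized as a simple threshold comparison; the sketch below uses hypothetical names and a placeholder threshold value, not the patent's actual calibration.

    ```python
    # Two-way EGHR valve control based on ambient temperature (illustrative sketch;
    # COLD_AMBIENT_LIMIT_C and the position names are placeholders).
    COLD_AMBIENT_LIMIT_C = -10.0

    def select_valve_position(ambient_temp_c: float) -> str:
        """Route exhaust heat to the transmission loop unless the ambient air is very cold."""
        if ambient_temp_c > COLD_AMBIENT_LIMIT_C:
            return "TRANSMISSION"   # warm transmission oil via the oil-to-water heat exchanger
        return "ENGINE"             # prioritize engine warm-up in cold ambient conditions

    print(select_valve_position(5.0))     # -> TRANSMISSION
    print(select_valve_position(-20.0))   # -> ENGINE
    ```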

  8. Methods of milk expression for lactating women.

    PubMed

    Becker, Genevieve E; Smith, Hazel A; Cooney, Fionnuala

    2016-09-29

    Breastfeeding is important, however not all infants can feed at the breast and methods of expressing milk need evaluation. To assess acceptability, effectiveness, safety, effect on milk composition, contamination and costs of methods of milk expression. We searched the Cochrane Pregnancy and Childbirth Group's Trials Register (21 March 2016), handsearched relevant journals and conference proceedings, and contacted experts in the field to seek additional published or unpublished studies. We also examined reference lists of all relevant retrieved papers. Randomised and quasi-randomised trials comparing methods at any time after birth. Three review authors independently assessed trials for inclusion and risk of bias, extracted data and checked them for accuracy. This updated review includes 41 trials involving 2293 participants, with 22 trials involving 1339 participants contributing data for analysis. Twenty-six of the trials referred to mothers of infants in neonatal units (n = 1547) and 14 to mothers of healthy infants at home (n = 730), with one trial containing mothers of both neonatal and healthy older infants (n = 16). Eleven trials compared one or more types of pump versus hand expression and 14 studies compared one type of pump versus another type of pump, with three of these studies comparing both hand expression and pump types. Twenty studies compared a specific protocol or adjunct behaviour including sequential versus simultaneous pumping protocols, pumping frequency, provision of an education and support intervention, relaxation, breast massage, combining hand expression with pumping and a breast cleansing protocol.Due to heterogeneity in participants, interventions, and outcomes measured or reported, we were unable to pool findings for most of the specified outcomes. It was not possible therefore to produce a 'Summary of findings' table in this update. Most of the included results were derived from single studies. Trials took place in 14 countries under a variety of circumstances and were published from 1982 to 2015. Sixteen of the 30 trials that evaluated pumps or products had support from the manufacturers. The risk of bias of the included studies was variable. Primary outcomesOnly one of the 17 studies examining maternal satisfaction/acceptability with the method or adjunct behaviour provided data suitable for analysis. In this study, self-efficacy was assessed by asking mothers if they agreed or disagreed with the following statement: 'I don't want anyone to see me (hand expressing/pumping)'. The study found that mothers who were using the electric pump were more likely to agree with the statement compared to mothers hand expressing, (mean difference (MD) 0.70, 95% confidence interval (CI) 0.15 to 1.25; P = 0.01, participants = 68). Mothers who were hand expressing reported that the instructions for expression were clearer compared to the electric pump, (MD -0.40, 95% CI -0.75 to -0.05; P = 0.02, participants = 68). 
Descriptive reporting of satisfaction in the other studies varied in the measures used, did not indicate a clear preference for one pump type, although there was satisfaction with some relaxation and support interventions.We found no clinically significant differences between methods related to contamination of the milk that compared any type of pump to hand expression (risk ratio (RR) 1.13, 95% CI 0.79 to 1.61; P = 0.51, participants = 28), manual pump compared to hand expression, (MD 0.20, 95% CI -0.18 to 0.58; P = 0.30, participants = 142) a large electric pump compared to hand expression (MD 0.10, 95% CI -0.29 to 0.49; P = 0.61, participants = 123), or a large electric pump compared to a manual pump (MD -0.10, 95% CI -0.46 to 0.26; P = 0.59, participants = 141).The level of maternal breast or nipple pain or damage was similar in comparisons of a large electric pump to hand expression (MD 0.02, 95% CI -0.67 to 0.71; P = 0.96, participants = 68). A study comparing a manual and large electric pump, reported sore nipples in 7% for both groups and engorgement in 4% using a manual pump versus 6% using an electric pump; and in one study no nipple damage was reported in the hand-expression group, and one case of nipple damage in each of the manual pump and the large electric pump groups.One study examined adverse effects on infants, however as the infants did not all receive their mothers' expressed milk, we have not included the results. Secondary outcomesThe quantity of expressed milk obtained was increased, in some studies by a clinically significant amount, in interventions involving relaxation, music, warmth, massage, initiation of pumping, increased frequency of pumping and suitable breast shield size. Support programmes and simultaneous compared to sequential pumping did not show a difference in milk obtained. No pump consistently increased the milk volume obtained significantly.In relation to nutrient quality, hand expression or a large electric pump were found to provide higher protein than a manual pump, and hand expression provided higher sodium and lower potassium compared to a large electric pump or a manual pump. Fat content was higher with breast massage when pumping; no evidence of difference was found for energy content between methods.No consistent effect was found related to prolactin change or effect on oxytocin release with pump type or method. Economic aspects were not reported. The most suitable method for milk expression may depend on the time since birth, purpose of expression and the individual mother and infant. Low-cost interventions including initiation of milk expression sooner after birth when not feeding at the breast, relaxation, massage, warming the breasts, hand expression and lower cost pumps may be as effective, or more effective, than large electric pumps for some outcomes. Variation in nutrient content across methods may be relevant to some infants. Small sample sizes, large standard deviations, and the diversity of the interventions argue caution in applying these results beyond the specific method tested in the specific settings. Independently funded research is needed for more trials on hand expression, relaxation and other techniques that do not have a commercial potential.

  9. Application of adjusted subpixel method (ASM) in HRCT measurements of the bronchi in bronchial asthma patients and healthy individuals.

    PubMed

    Mincewicz, Grzegorz; Rumiński, Jacek; Krzykowski, Grzegorz

    2012-02-01

    Recently, we described a model system which included corrections of high-resolution computed tomography (HRCT) bronchial measurements based on the adjusted subpixel method (ASM). To verify the clinical application of ASM by comparing bronchial measurements obtained by means of the traditional eye-driven method, subpixel method alone and ASM in a group comprised of bronchial asthma patients and healthy individuals. The study included 30 bronchial asthma patients and the control group comprised of 20 volunteers with no symptoms of asthma. The lowest internal and external diameters of the bronchial cross-sections (ID and ED) and their derivative parameters were determined in HRCT scans using: (1) traditional eye-driven method, (2) subpixel technique, and (3) ASM. In the case of the eye-driven method, lower ID values along with lower bronchial lumen area and its percentage ratio to total bronchial area were basic parameters that differed between asthma patients and healthy controls. In the case of the subpixel method and ASM, both groups were not significantly different in terms of ID. Significant differences were observed in values of ED and total bronchial area with both parameters being significantly higher in asthma patients. Compared to ASM, the eye-driven method overstated the values of ID and ED by about 30% and 10% respectively, while understating bronchial wall thickness by about 18%. Results obtained in this study suggest that the traditional eye-driven method of HRCT-based measurement of bronchial tree components probably overstates the degree of bronchial patency in asthma patients. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  10. Adjunctive social media for more effective contraceptive counseling: a randomized controlled trial.

    PubMed

    Kofinas, Jason D; Varrey, Aneesha; Sapra, Katherine J; Kanj, Rula V; Chervenak, Frank A; Asfaw, Tirsit

    2014-04-01

    To determine whether social media, specifically Facebook, is an effective tool for improving contraceptive knowledge. English-speaking women aged 18-45 years receiving care at an urban academic center obstetrics and gynecology clinic were included and randomized to a trial of standard contraceptive education and pamphlet (n=74) compared with standard contraceptive education and Facebook (n=69) information for contraception counseling. Contraceptive knowledge was evaluated preintervention and postintervention by the Contraceptive Knowledge Inventory. We evaluated the effect of the intervention by raw score and percent increase in Contraceptive Knowledge Inventory score, participant satisfaction with counseling method, and contraceptive preference postintervention. All analyses were stratified by age group. The median raw postintervention Contraceptive Knowledge Inventory score was significantly higher in the Facebook compared with the pamphlet group (15 compared with 12, P<.001) as was percentage increase in the Contraceptive Knowledge Inventory score (36% compared with 12%, P<.001). Participant satisfaction with counseling method was significantly higher in the Facebook group (median 10 compared with 6, P<.001). Participant contraceptive preference for long-acting reversible contraceptives (LARCs; intrauterine device or implant) postintervention was significantly greater in the Facebook compared with the pamphlet group (57% compared with 35%, P=.01). Among women currently using none or barrier contraception, contraceptive preference for implants was significantly greater in the Facebook compared with the pamphlet group (26% compared with 5%, P=.02), although, when analysis was extended to include implant or intrauterine device, LARCs were not significantly higher in the Facebook compared with the pamphlet group (48% compared with 33%, P=.19). Social media as an adjunct to traditional in-office counseling improves patient contraceptive knowledge and increases patient preference for LARCs. ClinicalTrials.gov, www.clinicaltrials.gov, NCT01994005.

  11. Sampling techniques for thrips (Thysanoptera: Thripidae) in preflowering tomato.

    PubMed

    Joost, P Houston; Riley, David G

    2004-08-01

    Sampling techniques for thrips (Thysanoptera: Thripidae) were compared in preflowering tomato plants at the Coastal Plain Experiment Station in Tifton, GA, in 2000 and 2003, to determine the most effective method of determining abundance of thrips on tomato foliage early in the growing season. Three relative sampling techniques, including a standard insect aspirator, a 946-ml beat cup, and an insect vacuum device, were compared for accuracy to an absolute method and to themselves for precision and efficiency of sampling thrips. Thrips counts of all relative sampling methods were highly correlated (R > 0.92) to the absolute method. The aspirator method was the most accurate compared with the absolute sample according to regression analysis in 2000. In 2003, all sampling methods were considered accurate according to Dunnett's test, but thrips numbers were lower and sample variation was greater than in 2000. In 2000, the beat cup method had the lowest relative variation (RV) or best precision, at 1 and 8 d after transplant (DAT). Only the beat cup method had RV values <25 for all sampling dates. In 2003, the beat cup method had the lowest RV value at 15 and 21 DAT. The beat cup method also was the most efficient method for all sample dates in both years. Frankliniella fusca (Pergande) was the most abundant thrips species on the foliage of preflowering tomato in both years of study at this location. Overall, the best thrips sampling technique tested was the beat cup method in terms of precision and sampling efficiency.
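
    The relative variation statistic used above to compare sampling precision is commonly computed as the standard error expressed as a percentage of the mean; the form below is the standard definition and is assumed rather than quoted from the paper.

    ```latex
    % Relative variation (RV): standard error as a percentage of the sample mean,
    % with s the sample standard deviation, n the number of samples, and \bar{x} the mean.
    \[
      RV \;=\; \frac{s/\sqrt{n}}{\bar{x}} \times 100
    \]
    ```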

  12. A method based on IHS cylindrical transform model for quality assessment of image fusion

    NASA Astrophysics Data System (ADS)

    Zhu, Xiaokun; Jia, Yonghong

    2005-10-01

    Image fusion techniques have been widely applied to remote sensing image analysis and processing, and methods for quality assessment of image fusion in remote sensing have become an active research issue worldwide. Traditional assessment methods combine the calculation of quantitative indexes with visual interpretation to compare fused images quantitatively and qualitatively. However, the existing assessment methods have two defects: on the one hand, most indexes lack the theoretical support to compare different fusion methods; on the other hand, there is no uniform preference among most of the quantitative assessment indexes when they are applied to estimate fusion effects. That is, spatial resolution and spectral features cannot be analyzed simultaneously by these indexes, and there is no general method that unifies spatial and spectral feature assessment. In this paper, on the basis of an approximate general model of four traditional fusion methods, including Intensity Hue Saturation (IHS) triangle transform fusion, High Pass Filter (HPF) fusion, Principal Component Analysis (PCA) fusion, and Wavelet Transform (WT) fusion, a correlation coefficient assessment method based on the IHS cylindrical transform is proposed. Experiments show that this method not only produces evaluations of spatial and spectral features on the basis of a uniform preference, but also allows comparison between the fusion source images and the fused images, and reveals differences among fusion methods. Compared with the traditional assessment methods, the new method is more intuitive and more consistent with subjective estimation.

  13. How High is that Dune? A Comparison of Methods Used to Constrain the Morphometry of Aeolian Bedforms on Mars

    NASA Technical Reports Server (NTRS)

    Bourke, M.; Balme, M.; Beyer, R. A.; Williams, K. K.

    2004-01-01

    Methods traditionally used to estimate the relative height of surface features on Mars include photoclinometry, shadow length, and stereography. The MOLA data set enables a more accurate assessment of the surface topography of Mars. However, many small-scale aeolian bedforms remain below the sample resolution of the MOLA data set. In response, a number of research teams have adopted and refined existing methods and applied them to high resolution (2-6 m/pixel) narrow angle MOC satellite images. Collectively, the methods provide data on a range of morphometric parameters, many not previously available for dunes on Mars. These include dune height, width, length, surface area, volume, and longitudinal and cross profiles. These data will facilitate a more accurate analysis of aeolian bedforms on Mars. In this paper we undertake a comparative analysis of methods used to determine the height of aeolian dunes and ripples.

  14. Methodology in the Assessment of Construction and Development Investment Projects, Including the Graphic Multi-Criteria Analysis - a Systemic Approach

    NASA Astrophysics Data System (ADS)

    Szafranko, Elżbieta

    2017-10-01

    Assessment of variant solutions developed for a building investment project needs to be made at the planning stage. While considering alternative solutions, the investor defines various criteria, but directly evaluating the degree to which the developed variants fulfil them can be very difficult. In practice, there are different methods which enable the user to include a large number of parameters in an analysis, but their implementation can be challenging. Some methods require advanced mathematical computations, preceded by complicated input-data processing, and the generated results may not lend themselves easily to interpretation. Hence, during her research, the author has developed a systemic approach, which involves several methods and whose goal is to compare their outcomes. The final stage of the proposed method consists of a graphic interpretation of the results. The method has been tested on a variety of building and development projects.

  15. Methodological problems in the use of indirect comparisons for evaluating healthcare interventions: survey of published systematic reviews.

    PubMed

    Song, Fujian; Loke, Yoon K; Walsh, Tanya; Glenny, Anne-Marie; Eastwood, Alison J; Altman, Douglas G

    2009-04-03

    To investigate basic assumptions and other methodological problems in the application of indirect comparison in systematic reviews of competing healthcare interventions. Survey of published systematic reviews. Inclusion criteria: systematic reviews published between 2000 and 2007 in which an indirect approach had been explicitly used. Identified reviews were assessed for comprehensiveness of the literature search, method for indirect comparison, and whether assumptions about similarity and consistency were explicitly mentioned. The survey included 88 review reports. In 13 reviews, indirect comparison was informal. Results from different trials were naively compared without using a common control in six reviews. Adjusted indirect comparison was usually done using classic frequentist methods (n=49) or more complex methods (n=18). The key assumption of trial similarity was explicitly mentioned in only 40 of the 88 reviews. The consistency assumption was not explicit in most cases where direct and indirect evidence were compared or combined (18/30). Evidence from head to head comparison trials was not systematically searched for or not included in nine cases. Identified methodological problems were an unclear understanding of underlying assumptions, inappropriate search and selection of relevant trials, use of inappropriate or flawed methods, lack of objective and validated methods to assess or improve trial similarity, and inadequate comparison or inappropriate combination of direct and indirect evidence. Adequate understanding of the basic assumptions underlying indirect and mixed treatment comparison is crucial to resolving these methodological problems.
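    The classic frequentist adjusted indirect comparison mentioned above (commonly attributed to Bucher) compares treatments A and B through a common comparator C: the log odds ratios subtract and their variances add. The sketch below implements that relationship for hypothetical estimates; it is an illustration of the technique, not an analysis from any of the surveyed reviews.

        # Adjusted indirect comparison: log OR_AB = log OR_AC - log OR_BC,
        # SE_AB = sqrt(SE_AC^2 + SE_BC^2). Input estimates are hypothetical.
        import math

        def adjusted_indirect(log_or_ac, se_ac, log_or_bc, se_bc):
            log_or_ab = log_or_ac - log_or_bc
            se_ab = math.sqrt(se_ac**2 + se_bc**2)
            ci = (math.exp(log_or_ab - 1.96 * se_ab), math.exp(log_or_ab + 1.96 * se_ab))
            return math.exp(log_or_ab), ci

        or_ab, (lo, hi) = adjusted_indirect(log_or_ac=math.log(0.7), se_ac=0.15,
                                            log_or_bc=math.log(0.9), se_bc=0.20)
        print(f"Indirect OR (A vs B) = {or_ab:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
        # The similarity assumption requires the A-C and B-C trials to be comparable in
        # effect modifiers; the consistency assumption applies when this indirect
        # estimate is combined with direct A-B evidence.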

  16. Pre-operative predictive factors for gallbladder cholesterol polyps using conventional diagnostic imaging

    PubMed Central

    Choi, Ji-Hoon; Yun, Jung-Won; Kim, Yong-Sung; Lee, Eun-A; Hwang, Sang-Tae; Cho, Yong-Kyun; Kim, Hong-Joo; Park, Jung-Ho; Park, Dong-Il; Sohn, Chong-Il; Jeon, Woo-Kyu; Kim, Byung-Ik; Kim, Hyoung-Ook; Shin, Jun-Ho

    2008-01-01

    AIM: To determine the clinical data that might be useful for differentiating benign from malignant gallbladder (GB) polyps by comparing radiological methods, including abdominal ultrasonography (US) and computed tomography (CT) scanning, with postoperative pathology findings. METHODS: Fifty-nine patients underwent laparoscopic cholecystectomy for a GB polyp of around 10 mm. They were divided into two groups, one with cholesterol polyps and the other with non-cholesterol polyps. Clinical features such as gender, age, symptoms, size and number of polyps, the presence of a GB stone, the radiologically measured maximum diameter of the polyp by US and CT scanning, and the measurements of diameter from postoperative pathology were recorded for comparative analysis. RESULTS: Fifteen of the 41 cases with cholesterol polyps (36.6%) were detected with US but not CT scanning, whereas all 18 non-cholesterol polyps were observed using both methods. In the cholesterol polyp group, the maximum measured diameter of the polyp was smaller by CT scan than by US. Consequently, the discrepancy between those two scanning measurements was greater than for the non-cholesterol polyp group. CONCLUSION: The clinical signs indicative of a cholesterol polyp include: (1) a polyp observed by US but not observable by CT scanning, (2) a smaller diameter on the CT scan compared to US, and (3) a discrepancy in its maximum diameter between US and CT measurements. In addition, US and the CT scan had low accuracy in predicting the polyp diameter compared to that determined by postoperative pathology. PMID:19058309

  17. [Study on the baking processing technology of the Hui medicine Aconitum flavum].

    PubMed

    Fu, Xue-yan; Zhang, Bai-tong; Li, Ting-ting; Dong, Lin; Hao, Wen-jing; Yu, Liang

    2013-12-01

    To screen and optimize the processing technology of Aconitum flavum. Acute-toxicity, anti-inflammatory, and analgesic experiments were used as evaluation indexes. Four processing methods, including decoction, steaming, baking, and processing with Chebulae Fructus decoction, were compared to screen the optimum processing method for Aconitum flavum. The baking time was also optimized. The optimal baking process was to bake 1-2 mm decoction pieces at 105 degrees C for 3 hours. The baking method proved to be the optimal processing method for Aconitum flavum. This method is shown to be simple and stable.

  18. Methods of the working processes modelling of an internal combustion engine by an ANSYS IC Engine module

    NASA Astrophysics Data System (ADS)

    Kurchatkin, I. V.; Gorshkalev, A. A.; Blagin, E. V.

    2017-01-01

    This article deals with developed methods for modelling the working processes in the combustion chamber of an internal combustion engine (ICE). The methods include preparation of a 3-D model of the combustion chamber, generation of the finite-element mesh, setting of boundary conditions, and solution customization. The M-14 aircraft radial engine was selected for modelling. A cold-blowdown cycle was simulated in the ANSYS IC Engine software. The obtained data were compared with the results of known calculation methods, and an improvement to the engine's induction port was suggested.

  19. Nutritional assessment in intravenous drug users with HIV/AIDS.

    PubMed

    Smit, E; Tang, A

    2000-10-01

    Studying metabolic, endocrine, and gastrointestinal (MEG) disorders in drug abuse and HIV infection is important. Equally important, however, are the tools we use to assess these disorders. Assessment of nutritional status may include any combination of biochemical and body composition measurements, dietary intake assessment, and metabolic studies. Each method has its strengths and weaknesses and there is no perfect tool. When assessing nutritional status in injection drug users (IDU) and in HIV-infected people, the decision on which method or methods to use becomes even more complex. A review of studies reported during the XII World Conference on AIDS reveals that of 64 abstracts on the topic of nutrition in HIV-infected adults, only 11 assessed diet, 41 assessed anthropometry, and 24 assessed some form of biochemical measure. The most commonly reported methods for dietary intake included 24-hour recalls, food records, and food frequencies. The commonest methods used for measuring body composition included height, weight, bioimpedance, and dual-energy x-ray absorptiometry (DEXA). Biochemical measurements included various blood nutrients, lipids, and albumin. Methods varied greatly between studies, and caution should be taken when trying to compare results across studies, especially among those using different methods. Currently, few studies deal with the development of methods that can be used for research in HIV-infected and IDU populations. We need to work toward better tools in dietary intake assessment, body composition, and biochemical measurements, especially methods that will allow us to track changes in nutritional status over time.

  20. Potential Impact of Rapid Blood Culture Testing for Gram-Positive Bacteremia in Japan with the Verigene Gram-Positive Blood Culture Test

    PubMed Central

    Matsuda, Mari; Iguchi, Shigekazu; Mizutani, Tomonori; Hiramatsu, Keiichi; Tega-Ishii, Michiru; Sansaka, Kaori; Negishi, Kenta; Shimada, Kimie; Umemura, Jun; Notake, Shigeyuki; Yanagisawa, Hideji; Yabusaki, Reiko; Araoka, Hideki; Yoneyama, Akiko

    2017-01-01

    Background. Early detection of Gram-positive bacteremia and timely appropriate antimicrobial therapy are required for decreasing patient mortality. The purpose of our study was to evaluate the performance of the Verigene Gram-positive blood culture assay (BC-GP) in two special healthcare settings and determine the potential impact of rapid blood culture testing for Gram-positive bacteremia within the Japanese healthcare delivery system. Furthermore, the study included simulated blood cultures, which included a library of well-characterized methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant enterococci (VRE) isolates reflecting different geographical regions in Japan. Methods. A total 347 BC-GP assays were performed on clinical and simulated blood cultures. BC-GP results were compared to results obtained by reference methods for genus/species identification and detection of resistance genes using molecular and MALDI-TOF MS methodologies. Results. For identification and detection of resistance genes at two clinical sites and simulated blood cultures, overall concordance of BC-GP with reference methods was 327/347 (94%). The time for identification and antimicrobial resistance detection by BC-GP was significantly shorter compared to routine testing especially at the cardiology hospital, which does not offer clinical microbiology services on weekends and holidays. Conclusion. BC-GP generated accurate identification and detection of resistance markers compared with routine laboratory methods for Gram-positive organisms in specialized clinical settings providing more rapid results than current routine testing. PMID:28316631

  1. A comparative study on preprocessing techniques in diabetic retinopathy retinal images: illumination correction and contrast enhancement.

    PubMed

    Rasta, Seyed Hossein; Partovi, Mahsa Eisazadeh; Seyedarabi, Hadi; Javadzadeh, Alireza

    2015-01-01

    To investigate the effect of preprocessing techniques, including contrast enhancement and illumination correction, on retinal image quality, a comparative study was carried out. We studied and implemented several illumination correction and contrast enhancement techniques on color retinal images to find the best technique for optimum image enhancement. To compare and choose the best illumination correction technique, we analyzed the corrected red and green components of color retinal images statistically and visually. The two contrast enhancement techniques were analyzed using a vessel segmentation algorithm by calculating sensitivity and specificity. The statistical evaluation of the illumination correction techniques was carried out by calculating coefficients of variation. The dividing method, using the median filter to estimate background illumination, showed the lowest coefficient of variation in the red component. The quotient and homomorphic filtering methods, after the dividing method, presented good results based on their low coefficients of variation. Contrast limited adaptive histogram equalization (CLAHE) increased the sensitivity of the vessel segmentation algorithm by up to 5% at the same level of accuracy. The CLAHE technique has a higher sensitivity than the polynomial transformation operator as a contrast enhancement technique for vessel segmentation. Three techniques, namely the dividing method using the median filter to estimate background, quotient-based filtering, and homomorphic filtering, were found to be effective illumination correction techniques based on the statistical evaluation. Applying a local contrast enhancement technique, such as CLAHE, to fundus images showed good potential for enhancing vasculature segmentation.
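    The dividing method and the coefficient-of-variation criterion described above can be summarised in a few lines: estimate the background illumination with a large median filter, divide the channel by it, and compare std/mean before and after. The sketch below is a hedged illustration; the kernel size and the variable green (a fundus image channel) are assumptions, not values from the study.

        # Dividing method for illumination correction plus coefficient of variation.
        import numpy as np
        from scipy.ndimage import median_filter

        def divide_correction(channel, kernel=51):
            background = median_filter(channel.astype(float), size=kernel)
            return channel / np.maximum(background, 1e-6)

        def coefficient_of_variation(channel):
            return float(np.std(channel) / np.mean(channel))

        # Usage with a hypothetical green fundus channel `green`:
        # corrected = divide_correction(green)
        # print(coefficient_of_variation(green), coefficient_of_variation(corrected))
        # A lower coefficient of variation after correction indicates more uniform illumination.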

  2. Review of infrared scene projector technology-1993

    NASA Astrophysics Data System (ADS)

    Driggers, Ronald G.; Barnard, Kenneth J.; Burroughs, E. E.; Deep, Raymond G.; Williams, Owen M.

    1994-07-01

    The importance of testing IR imagers and missile seekers with realistic IR scenes warrants a review of the current technologies used in dynamic infrared scene projection. These technologies include resistive arrays, deformable mirror arrays, mirror membrane devices, liquid crystal light valves, laser writers, laser diode arrays, and CRTs. Other methods include frustrated total internal reflection, thermoelectric devices, galvanic cells, Bly cells, and vanadium dioxide. A description of each technology is presented along with a discussion of their relative benefits and disadvantages. The current state of each methodology is also summarized. Finally, the methods are compared and contrasted in terms of their performance parameters.

  3. Comparison of Analysis, Simulation, and Measurement of Wire-to-Wire Crosstalk. Part 2

    NASA Technical Reports Server (NTRS)

    Bradley, Arthur T.; Yavoich, Brian James; Hodson, Shane M.; Godley, Franklin

    2010-01-01

    In this investigation, we compare crosstalk analysis, simulation, and measurement results for electrically short configurations. Methods include hand calculations, PSPICE simulations, Microstripes transient field solver, and empirical measurement. In total, four representative physical configurations are examined, including a single wire over a ground plane, a twisted pair over a ground plane, generator plus receptor wires inside a cylindrical conduit, and a single receptor wire inside a cylindrical conduit. Part 1 addresses the first two cases, and Part 2 addresses the final two. Agreement between the analysis methods and test data is shown to be very good.

  4. Ensemble of trees approaches to risk adjustment for evaluating a hospital's performance.

    PubMed

    Liu, Yang; Traskin, Mikhail; Lorch, Scott A; George, Edward I; Small, Dylan

    2015-03-01

    A commonly used method for evaluating a hospital's performance on an outcome is to compare the hospital's observed outcome rate to the hospital's expected outcome rate given its patient (case) mix and service. The process of calculating the hospital's expected outcome rate given its patient mix and service is called risk adjustment (Iezzoni 1997). Risk adjustment is critical for accurately evaluating and comparing hospitals' performances since we would not want to unfairly penalize a hospital just because it treats sicker patients. The key to risk adjustment is accurately estimating the probability of an outcome given patient characteristics. For cases with binary outcomes, the method that is commonly used in risk adjustment is logistic regression. In this paper, we consider ensemble-of-trees methods as alternatives for risk adjustment, including random forests and Bayesian additive regression trees (BART). Both random forests and BART are modern machine learning methods that have been shown recently to have excellent performance for prediction of outcomes in many settings. We apply these methods to carry out risk adjustment for the performance of neonatal intensive care units (NICUs). We show that these ensemble-of-trees methods outperform logistic regression in predicting mortality among babies treated in NICUs, and provide a superior method of risk adjustment compared to logistic regression.
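    The core of risk adjustment, as described above, is estimating each patient's expected outcome probability from case-mix variables and then comparing a hospital's observed rate to the average of those expectations. The hedged sketch below shows that observed-to-expected calculation with scikit-learn, using logistic regression and a random forest as candidate risk models; the data frame and column names are hypothetical, and BART is omitted.

        # Observed-to-expected outcome ratios per hospital under two risk models.
        import pandas as pd
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression

        def observed_to_expected(df, features, model):
            model.fit(df[features], df["died"])            # in practice, cross-validate
            df = df.assign(expected=model.predict_proba(df[features])[:, 1])
            grouped = df.groupby("hospital_id").agg(observed=("died", "mean"),
                                                    expected=("expected", "mean"))
            grouped["o_to_e"] = grouped["observed"] / grouped["expected"]
            return grouped

        # Usage with a hypothetical NICU data frame `nicu`:
        # features = ["birth_weight", "gestational_age", "apgar5"]
        # print(observed_to_expected(nicu, features, LogisticRegression(max_iter=1000)))
        # print(observed_to_expected(nicu, features, RandomForestClassifier(n_estimators=500)))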

  5. CompaRNA: a server for continuous benchmarking of automated methods for RNA secondary structure prediction

    PubMed Central

    Puton, Tomasz; Kozlowski, Lukasz P.; Rother, Kristian M.; Bujnicki, Janusz M.

    2013-01-01

    We present a continuous benchmarking approach for the assessment of RNA secondary structure prediction methods implemented in the CompaRNA web server. As of 3 October 2012, the performance of 28 single-sequence and 13 comparative methods has been evaluated on RNA sequences/structures released weekly by the Protein Data Bank. We also provide a static benchmark generated on RNA 2D structures derived from the RNAstrand database. Benchmarks on both data sets offer insight into the relative performance of RNA secondary structure prediction methods on RNAs of different size and with respect to different types of structure. According to our tests, on the average, the most accurate predictions obtained by a comparative approach are generated by CentroidAlifold, MXScarna, RNAalifold and TurboFold. On the average, the most accurate predictions obtained by single-sequence analyses are generated by CentroidFold, ContextFold and IPknot. The best comparative methods typically outperform the best single-sequence methods if an alignment of homologous RNA sequences is available. This article presents the results of our benchmarks as of 3 October 2012, whereas the rankings presented online are continuously updated. We will gladly include new prediction methods and new measures of accuracy in the new editions of CompaRNA benchmarks. PMID:23435231

  6. An analysis code for the Rapid Engineering Estimation of Momentum and Energy Losses (REMEL)

    NASA Technical Reports Server (NTRS)

    Dechant, Lawrence J.

    1994-01-01

    Nonideal behavior has traditionally been modeled by defining efficiency (a comparison between actual and isentropic processes) and subsequently specifying it by empirical or heuristic methods. With the increasing complexity of aeropropulsion system designs, the reliability of these more traditional methods is uncertain. Computational fluid dynamics (CFD) and experimental methods can provide this information but are expensive in terms of human resources, cost, and time. This report discusses an alternative to empirical and CFD methods by applying classical analytical techniques and a simplified flow model to provide rapid engineering estimates of these losses based on steady, quasi-one-dimensional governing equations including viscous and heat transfer terms (estimated by Reynolds analogy). As a preliminary verification, REMEL has been compared with full Navier-Stokes (FNS) and CFD boundary-layer computations for several high-speed inlet and forebody designs. The current method compares quite well with results from more complex methods, and its solutions agree very well with simple degenerate and asymptotic results such as Fanno flow, isentropic variable-area flow, and a newly developed solution for combined variable-area duct flow with friction. These solution comparisons may offer an alternative to traditional and CFD-intensive methods for the rapid estimation of viscous and heat transfer losses in aeropropulsion systems.

  7. Percutaneous endoscopic gastrostomy versus nasogastric tube feeding for patients with head and neck cancer: a systematic review

    PubMed Central

    Wang, Jinfeng; Liu, Minjie; Liu, Chao; Ye, Yun; Huang, Guanhong

    2014-01-01

    There are two main enteral feeding strategies—namely nasogastric (NG) tube feeding and percutaneous gastrostomy—used to improve the nutritional status of patients with head and neck cancer (HNC). But up till now there has been no consistent evidence about which method of enteral feeding is the optimal method for this patient group. To compare the effectiveness of percutaneous gastrostomy and NGT feeding in patients with HNC, relevant literature was identified through Medline, Embase, Pubmed, Cochrane, Wiley and manual searches. We included randomized controlled trials (RCTs) and non-experimental studies comparing percutaneous gastrostomy—including percutaneous endoscopic gastrostomy (PEG) and percutaneous fluoroscopic gastrostomy (PFG) —with NG for HNC patients. Data extraction recorded characteristics of intervention, type of study and factors that contributed to the methodological quality of the individual studies. Data were then compared with respect to nutritional status, duration of feeding, complications, radiotherapy delays, disease-free survival and overall survival. Methodological quality of RCTs and non-experimental studies were assessed with separate standard grading scales. It became apparent from our studies that both feeding strategies have advantages and disadvantages. PMID:24453356

  9. Utility of a Fluorescence Microscopy Imaging System for Analyzing the DNA Ploidy of Pathological Megakaryocytes Including 5q- Syndrome.

    PubMed

    Nakahara, Takako; Suemori, Shinichiro; Tsujioka, Takayuki; Kataoka, Mikio; Kataoka, Hiromi; Shibakura, Misako; Tohyama, Kaoru

    2018-06-01

    To investigate megakaryocyte (MK) DNA ploidy in various hematological diseases, a fluorescence microscopy imaging system (FMI) can be used to analyze DNA ploidy together with cell morphology at the single-cell level using specialized image-processing software. Here we compared DNA ploidy measured by FMI with that obtained by flow cytometry (FCM). With FMI, we could evaluate the DNA ploidy of long-term preserved bone marrow smear samples after staining. We next analyzed MK DNA ploidy in 42 bone marrow smear samples, including 26 myeloid neoplasm cases, and compared the DNA ploidy with platelet counts in the patients' peripheral blood; platelet production was significantly high relative to DNA ploidy in the myeloproliferative neoplasms group. The FMI method revealed that the patients with 5q- syndrome exhibited relatively low DNA ploidy despite high platelet counts, suggesting that increased DNA ploidy is not indispensable for abundant platelet production. The FMI method for DNA ploidy will be a useful tool for clarifying the relationship between DNA ploidy and platelet production by MKs.

  10. Beyond Blood Culture and Gram Stain Analysis: A Review of Molecular Techniques for the Early Detection of Bacteremia in Surgical Patients

    PubMed Central

    Kaplan, Heidi B.; Dua, Anahita; Litwin, Douglas B.; Ambrose, Catherine G.; Moore, Laura J.; Murray, COL Clinton K.; Wade, Charles E.; Holcomb, John B.

    2016-01-01

    Background: Sepsis from bacteremia occurs in 250,000 cases annually in the United States, has a mortality rate as high as 60%, and is associated with a poorer prognosis than localized infection. Because of these high figures, empiric antibiotic administration for patients with systemic inflammatory response syndrome (SIRS) and suspected infection is the second most common indication for antibiotic administration in intensive care units (ICUs). However, overuse of empiric antibiotics contributes to the development of opportunistic infections, antibiotic resistance, and the increase in multi-drug-resistant bacterial strains. The current method of diagnosing and ruling out bacteremia is via blood culture (BC) and Gram stain (GS) analysis. Methods: Conventional and molecular methods for diagnosing bacteremia were reviewed and compared. The clinical implications, use, and current clinical trials of polymerase chain reaction (PCR)-based methods to detect bacterial pathogens in the blood stream were detailed. Results: BC/GS has several disadvantages. These include: some bacteria do not grow in culture media; others do not Gram stain appropriately; and cultures can require up to 5 d to guide or discontinue antibiotic treatment. PCR-based methods can potentially be applied to detect microbes in human blood samples rapidly, accurately, and directly. Conclusions: Compared with conventional BC/GS, particular advantages of molecular methods (specifically, PCR-based methods) include faster results, leading to possible improved antibiotic stewardship when bacteremia is not present. PMID:26918696

  11. Comparison of automated erythrocytapheresis versus manual exchange transfusion to treat cerebral macrovasculopathy in sickle cell anemia.

    PubMed

    Koehl, Bérengère; Sommet, Julie; Holvoet, Laurent; Abdoul, Hendy; Boizeau, Priscilla; Ithier, Ghislaine; Missud, Florence; Couque, Nathalie; Verlhac, Suzanne; Voultoury, Pauline; Sellami, Fatiha; Baruchel, André; Benkerrou, Malika

    2016-05-01

    Chronic exchange transfusion is effective for primary and secondary prevention of stroke in children with sickle cell anemia (SCA). Erythrocytapheresis is recognized to be the most efficient approach; however, it is not widely implemented and is not suitable for all patients. The aim of our study was to compare automated exchange transfusion (AET) with our manual method of exchange transfusion and, in particular, to evaluate the efficacy, safety, and cost of our manual method. Thirty-nine SCA children with stroke and/or abnormal findings on transcranial Doppler were included in the study. We retrospectively analyzed 1353 exchange sessions, including 333 sessions of AET and 1020 sessions of manual exchange transfusion (MET). Both methods were well tolerated. The median decrease in hemoglobin (Hb)S per session was 21.5% with AET and 18.8% with our manual method (p < 0.0001) with no major increase in red blood cell consumption. Iron overload was well controlled, even with the manual method, with a median (interquartile range) ferritin level of 312 (152-994) µg/L after 24 months of transfusions. The main differences in annual cost relate to equipment costs, which were 74 times higher with the automated method. Our study shows that continuous MET has comparable efficacy to the automated method in terms of stroke prevention, decrease in HbS, and iron overload prevention. It is feasible in all hospital settings and is often combined with AET successively over time. © 2016 AABB.

  12. Autoclave method for rapid preparation of bacterial PCR-template DNA.

    PubMed

    Simmon, Keith E; Steadman, Dewey D; Durkin, Sarah; Baldwin, Amy; Jeffrey, Wade H; Sheridan, Peter; Horton, Rene; Shields, Malcolm S

    2004-02-01

    An autoclave method for preparing bacterial DNA as PCR template is presented; it eliminates the use of detergents, organic solvents, and mechanical cellular disruption, thereby significantly reducing processing time and costs while increasing reproducibility. Bacteria are lysed by rapid heating and depressurization in an autoclave. The lysate, cleared by microcentrifugation, was either used directly in the PCR reaction or concentrated by ultrafiltration. This approach was compared with seven established methods of DNA template preparation from four bacterial sources, which included boiling Triton X-100 and SDS, bead beating, lysozyme/proteinase K, and CTAB lysis method components. The bacteria examined were Enterococcus and Escherichia coli, a natural marine bacterial community, and an Antarctic cyanobacterial mat. DNAs were tested for their suitability as PCR templates by repetitive element random amplified polymorphic DNA (RAPD) and denaturing gradient gel electrophoresis (DGGE) analysis. The autoclave method produced PCR-amplifiable template comparable or superior to the other methods, with greater reproducibility, much shorter processing time, and significantly lower cost.

  13. Comparison of two on-orbit attitude sensor alignment methods

    NASA Technical Reports Server (NTRS)

    Krack, Kenneth; Lambertson, Michael; Markley, F. Landis

    1990-01-01

    Compared here are two methods of on-orbit alignment of vector attitude sensors. The first method uses the angular difference between simultaneous measurements from two or more sensors. These angles are compared to the angular differences between the respective reference positions of the sensed objects. The alignments of the sensors are adjusted to minimize the difference between the two sets of angles. In the second method, the sensor alignment is part of a state vector that includes the attitude. The alignments are adjusted along with the attitude to minimize all observation residuals. It is shown that the latter method can result in much less alignment uncertainty when gyroscopes are used for attitude propagation during the alignment estimation. The additional information for this increased accuracy comes from knowledge of relative attitude obtained from the spacecraft gyroscopes. The theoretical calculations of this difference in accuracy are presented. Also presented are numerical estimates of the alignment uncertainties of the fixed-head star trackers on the Extreme Ultraviolet Explorer spacecraft using both methods.
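    The first method described above reduces to comparing measured inter-sensor angles with the corresponding reference angles. The minimal sketch below computes one such residual for hypothetical unit vectors; driving these residuals toward zero by adjusting the assumed sensor alignments is the essence of the method.

        # Inter-sensor angular residual for the angular-difference alignment method.
        import numpy as np

        def angle_between(u, v):
            u = u / np.linalg.norm(u)
            v = v / np.linalg.norm(v)
            return np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0)))

        measured_a = np.array([0.10, 0.02, 0.99])    # sensor A measurement (hypothetical)
        measured_b = np.array([0.71, 0.00, 0.70])    # sensor B measurement (hypothetical)
        reference_a = np.array([0.09, 0.01, 1.00])   # catalog direction seen by A
        reference_b = np.array([0.72, 0.01, 0.69])   # catalog direction seen by B

        residual = angle_between(measured_a, measured_b) - angle_between(reference_a, reference_b)
        print(f"inter-sensor angle residual = {residual:.3f} deg")
        # The second method instead estimates the alignments jointly with the attitude
        # state, using gyroscope-propagated relative attitude as extra information.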

  14. A Hybrid On-line Verification Method of Relay Setting

    NASA Astrophysics Data System (ADS)

    Gao, Wangyuan; Chen, Qing; Si, Ji; Huang, Xin

    2017-05-01

    Along with the rapid development of the power industry, grid structures have become more sophisticated. The validity and rationality of protective relaying are vital to the security of power systems. To increase the security of power systems, it is essential to verify relay setting values online. Traditional verification methods mainly include comparison of protection ranges and comparison of calculated setting values. For on-line verification, verification speed is the key. Comparing protection ranges gives accurate results, but the computational burden is heavy and verification is slow. Comparing calculated setting values is much faster, but the results are conservative and less accurate. Taking overcurrent protection as an example, this paper analyses the advantages and disadvantages of the two traditional methods above, and proposes a hybrid method of on-line verification which synthesizes the advantages of both. This hybrid method can meet the requirements of accurate on-line verification.

  15. Modified Drop Tower Impact Tests for American Football Helmets.

    PubMed

    Rush, G Alston; Prabhu, R; Rush, Gus A; Williams, Lakiesha N; Horstemeyer, M F

    2017-02-19

    A modified National Operating Committee on Standards for Athletic Equipment (NOCSAE) test method for American football helmet drop impact test standards is presented that would provide better assessment of a helmet's on-field impact performance by including a faceguard on the helmet. In this study, a merger of faceguard and helmet test standards is proposed. The need for a more robust systematic approach to football helmet testing procedures is emphasized by comparing representative results of the Head Injury Criterion (HIC), Severity Index (SI), and peak acceleration values for different helmets at different helmet locations under modified NOCSAE standard drop tower tests. Essentially, these comparative drop test results revealed that the faceguard adds a stiffening kinematic constraint to the shell that lessens total energy absorption. The current NOCSAE standard test methods can be improved to represent on-field helmet hits by attaching the faceguards to helmets and by including two new helmet impact locations (Front Top and Front Top Boss). The reported football helmet test method gives a more accurate representation of a helmet's performance and its ability to mitigate on-field impacts while promoting safer football helmets.
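    The HIC and SI metrics referenced above have standard definitions: the Gadd Severity Index is the integral of acceleration (in g) raised to the 2.5 power over the pulse, and HIC is the maximum over time windows of the window length times the windowed mean acceleration raised to the 2.5 power. The sketch below evaluates both for a synthetic acceleration pulse; it illustrates the formulas only and does not reproduce NOCSAE test conditions.

        # Severity Index and Head Injury Criterion for a synthetic impact pulse.
        import numpy as np

        def severity_index(a, t):
            return float(np.trapz(a**2.5, t))             # SI = integral of a(t)^2.5 dt

        def head_injury_criterion(a, t):
            best = 0.0
            for i in range(len(t)):
                for j in range(i + 1, len(t)):
                    dt = t[j] - t[i]
                    mean_a = np.trapz(a[i:j + 1], t[i:j + 1]) / dt
                    best = max(best, dt * mean_a**2.5)
            return best

        t = np.linspace(0.0, 0.015, 151)                      # 15 ms record
        a = 120.0 * np.exp(-((t - 0.007) / 0.002) ** 2)       # synthetic pulse, ~120 g peak
        print(f"SI  = {severity_index(a, t):.0f}")
        print(f"HIC = {head_injury_criterion(a, t):.0f}")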

  16. MOCASSIN-prot: a multi-objective clustering approach for protein similarity networks.

    PubMed

    Keel, Brittney N; Deng, Bo; Moriyama, Etsuko N

    2018-04-15

    Proteins often include multiple conserved domains. Various evolutionary events including duplication and loss of domains, domain shuffling, as well as sequence divergence contribute to generating complexities in protein structures, and consequently, in their functions. The evolutionary history of proteins is hence best modeled through networks that incorporate information both from the sequence divergence and the domain content. Here, a game-theoretic approach proposed for protein network construction is adapted into the framework of multi-objective optimization, and extended to incorporate clustering refinement procedure. The new method, MOCASSIN-prot, was applied to cluster multi-domain proteins from ten genomes. The performance of MOCASSIN-prot was compared against two protein clustering methods, Markov clustering (TRIBE-MCL) and spectral clustering (SCPS). We showed that compared to these two methods, MOCASSIN-prot, which uses both domain composition and quantitative sequence similarity information, generates fewer false positives. It achieves more functionally coherent protein clusters and better differentiates protein families. MOCASSIN-prot, implemented in Perl and Matlab, is freely available at http://bioinfolab.unl.edu/emlab/MOCASSINprot. emoriyama2@unl.edu. Supplementary data are available at Bioinformatics online.

  17. General expressions for downlink signal to interference and noise ratio in homogeneous and heterogeneous LTE-Advanced networks.

    PubMed

    Ali, Nora A; Mourad, Hebat-Allah M; ElSayed, Hany M; El-Soudani, Magdy; Amer, Hassanein H; Daoud, Ramez M

    2016-11-01

    Interference is the most important problem in LTE and LTE-Advanced networks. In this paper, interference was investigated in terms of the downlink signal to interference and noise ratio (SINR). In order to compare the different frequency reuse methods that were developed to enhance the SINR, it is helpful to have a generalized expression for studying the performance of the different methods. Therefore, this paper introduces general expressions for the SINR in homogeneous and in heterogeneous networks. In homogeneous networks, the expression was applied to the most common types of frequency reuse techniques: soft frequency reuse (SFR) and fractional frequency reuse (FFR). The expression was examined by comparing it with previously developed ones in the literature, and the comparison showed that it is valid for any type of frequency reuse scheme and any network topology. Furthermore, the expression was extended to heterogeneous networks (HetNets); it covers co-tier and cross-tier interference and was examined by the same method as in the homogeneous case.
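    The generic form underlying such SINR expressions is the ratio of the received signal power to the sum of co-channel interference powers plus noise. The sketch below evaluates that ratio under a simple distance-based path-loss model; the transmit power, distances, noise level, and path-loss exponent are assumptions chosen only to illustrate the calculation, not parameters from the paper.

        # Generic downlink SINR = signal / (sum of interferers + noise), in dB.
        import math

        def sinr_db(p_tx_w, d_serving_m, interferer_distances_m, noise_w=1e-13, exponent=3.5):
            path_gain = lambda d: d ** (-exponent)
            signal = p_tx_w * path_gain(d_serving_m)
            interference = sum(p_tx_w * path_gain(d) for d in interferer_distances_m)
            return 10.0 * math.log10(signal / (interference + noise_w))

        print(f"SINR = {sinr_db(40.0, 200.0, [600.0, 800.0, 1200.0]):.1f} dB")
        # Reuse schemes such as SFR and FFR improve this ratio by shrinking the set of
        # co-channel interferers seen by cell-edge users.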

  18. Realistic inversion of diffraction data for an amorphous solid: The case of amorphous silicon

    NASA Astrophysics Data System (ADS)

    Pandey, Anup; Biswas, Parthapratim; Bhattarai, Bishal; Drabold, D. A.

    2016-12-01

    We apply a method called "force-enhanced atomic refinement" (FEAR) to create a computer model of amorphous silicon (a-Si) based upon the highly precise x-ray diffraction experiments of Laaziri et al. [Phys. Rev. Lett. 82, 3460 (1999), 10.1103/PhysRevLett.82.3460]. The logic underlying our calculation is to estimate the structure of a real a-Si sample using experimental data and chemical information included in an unbiased way, starting from random coordinates. The model is in close agreement with experiment and also sits at a suitable energy minimum according to density-functional calculations. In agreement with experiments, we find a small concentration of coordination defects that we discuss, including their electronic consequences. The gap states in the FEAR model are delocalized compared to a continuous random network model. The method is more efficient and accurate, in the sense of fitting the diffraction data, than conventional melt-quench methods. We compute the vibrational density of states and the specific heat, and we find that both compare favorably to experiments.

  19. Determination of N epsilon-(carboxymethyl)lysine in foods and related systems.

    PubMed

    Ames, Jennifer M

    2008-04-01

    The sensitive and specific determination of advanced glycation end products (AGEs) is of considerable interest because these compounds have been associated with pro-oxidative and proinflammatory effects in vivo. AGEs form when carbonyl compounds, such as glucose and its oxidation products, glyoxal and methylglyoxal, react with the epsilon-amino group of lysine and the guanidino group of arginine to give structures including N epsilon-(carboxymethyl)lysine (CML), N epsilon-(carboxyethyl)lysine, and hydroimidazolones. CML is frequently used as a marker for AGEs in general. It exists in both free and peptide-bound forms. Analysis of CML involves its extraction from the food (including protein hydrolysis to release any peptide-bound adduct) and determination by immunochemical or instrumental means. Various factors must be considered at each step of the analysis. Extraction, hydrolysis, and sample clean-up are all less straightforward for food samples compared with plasma and tissue. The immunochemical and instrumental methods all have their advantages and disadvantages, and no perfect method exists. Currently, different procedures are being used in different laboratories, and there is an urgent need to compare, improve, and validate methods.

  20. Chemical components and tyrosinase inhibitors from the twigs of Artocarpus heterophyllus.

    PubMed

    Zheng, Zong-Ping; Chen, Sibao; Wang, Shiyun; Wang, Xia-Chang; Cheng, Ka-Wing; Wu, Jia-Jun; Yang, Dajiang; Wang, Mingfu

    2009-08-12

    An HPLC method was developed and validated to compare the chemical profiles and tyrosinase inhibitors in the woods, twigs, roots, and leaves of Artocarpus heterophyllus . Five active tyrosinase inhibitors including dihydromorin, steppogenin, norartocarpetin, artocarpanone, and artocarpesin were used as marker compounds in this HPLC method. It was discovered that the chemical profiles of A. heterophyllus twigs and woods are quite different. Systematic chromatographic methods were further applied to purify the chemicals in the twigs of A. heterophyllus. Four new phenolic compounds, including one isoprenylated 2-arylbenzofuran derivative, artoheterophyllin A (1), and three isoprenylated flavonoids, artoheterophyllin B (2), artoheterophyllin C (3), and artoheterophyllin D (4), together with 16 known compounds, were isolated from the ethanol extract of the twigs of A. heterophyllus. The structures of compounds 1-4 were elucidated by spectroscopic analysis. However, the four new compounds did not show significant inhibitory activities against mushroom tyrosinase compared to kojic acid. It was found that similar compounds, such as norartocarpetin and artocarpesin in the twigs and woods of A. heterophyllus, contributed to their tyrosinase inhibitory activity.

  1. Analysis of Waves in Space Plasma (WISP) near field simulation and experiment

    NASA Technical Reports Server (NTRS)

    Richie, James E.

    1992-01-01

    The WISP payload, scheduled for a 1995 space transportation system (shuttle) flight, will include a high-power transmitter on board operating over a wide range of frequencies. The levels of electromagnetic interference/electromagnetic compatibility (EMI/EMC) must be addressed to ensure the safety of the shuttle crew. This report is concerned with the simulation and experimental verification of EMI/EMC for the WISP payload in the shuttle cargo bay. The simulations have been carried out using the method of moments for both thin wires and patches to simulate closed solids. Data obtained from simulation are compared with experimental results. An investigation of the accuracy of the modeling approach is also included. The report begins with a description of the WISP experiment. A description of the model used to simulate the cargo bay follows. The results of the simulation are compared to experimental data on the input impedance of the WISP antenna with the cargo bay present. A discussion of the methods used to verify the accuracy of the model is included to illustrate appropriate methods for obtaining this information. Finally, suggestions for future work are provided.

  2. Quantifying the quality of medical x-ray images: An evaluation based on normal anatomy for lumbar spine and chest radiography

    NASA Astrophysics Data System (ADS)

    Tingberg, Anders Martin

    Optimisation in diagnostic radiology requires accurate methods for determination of patient absorbed dose and clinical image quality. Simple methods for evaluation of clinical image quality are at present scarce and this project aims at developing such methods. Two methods are used and further developed: fulfillment of image criteria (IC) and visual grading analysis (VGA). Clinical image quality descriptors are defined based on these two methods: image criteria score (ICS) and visual grading analysis score (VGAS), respectively. For both methods the basis is the Image Criteria of the "European Guidelines on Quality Criteria for Diagnostic Radiographic Images". Both methods have proved to be useful for evaluation of clinical image quality. The two methods complement each other: IC is an absolute method, which means that the quality of images of different patients and produced with different radiographic techniques can be compared with each other. The separating power of IC is, however, weaker than that of VGA. VGA is the best method for comparing images produced with different radiographic techniques and has strong separating power, but the results are relative, since the quality of an image is compared to the quality of a reference image. The usefulness of the two methods has been verified by comparing the results from both of them with results from a generally accepted method for evaluation of clinical image quality, receiver operating characteristics (ROC). The results of the comparison between the two methods based on visibility of anatomical structures and the method based on detection of pathological structures (free-response forced error) indicate that the former two methods can be used for evaluation of clinical image quality as efficiently as the method based on ROC. More studies are, however, needed for us to be able to draw a general conclusion, including studies of other organs, using other radiographic techniques, etc. The results of the experimental evaluation of clinical image quality are compared with physical quantities calculated with a theoretical model based on a voxel phantom, and correlations are found. The results demonstrate that the computer model can be a useful tool in planning further experimental studies.

  3. System for detecting operating errors in a variable valve timing engine using pressure sensors

    DOEpatents

    Wiles, Matthew A.; Marriot, Craig D

    2013-07-02

    A method and control module includes a pressure sensor data comparison module that compares measured pressure volume signal segments to ideal pressure volume segments. A valve actuation hardware remedy module performs a hardware remedy in response to comparing the measured pressure volume signal segments to the ideal pressure volume segments when a valve actuation hardware failure is detected.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lentine, Anthony L.; Cox, Jonathan Albert

    Methods and systems for stabilizing a resonant modulator include receiving pre-modulation and post-modulation portions of a carrier signal, determining the average power from these portions, comparing an average input power to the average output power, and operating a heater coupled to the modulator based on the comparison. One system includes a pair of input structures, one or more processing elements, a comparator, and a control element. The input structures are configured to extract pre-modulation and post-modulation portions of a carrier signal. The processing elements are configured to determine average powers from the extracted portions. The comparator is configured to compare the average input power and the average output power. The control element operates a heater coupled to the modulator based on the comparison.
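    The control idea in this record can be pictured as a small feedback loop: compare the averaged pre-modulation and post-modulation powers and nudge the heater to hold the ring on resonance. The sketch below is a hedged illustration of one proportional-control step; the read/write callables, target ratio, and gain are hypothetical placeholders, not an actual device API.

        # One step of a simple proportional controller for resonance stabilization.
        def stabilize_step(read_input_power, read_output_power, read_heater, write_heater,
                           target_ratio=0.5, gain=0.01):
            p_in = read_input_power()     # average pre-modulation power
            p_out = read_output_power()   # average post-modulation power
            error = (p_out / p_in) - target_ratio
            write_heater(read_heater() + gain * error)
            return error

        # Called periodically, this drives the transmission ratio toward the target; a
        # practical controller would add integral action and bounds on the heater drive.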

  5. Comparative Validation of the Determination of Sofosbuvir in Pharmaceuticals by Several Inexpensive Ecofriendly Chromatographic, Electrophoretic, and Spectrophotometric Methods.

    PubMed

    El-Yazbi, Amira F

    2017-01-20

    Sofosbuvir (SOFO) was approved by the U.S. Food and Drug Administration in 2013 for the treatment of hepatitis C virus infection, with enhanced antiviral potency compared with earlier analogs. Notwithstanding, all current editions of the pharmacopeias still do not present any analytical methods for the quantification of SOFO. Thus, rapid, simple, and ecofriendly methods for the routine analysis of commercial formulations of SOFO are desirable. In this study, five accurate methods for the determination of SOFO in pharmaceutical tablets were developed and validated. These methods include HPLC, capillary zone electrophoresis, HPTLC, and UV spectrophotometric and derivative spectrometry methods. The proposed methods proved to be rapid, simple, sensitive, selective, and accurate analytical procedures that were suitable for the reliable determination of SOFO in pharmaceutical tablets. An analysis of variance test with P-value > 0.05 confirmed that there were no significant differences between the proposed assays. Thus, any of these methods can be used for the routine analysis of SOFO in commercial tablets.

  6. Method and apparatus for displaying information

    NASA Technical Reports Server (NTRS)

    Huang, Sui (Inventor); Eichler, Gabriel (Inventor); Ingber, Donald E. (Inventor)

    2010-01-01

    A method for displaying large amounts of information. The method includes the steps of forming a spatial layout of tiles each corresponding to a representative reference element; mapping observed elements onto the spatial layout of tiles of representative reference elements; assigning a respective value to each respective tile of the spatial layout of the representative elements; and displaying an image of the spatial layout of tiles of representative elements. Each tile includes atomic attributes of representative elements. The invention also relates to an apparatus for displaying large amounts of information. The apparatus includes a tiler forming a spatial layout of tiles, each corresponding to a representative reference element; a comparator mapping observed elements onto said spatial layout of tiles of representative reference elements; an assigner assigning a respective value to each respective tile of said spatial layout of representative reference elements; and a display displaying an image of the spatial layout of tiles of representative reference elements.

  7. Cooling method with automated seasonal freeze protection

    DOEpatents

    Cambell, Levi; Chu, Richard; David, Milnes; Ellsworth, Jr, Michael; Iyengar, Madhusudan; Simons, Robert; Singh, Prabjit; Zhang, Jing

    2016-05-31

    An automated multi-fluid cooling method is provided for cooling an electronic component(s). The method includes obtaining a coolant loop, and providing a coolant tank, multiple valves, and a controller. The coolant loop is at least partially exposed to outdoor ambient air temperature(s) during normal operation, and the coolant tank includes first and second reservoirs containing first and second fluids, respectively. The first fluid freezes at a lower temperature than the second, the second fluid has superior cooling properties compared with the first, and the two fluids are soluble. The multiple valves are controllable to selectively couple the first or second fluid into the coolant in the coolant loop, wherein the coolant includes at least the second fluid. The controller automatically controls the valves to vary first fluid concentration level in the coolant loop based on historical, current, or anticipated outdoor air ambient temperature(s) for a time of year.
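    The seasonal logic described above can be summarised as: keep the better-performing second fluid when outdoor temperatures are safely warm, and blend in the freeze-resistant first fluid as expected temperatures approach the second fluid's freeze margin. The sketch below is a hypothetical illustration of such a rule; the freeze point, margin, and ramp are assumptions, not values from the patent.

        # Hypothetical rule for choosing the antifreeze (first fluid) fraction.
        def target_first_fluid_fraction(min_expected_outdoor_c,
                                        freeze_point_second_fluid_c=0.0,
                                        margin_c=5.0):
            threshold = freeze_point_second_fluid_c + margin_c
            if min_expected_outdoor_c > threshold:
                return 0.0                       # warm season: pure second fluid for best cooling
            deficit = threshold - min_expected_outdoor_c
            return min(1.0, deficit / 40.0)      # ramp toward full antifreeze as it gets colder

        for temp_c in (20.0, 4.0, -10.0, -30.0):
            print(temp_c, "->", round(target_first_fluid_fraction(temp_c), 2))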

  8. Quantitative determination of atmospheric hydroperoxyl radical

    DOEpatents

    Springston, Stephen R.; Lloyd, Judith; Zheng, Jun

    2007-10-23

    A method for the quantitative determination of atmospheric hydroperoxyl radical comprising: (a) contacting a liquid phase atmospheric sample with a chemiluminescent compound which luminesces on contact with hydroperoxyl radical; (b) determining luminescence intensity from the liquid phase atmospheric sample; and (c) comparing said luminescence intensity from the liquid phase atmospheric sample to a standard luminescence intensity for hydroperoxyl radical. An apparatus for automating the method is also included.

  9. Value Tendency Differences between Pre-Service Social Studies Teachers within the Scope of the East and the West

    ERIC Educational Resources Information Center

    Osmanoglu, Ahmed Emin

    2017-01-01

    This study aims to comparatively examine the values that the students of the Department of Social Studies in Education Faculty at two universities located in the Eastern and Western parts of Turkey desire to find in people they interact with. Multiple methods, including quantitative and qualitative methods, were used in this study. The research…

  10. Transient multivariable sensor evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vilim, Richard B.; Heifetz, Alexander

    A method and system for performing transient multivariable sensor evaluation. The method and system include a computer system for identifying a model form, providing training measurement data, generating a basis vector, monitoring system data from sensors, loading the system data into non-transient memory, performing an estimation to provide desired data, comparing the system data to the desired data, and outputting an alarm for a defective sensor.
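    One common way to realise the basis-vector estimation described above is to learn a low-dimensional basis from training data (for example via SVD/PCA), reconstruct each new multivariable sample from that basis, and alarm when the reconstruction residual is large. The sketch below is a hedged illustration of that idea; the dimensions, synthetic data, and threshold are assumptions and do not reproduce the patented method.

        # Basis-vector sensor check: project onto a learned basis and inspect the residual.
        import numpy as np

        def fit_basis(training, n_components=3):
            mean = training.mean(axis=0)
            _, _, vt = np.linalg.svd(training - mean, full_matrices=False)
            return mean, vt[:n_components]

        def residual(sample, mean, basis):
            centered = sample - mean
            estimate = basis.T @ (basis @ centered)      # reconstruction from the basis
            return float(np.linalg.norm(centered - estimate))

        rng = np.random.default_rng(0)
        training = rng.normal(size=(500, 6))             # hypothetical healthy-sensor data
        mean, basis = fit_basis(training)
        drifted = training[1] + np.array([0, 0, 6.0, 0, 0, 0])   # one channel drifts
        print("healthy residual:", round(residual(training[1], mean, basis), 2))
        print("drifted residual:", round(residual(drifted, mean, basis), 2))
        # An alarm would be raised when the residual exceeds a threshold calibrated on
        # the training data.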

  11. Impact of crisis resource management simulation-based training for interprofessional and interdisciplinary teams: A systematic review.

    PubMed

    Fung, Lillia; Boet, Sylvain; Bould, M Dylan; Qosa, Haytham; Perrier, Laure; Tricco, Andrea; Tavares, Walter; Reeves, Scott

    2015-01-01

    Crisis resource management (CRM) abilities are important for different healthcare providers to effectively manage critical clinical events. This study aims to review the effectiveness of simulation-based CRM training for interprofessional and interdisciplinary teams compared to other instructional methods (e.g., didactics). Interprofessional teams are composed of several professions (e.g., nurse, physician, midwife) while interdisciplinary teams are composed of several disciplines from the same profession (e.g., cardiologist, anaesthesiologist, orthopaedist). Medline, EMBASE, CINAHL, Cochrane Central Register of Controlled Trials, and ERIC were searched using terms related to CRM, crisis management, crew resource management, teamwork, and simulation. Trials comparing simulation-based CRM team training versus any other methods of education were included. The educational interventions involved interprofessional or interdisciplinary healthcare teams. The initial search identified 7456 publications; 12 studies were included. Simulation-based CRM team training was associated with significant improvements in CRM skill acquisition in all but two studies when compared to didactic case-based CRM training or simulation without CRM training. Of the 12 included studies, one showed significant improvements in team behaviours in the workplace, while two studies demonstrated sustained reductions in adverse patient outcomes after a single simulation-based CRM team intervention. In conclusion, CRM simulation-based training for interprofessional and interdisciplinary teams show promise in teaching CRM in the simulator when compared to didactic case-based CRM education or simulation without CRM teaching. More research, however, is required to demonstrate transfer of learning to workplaces and potential impact on patient outcomes.

  12. Culture methods impact recovery of antibiotic-resistant Enterococci including Enterococcus cecorum from pre- and postharvest chicken.

    PubMed

    Suyemoto, M M; Barnes, H J; Borst, L B

    2017-03-01

    Pathogenic strains of Enterococcus cecorum (EC) expressing multidrug resistance have emerged. In National Antimicrobial Resistance Monitoring System (NARMS) data, EC is rarely recovered from chickens. Two NARMS methodologies (FDA and USDA) were compared with standard culture (SC) techniques for recovery of EC. NARMS methods failed to detect EC in 58 caecal samples, 20 chicken breast or six whole broiler samples. EC was recovered from 1 of 38 (2·6%) and 2 of 38 (5·2%) preharvest spinal lesions (USDA and FDA method, respectively). In contrast, using the SC method, EC was recovered from 44 of 53 (83%) caecal samples, all 38 (100%) spinal lesions, 14 of 20 (70%) chicken breast samples, and all three spinal lesions identified in whole carcasses. Compared with other Enterococcus spp., EC isolates had a higher prevalence of resistance to macrolides. The NARMS methods significantly affected recovery of enterococcal species other than EC. When the postharvest FDA method was applied to preharvest caecal samples, isolates of Enterococcus faecium were preferentially recovered. All 11 E. faecium isolates were multidrug resistant, including resistance to penicillin, daptomycin and linezolid. These findings confirm that current methodologies may not accurately identify the amount and range of antimicrobial resistance of enterococci from chicken sources. Enterococci are an important reservoir for antimicrobial resistance. This study demonstrates how current culture methods underreport resistance to macrolides in enterococci by selecting against strains of Enterococcus cecorum in pre- and postharvest chicken. Further, the application of postharvest surveillance methods to preharvest samples resulted in selective recovery of Enterococcus faecium over Enterococcus faecalis. Isolates of E. faecium recovered exhibited multidrug resistance including penicillin, daptomycin and linezolid resistance. These findings suggest that culture methodology significantly impacts the range and amount of antimicrobial resistance detected in enterococci isolated from chicken. © 2016 The Society for Applied Microbiology.

  13. Assessing alcohol intake & its dose-dependent effects on liver enzymes by 24-h recall and questionnaire using NHANES 2001-2010 data

    DOE PAGES

    Agarwal, Sanjiv; Fulgoni, III, Victor L.; Lieberman, Harris R.

    2016-06-22

    Alcohol is a significant component of the diet with dose-dependent risks and benefits. High doses of alcohol damage the liver and early symptoms of liver disease include changes in routinely assessed liver enzymes. Less is known regarding the mechanisms responsible for the benefits of moderate alcohol consumption, including their effects on the liver. The objectives of this study were to examine alcohol’s dose-dependent effects on markers of liver function (alkaline phosphatase (ALP), alanine aminotransferase (ALT), aspartate aminotransferase (AST), gamma glutamyl transferase (GGT), and bilirubin), as well as to compare the different methods of assessing alcohol intake using NHANES 2001–2010 adult data (N = 24,807). Three methods were used to estimate alcohol intake from all volunteers: 24-h recall; the National Cancer Institute (NCI) method of usual intake; and a specific alcohol intake questionnaire. Mean alcohol intake by 24-h recall, NCI method and questionnaire was 41.0 ± 0.8 g/d, 10.9 ± 0.2 g/d and 11.0 ± 0.2 g/d, respectively. Alcohol consumers had significantly lower levels of ALP and higher levels of AST, GGT and bilirubin compared to non-consumers (P < 0.01) and activities of ALT, AST, and GGT increased and of ALP decreased as alcohol intake increased, regardless of intake assessment method used. The most sensitive measure of alcohol consumption was GGT. Since alcohol had a graded linear effect on several liver enzymes, including at low and moderate doses, benefits as well as risks of alcohol intake may be related to liver function. In conclusion, since the NCI method and alcohol questionnaire yielded very similar alcohol intake estimates, this study cross-validated these methods and demonstrated the robustness of the NCI method for estimating intake of irregularly consumed foods.

  14. Assessing alcohol intake & its dose-dependent effects on liver enzymes by 24-h recall and questionnaire using NHANES 2001-2010 data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Sanjiv; Fulgoni, III, Victor L.; Lieberman, Harris R.

    Alcohol is a significant component of the diet with dose-dependent risks and benefits. High doses of alcohol damage the liver and early symptoms of liver disease include changes in routinely assessed liver enzymes. Less is known regarding the mechanisms responsible for the benefits of moderate alcohol consumption, including their effects on the liver. The objectives of this study were to examine alcohol’s dose-dependent effects on markers of liver function (alkaline phosphatase (ALP), alanine aminotransferase (ALT), aspartate aminotransferase (AST), gamma glutamyl transferase (GGT), and bilirubin), as well as to compare the different methods of assessing alcohol intake using NHANES 2001–2010 adult data (N = 24,807). Three methods were used to estimate alcohol intake from all volunteers: 24-h recall; the National Cancer Institute (NCI) method of usual intake; and a specific alcohol intake questionnaire. Mean alcohol intake by 24-h recall, NCI method and questionnaire was 41.0 ± 0.8 g/d, 10.9 ± 0.2 g/d and 11.0 ± 0.2 g/d, respectively. Alcohol consumers had significantly lower levels of ALP and higher levels of AST, GGT and bilirubin compared to non-consumers (P < 0.01) and activities of ALT, AST, and GGT increased and of ALP decreased as alcohol intake increased, regardless of intake assessment method used. The most sensitive measure of alcohol consumption was GGT. Since alcohol had a graded linear effect on several liver enzymes, including at low and moderate doses, benefits as well as risks of alcohol intake may be related to liver function. In conclusion, since the NCI method and alcohol questionnaire yielded very similar alcohol intake estimates, this study cross-validated these methods and demonstrated the robustness of the NCI method for estimating intake of irregularly consumed foods.

  15. Summary of tracking and identification methods

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Yang, Chun; Kadar, Ivan

    2014-06-01

    Over the last two decades, many solutions have arisen to combine target tracking estimation with classification methods. Target tracking includes developments from linear to non-linear and Gaussian to non-Gaussian processing. Pattern recognition includes detection, classification, recognition, and identification methods. Integrating tracking and pattern recognition has resulted in numerous approaches, and this paper seeks to organize them. We discuss the terminology so as to have a common framework for various standards such as the NATO STANAG 4162 - Identification Data Combining Process. In a use case, we provide a comparative example highlighting that location information, together with additional mission objectives from geographical, human, social, cultural, and behavioral modeling, is needed to determine identification, since classification alone cannot determine identification or intent.

  16. A Method of Dynamic Extended Reactive Power Optimization in Distribution Network Containing Photovoltaic-Storage System

    NASA Astrophysics Data System (ADS)

    Wang, Wu; Huang, Wei; Zhang, Yongjun

    2018-03-01

    The grid integration of Photovoltaic-Storage Systems introduces uncertain factors into the network. In order to make full use of the adjusting ability of the Photovoltaic-Storage System (PSS), this paper puts forward a reactive power optimization model whose objective function is based on power loss and device adjusting cost, including the energy storage adjusting cost. A Cataclysmic Genetic Algorithm is used to solve this optimization problem, and a comparison with other optimization methods shows that the proposed dynamic extended reactive power optimization method improves the effectiveness of reactive power optimization, reducing power loss and device adjusting cost while maintaining voltage safety.
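
    As a rough illustration of the optimization set-up described above, the sketch below (Python) minimizes a combined objective of power loss and device adjusting cost with a plain genetic algorithm. The quadratic loss term, the weights, and the simple truncation selection are assumptions for illustration; the paper evaluates a real power-flow model and uses a Cataclysmic GA variant, neither of which is reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def objective(x, w_loss=1.0, w_cost=0.2):
        # Stand-in objective: "power loss" is a quadratic penalty around a
        # nominal setting and "device adjusting cost" penalizes control movement.
        # A real study would run a power-flow calculation for each candidate x.
        power_loss = np.sum((x - 0.3) ** 2)
        adjusting_cost = np.sum(np.abs(np.diff(x, prepend=0.0)))
        return w_loss * power_loss + w_cost * adjusting_cost

    pop = rng.uniform(-1.0, 1.0, size=(40, 6))              # candidate control settings
    for _ in range(200):
        fitness = np.array([objective(x) for x in pop])
        parents = pop[np.argsort(fitness)[:20]]              # keep the best half
        mates = parents[rng.integers(0, 20, size=(20, 2))]   # random parent pairs
        children = mates.mean(axis=1)                        # arithmetic crossover
        children += rng.normal(0.0, 0.05, children.shape)    # mutation
        pop = np.vstack([parents, children])

    best = pop[np.argmin([objective(x) for x in pop])]
    print(np.round(best, 3))
    ```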

  17. Evaluation of Carbon Anodes for Rechargeable Lithium Cells

    NASA Technical Reports Server (NTRS)

    Huang, C-K.; Surampudi, S.; Attia, A.; Halpert, G.

    1993-01-01

    Both liquid-phase and electrochemical intercalation techniques were examined for preparing Li-carbon materials. The electrochemical techniques include an intermittent discharge method and a two-step method; both can ensure that the maximum reversible Li capacity is reached for common commercially available carbon materials. The carbon materials evaluated by the intercalation method include pitch coke, petroleum coke, PAN fiber, and graphite materials. Their reversible Li capacities were determined and compared. In this paper, we also demonstrate the importance of the EPDM binder composition in the carbon electrode; our results indicate that it can affect the Li intercalation and de-intercalation capacity of carbon materials. Finally, two possible explanations for the capacity degradation observed during practical cell cycling are proposed.

  18. Resonant frequency method for bearing ball inspection

    DOEpatents

    Khuri-Yakub, B. T.; Hsieh, Chung-Kao

    1993-01-01

    The present invention provides for an inspection system and method for detecting defects in test objects which includes means for generating expansion inducing energy focused upon the test object at a first location, such expansion being allowed to contract, thereby causing pressure waves within and on the surface of the test object. Such expansion inducing energy may be provided by, for example, a laser beam or ultrasonic energy. At a second location, the amplitudes and phases of the acoustic waves are detected and the resonant frequencies' quality factors are calculated and compared to predetermined quality factor data, such comparison providing information as to whether the test object contains a defect. The inspection system and method also includes means for mounting the bearing ball for inspection.

  19. Resonant frequency method for bearing ball inspection

    DOEpatents

    Khuri-Yakub, B.T.; Chungkao Hsieh.

    1993-11-02

    The present invention provides for an inspection system and method for detecting defects in test objects which includes means for generating expansion inducing energy focused upon the test object at a first location, such expansion being allowed to contract, thereby causing pressure waves within and on the surface of the test object. Such expansion inducing energy may be provided by, for example, a laser beam or ultrasonic energy. At a second location, the amplitudes and phases of the acoustic waves are detected and the resonant frequencies' quality factors are calculated and compared to predetermined quality factor data, such comparison providing information as to whether the test object contains a defect. The inspection system and method also includes means for mounting the bearing ball for inspection. 5 figures.

  20. A method for including external feed in depletion calculations with CRAM and implementation into ORIGEN

    DOE PAGES

    Isotalo, Aarno E.; Wieselquist, William A.

    2015-05-15

    A method for including external feed with polynomial time dependence in depletion calculations with the Chebyshev Rational Approximation Method (CRAM) is presented and the implementation of CRAM to the ORIGEN module of the SCALE suite is described. In addition to being able to handle time-dependent feed rates, the new solver also adds the capability to perform adjoint calculations. Results obtained with the new CRAM solver and the original depletion solver of ORIGEN are compared to high precision reference calculations, which shows the new solver to be orders of magnitude more accurate. Lastly, in most cases, the new solver is up to several times faster due to not requiring similar substepping as the original one.
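
    For readers unfamiliar with depletion-with-feed problems, the sketch below (Python) advances the linear system dN/dt = A·N + f over one step for a constant feed vector using a dense matrix exponential. This is only a stand-in for the CRAM rational approximation and handles constant rather than polynomial-in-time feed; the two-nuclide chain and decay constants are illustrative.

    ```python
    import numpy as np
    from scipy.linalg import expm

    def deplete_with_constant_feed(A, n0, feed, dt):
        # Exact one-step solution of dN/dt = A N + f for constant f:
        #   N(dt) = e^{A dt} N0 + A^{-1} (e^{A dt} - I) f
        # (assumes A is non-singular; CRAM would evaluate the exponential
        # via a rational approximation instead of a dense expm).
        E = expm(A * dt)
        particular = np.linalg.solve(A, (E - np.eye(len(n0))) @ feed)
        return E @ n0 + particular

    # Illustrative two-nuclide chain: nuclide 0 decays to nuclide 1, both decay.
    lam1, lam2 = 1.0e-3, 2.0e-4                 # decay constants [1/s]
    A = np.array([[-lam1, 0.0],
                  [ lam1, -lam2]])
    n0 = np.array([1.0e6, 0.0])                 # initial number densities
    feed = np.array([10.0, 0.0])                # constant feed of nuclide 0 [atoms/s]
    print(deplete_with_constant_feed(A, n0, feed, dt=3600.0))
    ```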

  1. Time-dependent solution for axisymmetric flow over a blunt body with ideal gas, CF4, or equilibrium air chemistry

    NASA Technical Reports Server (NTRS)

    Hamilton, H. H., II; Spall, J. R.

    1986-01-01

    A time-asymptotic method has been used to obtain steady-flow solutions for axisymmetric inviscid flow over several blunt bodies including spheres, paraboloids, ellipsoids, and spherically blunted cones. Comparisons with experimental data and results of other computational methods have demonstrated that accurate solutions can be obtained using this approach. The method should prove useful as an analysis tool for comparing with experimental data and for making engineering calculations for blunt reentry vehicles.

  2. Time-dependent solution for axisymmetric flow over a blunt body with ideal gas, CF4, or equilibrium air chemistry

    NASA Astrophysics Data System (ADS)

    Hamilton, H. H., II; Spall, J. R.

    1986-07-01

    A time-asymptotic method has been used to obtain steady-flow solutions for axisymmetric inviscid flow over several blunt bodies including spheres, paraboloids, ellipsoids, and spherically blunted cones. Comparisons with experimental data and results of other computational methods have demonstrated that accurate solutions can be obtained using this approach. The method should prove useful as an analysis tool for comparing with experimental data and for making engineering calculations for blunt reentry vehicles.

  3. Numerical approach for ECT by using boundary element method with Laplace transform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Enokizono, M.; Todaka, T.; Shibao, K.

    1997-03-01

    This paper presents an inverse analysis using BEM with the Laplace transform. The method is applied to a simple problem in eddy current testing (ECT). Some crack shapes in a conductive specimen are estimated from distributions of the transient eddy current on its sensing surface and of the magnetic flux density in the liftoff space. Because the transient behavior includes information on various frequency components, the method is applicable to the shape estimation of comparatively small cracks.

  4. Analysis of drift correction in different simulated weighing schemes

    NASA Astrophysics Data System (ADS)

    Beatrici, A.; Rebelo, A.; Quintão, D.; Cacais, F. L.; Loayza, V. M.

    2015-10-01

    In the calibration of high accuracy mass standards, some weighing schemes are used to reduce or eliminate zero drift effects in mass comparators. There are different sources for the drift and different methods for its treatment. Using numerical methods, drift functions were simulated and a random term was included in each function. A comparison between the results obtained from the ABABAB and ABBA weighing series was carried out. The results show a better efficacy of the ABABAB method for drifts with smooth variation and small randomness.
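
    A toy simulation of the two weighing schemes can make the comparison concrete. The sketch below (Python) assumes a linear comparator drift plus Gaussian noise, one time unit per reading, and estimates the A-B mass difference as the mean of the A readings minus the mean of the B readings; the drift functions and random terms actually simulated in the paper are more varied.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def readings(order, true_mass, drift_rate, sigma):
        # One comparator reading per time unit; linear drift plus Gaussian noise.
        return np.array([true_mass[o] + drift_rate * t + rng.normal(0.0, sigma)
                         for t, o in enumerate(order)])

    def estimated_difference(order, r):
        a = np.mean([x for x, o in zip(r, order) if o == "A"])
        b = np.mean([x for x, o in zip(r, order) if o == "B"])
        return a - b

    true_mass = {"A": 100.000050, "B": 100.000000}   # grams: A is 50 ug heavier
    for order in ("ABBA", "ABABAB"):
        estimates = [estimated_difference(order,
                                          readings(order, true_mass, 2e-6, 1e-6))
                     for _ in range(5000)]
        print(order, np.mean(estimates), np.std(estimates))
    ```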

  5. A Gaussian Approximation Potential for Silicon

    NASA Astrophysics Data System (ADS)

    Bernstein, Noam; Bartók, Albert; Kermode, James; Csányi, Gábor

    We present an interatomic potential for silicon using the Gaussian Approximation Potential (GAP) approach, which uses the Gaussian process regression method to approximate the reference potential energy surface as a sum of atomic energies. Each atomic energy is approximated as a function of the local environment around the atom, which is described with the smooth overlap of atomic environments (SOAP) descriptor. The potential is fit to a database of energies, forces, and stresses calculated using density functional theory (DFT) on a wide range of configurations from zero and finite temperature simulations. These include crystalline phases, liquid, amorphous, and low coordination structures, and diamond-structure point defects, dislocations, surfaces, and cracks. We compare the results of the potential to DFT calculations, as well as to previously published models including Stillinger-Weber, Tersoff, modified embedded atom method (MEAM), and ReaxFF. We show that it is very accurate as compared to the DFT reference results for a wide range of properties, including low energy bulk phases, liquid structure, as well as point, line, and plane defects in the diamond structure.
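
    A toy Gaussian process regression in the spirit of the GAP fit is sketched below (Python, scikit-learn). The descriptors and energies are synthetic stand-ins; the actual potential uses SOAP descriptors, sums of atomic energies, and sparse GP machinery that are not reproduced here.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))                        # fake local-environment descriptors
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)     # fake per-atom energies

    # Fit a GP mapping descriptor -> atomic energy and predict with uncertainty.
    gp = GaussianProcessRegressor(
        kernel=RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2),
        normalize_y=True,
    ).fit(X, y)

    E_pred, E_std = gp.predict(rng.normal(size=(5, 5)), return_std=True)
    print(E_pred, E_std)
    ```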

  6. Quantitation of TGF-beta1 mRNA in porcine mesangial cells by comparative kinetic RT/PCR: comparison with ribonuclease protection assay and in situ hybridization.

    PubMed

    Ceol, M; Forino, M; Gambaro, G; Sauer, U; Schleicher, E D; D'Angelo, A; Anglani, F

    2001-01-01

    Gene expression can be examined with different techniques including ribonuclease protection assay (RPA), in situ hybridisation (ISH), and quantitative reverse transcription-polymerase chain reaction (RT/PCR). These methods differ considerably in their sensitivity and precision in detecting and quantifying low abundance mRNA. Although there is evidence that RT/PCR can be performed in a quantitative manner, the quantitative capacity of this method is generally underestimated. To demonstrate that the comparative kinetic RT/PCR strategy, which uses a housekeeping gene as an internal standard, is a quantitative method to detect significant differences in mRNA levels between different samples, the inhibitory effect of heparin on phorbol 12-myristate 13-acetate (PMA)-induced TGF-beta1 mRNA expression was evaluated by RT/PCR and by RPA, the standard method of mRNA quantification, and the results were compared. The reproducibility of RT/PCR amplification was calculated by comparing the quantities of G3PDH and TGF-beta1 PCR products, generated during the exponential phases, estimated from two different RT/PCR runs (G3PDH, r = 0.968, P = 0.0000; TGF-beta1, r = 0.966, P = 0.0000). The quantitative capacity of comparative kinetic RT/PCR was demonstrated by comparing the results obtained from RPA and RT/PCR using linear regression analysis. Starting from the same RNA extraction, but using only 1% of the RNA for RT/PCR compared to RPA, a significant correlation was observed (r = 0.984, P = 0.0004). Moreover, the morphometric analysis of the ISH signal was applied for the semi-quantitative evaluation of the expression and localisation of TGF-beta1 mRNA in the entire cell population. Our results demonstrate the close similarity of the RT/PCR and RPA methods in giving quantitative information on mRNA expression and indicate that comparative kinetic RT/PCR can be adopted as a reliable quantitative method of mRNA analysis. Copyright 2001 Wiley-Liss, Inc.

  7. GenomeFingerprinter: the genome fingerprint and the universal genome fingerprint analysis for systematic comparative genomics.

    PubMed

    Ai, Yuncan; Ai, Hannan; Meng, Fanmei; Zhao, Lei

    2013-01-01

    No attention has been paid to comparing a set of genome sequences across genetic components and biological categories with wide divergence over a large size range. We define this as systematic comparative genomics and aim to develop its methodology. First, we create a method, GenomeFingerprinter, to unambiguously produce a set of three-dimensional coordinates from a sequence, followed by one three-dimensional plot and six two-dimensional trajectory projections, to illustrate the genome fingerprint of a given genome sequence. Second, we develop a set of concepts and tools, and thereby establish a method called the universal genome fingerprint analysis (UGFA). In particular, we define the total genetic component configuration (TGCC) (including chromosome, plasmid, and phage) for describing a strain as a systematic unit, the universal genome fingerprint map (UGFM) of the TGCC for differentiating strains as a universal system, and systematic comparative genomics (SCG) for comparing a set of genomes across genetic components and biological categories. Third, we construct a method of quantitative analysis to compare two genomes using the outcome dataset of the genome fingerprint analysis. Specifically, we define the geometric center and its geometric mean for a given genome fingerprint map, followed by the Euclidean distance, the differentiate rate, and the weighted differentiate rate to quantitatively describe the difference between two genomes under comparison. Moreover, we demonstrate the applications through case studies on various genome sequences, giving tremendous insights into critical issues in microbial genomics and taxonomy. We have created a method, GenomeFingerprinter, for rapidly computing, geometrically visualizing, and intuitively comparing a set of genomes at the genome fingerprint level, and have thereby established a method called the universal genome fingerprint analysis, as well as a method for quantitative analysis of the outcome dataset. Together, these set up the methodology of systematic comparative genomics based on genome fingerprint analysis.
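
    The coordinate-walk idea can be illustrated in a few lines of Python. The nucleotide-to-step mapping, the geometric center, and the Euclidean distance below follow the general description in the abstract, but the specific step vectors are assumptions rather than the authors' actual rule.

    ```python
    import numpy as np

    # Assumed step per base; the authors' actual coordinate rule is not reproduced here.
    STEP = {"A": (1, 0, 0), "C": (-1, 0, 0), "G": (0, 1, 0), "T": (0, 0, 1)}

    def fingerprint(seq):
        # Cumulative 3D walk: one (x, y, z) point per base in the sequence.
        steps = np.array([STEP[b] for b in seq.upper() if b in STEP], dtype=float)
        return np.cumsum(steps, axis=0)

    def geometric_center(fp):
        return fp.mean(axis=0)

    def center_distance(seq1, seq2):
        # Euclidean distance between the geometric centers of two fingerprints.
        return float(np.linalg.norm(geometric_center(fingerprint(seq1)) -
                                    geometric_center(fingerprint(seq2))))

    print(center_distance("ACGTACGGTT", "ACGTTCGGTA"))
    ```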

  8. Improved epileptic seizure detection combining dynamic feature normalization with EEG novelty detection.

    PubMed

    Bogaarts, J G; Hilkman, D M W; Gommer, E D; van Kranen-Mastenbroek, V H J M; Reulen, J P H

    2016-12-01

    Continuous electroencephalographic monitoring of critically ill patients is an established procedure in intensive care units. Seizure detection algorithms, such as support vector machines (SVM), play a prominent role in this procedure. To correct for inter-human differences in EEG characteristics, as well as for intra-human EEG variability over time, dynamic EEG feature normalization is essential. Recently, the median decaying memory (MDM) approach was determined to be the best method of normalization. MDM uses a sliding baseline buffer of EEG epochs to calculate feature normalization constants. However, while this method does include non-seizure EEG epochs, it also includes EEG activity that can have a detrimental effect on the normalization and subsequent seizure detection performance. In this study, EEG data that is to be incorporated into the baseline buffer are automatically selected based on a novelty detection algorithm (Novelty-MDM). Performance of an SVM-based seizure detection framework is evaluated in 17 long-term ICU registrations using the area under the sensitivity-specificity ROC curve. This evaluation compares three different EEG normalization methods, namely a fixed baseline buffer (FB), the median decaying memory (MDM) approach, and our novelty median decaying memory (Novelty-MDM) method. It is demonstrated that MDM did not improve overall performance compared to FB (p < 0.27), partly because seizure like episodes were included in the baseline. More importantly, Novelty-MDM significantly outperforms both FB (p = 0.015) and MDM (p = 0.0065).
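
    A simplified sketch of median-decaying-memory normalization with a novelty gate is given below (Python). The buffer length, the MAD-based spread, and the z-score gate are assumptions for illustration; the paper's novelty detector and SVM feature set are not reproduced.

    ```python
    import numpy as np
    from collections import deque

    class NoveltyMDM:
        """Sliding-buffer median normalization of EEG feature epochs.

        Epochs are normalized against the median of a decaying-memory buffer;
        an epoch is added to the buffer only if it is not 'novel' (here: all
        normalized features below a z-like threshold), so seizure-like activity
        is kept out of the baseline.
        """

        def __init__(self, buffer_len=120, novelty_threshold=3.0):
            self.buffer = deque(maxlen=buffer_len)
            self.novelty_threshold = novelty_threshold

        def normalize(self, features):
            features = np.asarray(features, dtype=float)
            if not self.buffer:
                self.buffer.append(features)
                return np.zeros_like(features)
            stacked = np.vstack(self.buffer)
            baseline = np.median(stacked, axis=0)
            spread = np.median(np.abs(stacked - baseline), axis=0) + 1e-12
            z = (features - baseline) / spread
            if np.max(np.abs(z)) < self.novelty_threshold:
                self.buffer.append(features)
            return z
    ```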

  9. Extraction of organic contaminants from marine sediments and tissues using microwave energy.

    PubMed

    Jayaraman, S; Pruell, R J; McKinney, R

    2001-07-01

    In this study, we compared microwave solvent extraction (MSE) to conventional methods for extracting organic contaminants from marine sediments and tissues with high and varying moisture content. The organic contaminants measured were polychlorinated biphenyl (PCB) congeners, chlorinated pesticides, and polycyclic aromatic hydrocarbons (PAHs). Initial experiments were conducted on dry standard reference materials (SRMs) and field collected marine sediments. Moisture content in samples greatly influenced the recovery of the analytes of interest. When wet sediments were included in a sample batch, low recoveries were often encountered in other samples in the batch, including the dry SRM. Experiments were conducted to test the effect of standardizing the moisture content in all samples in a batch prior to extraction. SRM1941a (marine sediment) and SRM1974a (mussel tissue), as well as QA96SED6 (marine sediment) and QA96TIS7 (marine tissue), both from the 1996 NIST Intercalibration Exercise, were extracted using microwave and conventional methods. Moisture levels were adjusted in SRMs to match those of marine sediment and tissue samples before microwave extraction. The results demonstrated that it is crucial to standardize the moisture content in all samples, including dry reference material, to ensure good recovery of organic contaminants. MSE yielded equivalent or superior recoveries compared to conventional methods for the majority of the compounds evaluated. The advantages of MSE over conventional methods are reduced solvent usage, higher sample throughput and the elimination of halogenated solvent usage.

  10. An Introduction to Photomicrography.

    ERIC Educational Resources Information Center

    Judson, Peter

    1979-01-01

    Described are various methods for producing black and white photographs of microscope slides using single lens reflex, fixed lens, and plate cameras. Procedures for illumination, film processing, mounting, and projection are also discussed. A table of comparative film speeds is included. (CS)

  11. Comparison of Efficacy of Eye Movement, Desensitization and Reprocessing and Cognitive Behavioral Therapy Therapeutic Methods for Reducing Anxiety and Depression of Iranian Combatant Afflicted by Post Traumatic Stress Disorder

    NASA Astrophysics Data System (ADS)

    Narimani, M.; Sadeghieh Ahari, S.; Rajabi, S.

    This research aims to determine and compare the efficacy of two therapeutic methods, Eye Movement Desensitization and Reprocessing (EMDR) and Cognitive Behavioral Therapy (CBT), for reducing the anxiety and depression of Iranian combatants afflicted with Post Traumatic Stress Disorder (PTSD) after the imposed war. The statistical population of the current study comprises combatants afflicted with PTSD who were hospitalized in Isar Hospital of Ardabil province or resided in Ardabil. These persons were selected through simple random sampling and were randomly assigned to three groups. The method was an extended test method and the study design was a multi-group test-retest design. The instrument used was the Hospital Anxiety and Depression Scale. This survey showed that both EMDR and CBT produced a significant reduction in anxiety and depression.

  12. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    PubMed

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  13. Comparison of methods for measuring atmospheric deposition of arsenic, cadmium, nickel and lead.

    PubMed

    Aas, Wenche; Alleman, Laurent Y; Bieber, Elke; Gladtke, Dieter; Houdret, Jean-Luc; Karlsson, Vuokko; Monies, Christian

    2009-06-01

    A comprehensive field intercomparison at four different types of European sites (two rural, one urban and one industrial) comparing three different collectors (wet only, bulk and Bergerhoff samplers) was conducted in the framework of the European Committee for Standardization (CEN) to create a European standard for the deposition of the four elements As, Cd, Ni and Pb. The purpose was to determine whether the proposed methods lead to results within the uncertainty required by the EU's daughter directive (70%). The main conclusion is that a different sampling strategy is needed for rural and industrial sites. Thus, the conclusions on uncertainties and sampling approach are presented separately for the different approaches. The wet only and bulk collector ("bulk bottle method") are comparable at wet rural sites where the total deposition arises mainly from precipitation; the expanded uncertainty when comparing these two types of sampler is below 45% for As, Cd and Pb, and 67% for Ni. At industrial sites and possibly very dry rural and urban sites it is necessary to use Bergerhoff samplers or a "bulk bottle+funnel method". It is not possible to address the total deposition estimation with these methods, but they will give the lowest estimate of the total deposition. The expanded uncertainties when comparing the Bergerhoff and the bulk bottle+funnel methods are below 50% for As and Cd, and 63% for Pb. The uncertainty for Ni was not addressed since the bulk bottle+funnel method did not include a full digestion procedure, which is necessary for sites with high loads of undissolved metals. The lowest estimate can however be calculated by comparing parallel Bergerhoff samplers, where the expanded uncertainty for Ni was 24%. The reproducibility is comparable to the between-sampler/method uncertainties. Sampling and sample preparation proved to be the main factors in the uncertainty budget of deposition measurements.

  14. Preliminary evaluation of a gel tube agglutination major cross-match method in dogs.

    PubMed

    Villarnovo, Dania; Burton, Shelley A; Horney, Barbara S; MacKenzie, Allan L; Vanderstichel, Raphaël

    2016-09-01

    A major cross-match gel tube test is available for use in dogs yet has not been clinically evaluated. This study compared cross-match results obtained using the gel tube and the standard tube methods for canine samples. Study 1 included 107 canine sample donor-recipient pairings cross-match tested with the RapidVet-H method gel tube test and compared results with the standard tube method. Additionally, 120 pairings using pooled sera containing anti-canine erythrocyte antibody at various concentrations were tested with leftover blood from a hospital population to assess sensitivity and specificity of the gel tube method in comparison with the standard method. The gel tube method had a good relative specificity of 96.1% in detecting lack of agglutination (compatibility) compared to the standard tube method. Agreement between the 2 methods was moderate. Nine of 107 pairings showed agglutination/incompatibility on either test, too few to allow reliable calculation of relative sensitivity. Fifty percent of the gel tube method results were difficult to interpret due to sample spreading in the reaction and/or negative control tubes. The RapidVet-H method agreed with the standard cross-match method on compatible samples, but detected incompatibility in some sample pairs that were compatible with the standard method. Evaluation using larger numbers of incompatible pairings is needed to assess diagnostic utility. The gel tube method results were difficult to categorize due to sample spreading. Weak agglutination reactions or other factors such as centrifuge model may be responsible. © 2016 American Society for Veterinary Clinical Pathology.
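
    The relative sensitivity and specificity quoted above come from a simple 2x2 comparison against the standard tube method. A minimal sketch (Python) is shown below; the boolean encoding (True = agglutination/incompatible) and the example values are assumptions for illustration.

    ```python
    def relative_agreement(gel, standard):
        # gel, standard: iterables of booleans, True = agglutination (incompatible),
        # with the standard tube method treated as the reference.
        tp = sum(g and s for g, s in zip(gel, standard))
        tn = sum(not g and not s for g, s in zip(gel, standard))
        fp = sum(g and not s for g, s in zip(gel, standard))
        fn = sum(not g and s for g, s in zip(gel, standard))
        sensitivity = tp / (tp + fn) if tp + fn else float("nan")
        specificity = tn / (tn + fp) if tn + fp else float("nan")
        return sensitivity, specificity

    print(relative_agreement([True, False, False, True], [True, False, True, True]))
    ```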

  15. Statistical procedures for evaluating daily and monthly hydrologic model predictions

    USGS Publications Warehouse

    Coffey, M.E.; Workman, S.R.; Taraba, J.L.; Fogle, A.W.

    2004-01-01

    The overall study objective was to evaluate the applicability of different qualitative and quantitative methods for comparing daily and monthly SWAT computer model hydrologic streamflow predictions to observed data, and to recommend statistical methods for use in future model evaluations. Statistical methods were tested using daily streamflows and monthly equivalent runoff depths. The statistical techniques included linear regression, Nash-Sutcliffe efficiency, nonparametric tests, t-test, objective functions, autocorrelation, and cross-correlation. None of the methods specifically applied to the non-normal distribution and dependence between data points for the daily predicted and observed data. Of the tested methods, median objective functions, sign test, autocorrelation, and cross-correlation were most applicable for the daily data. The robust coefficient of determination (CD*) and robust modeling efficiency (EF*) objective functions were the preferred methods for daily model results due to the ease of comparing these values with a fixed ideal reference value of one. Predicted and observed monthly totals were more normally distributed, and there was less dependence between individual monthly totals than was observed for the corresponding predicted and observed daily values. More statistical methods were available for comparing SWAT model-predicted and observed monthly totals. The 1995 monthly SWAT model predictions and observed data had a regression R² of 0.70, a Nash-Sutcliffe efficiency of 0.41, and the t-test failed to reject the equal data means hypothesis. The Nash-Sutcliffe coefficient and the R² coefficient were the preferred methods for monthly results due to the ability to compare these coefficients to a set ideal value of one.
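
    For reference, the two preferred monthly statistics are easy to compute directly. The sketch below (Python) implements the Nash-Sutcliffe efficiency and a regression R²; the example values are illustrative, not data from the study.

    ```python
    import numpy as np

    def nash_sutcliffe(observed, predicted):
        # 1.0 is a perfect fit; 0.0 means no better than predicting the observed mean.
        observed = np.asarray(observed, dtype=float)
        predicted = np.asarray(predicted, dtype=float)
        return 1.0 - np.sum((observed - predicted) ** 2) / np.sum((observed - observed.mean()) ** 2)

    def r_squared(observed, predicted):
        # Squared Pearson correlation between predictions and observations.
        return float(np.corrcoef(observed, predicted)[0, 1] ** 2)

    obs = [12.0, 30.5, 22.1, 8.4, 15.0]    # illustrative monthly runoff depths (mm)
    pred = [10.2, 33.0, 20.0, 9.1, 13.5]
    print(nash_sutcliffe(obs, pred), r_squared(obs, pred))
    ```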

  16. Pattern and Process in the Comparative Study of Convergent Evolution.

    PubMed

    Mahler, D Luke; Weber, Marjorie G; Wagner, Catherine E; Ingram, Travis

    2017-08-01

    Understanding processes that have shaped broad-scale biodiversity patterns is a fundamental goal in evolutionary biology. The development of phylogenetic comparative methods has yielded a tool kit for analyzing contemporary patterns by explicitly modeling processes of change in the past, providing neontologists tools for asking questions previously accessible only for select taxa via the fossil record or laboratory experimentation. The comparative approach, however, differs operationally from alternative approaches to studying convergence in that, for studies of only extant species, convergence must be inferred using evolutionary process models rather than being directly measured. As a result, investigation of evolutionary pattern and process cannot be decoupled in comparative studies of convergence, even though such a decoupling could in theory guard against adaptationist bias. Assumptions about evolutionary process underlying comparative tools can shape the inference of convergent pattern in sometimes profound ways and can color interpretation of such patterns. We discuss these issues and other limitations common to most phylogenetic comparative approaches and suggest ways that they can be avoided in practice. We conclude by promoting a multipronged approach to studying convergence that integrates comparative methods with complementary tests of evolutionary mechanisms and includes ecological and biogeographical perspectives. Carefully employed, the comparative method remains a powerful tool for enriching our understanding of convergence in macroevolution, especially for investigation of why convergence occurs in some settings but not others.

  17. Effects of Health Level 7 Messaging on Data Quality in New York City's Immunization Information System, 2014

    PubMed Central

    Papadouka, Vikki; Ternier, Alexandra; Zucker, Jane R.

    2016-01-01

    Objective We compared the quality of data reported to New York City's immunization information system, the Citywide Immunization Registry (CIR), through its real-time Health Level 7 (HL7) Web service from electronic health records (EHRs), with data submitted through other methods. Methods We stratified immunizations administered and reported to the CIR in 2014 for patients aged 0–18 years by reporting method: (1) sending HL7 messages from EHRs through the Web service, (2) manual data entry, and (3) upload of a non-standard flat file from EHRs. We assessed completeness of reporting by measuring the percentage of immunizations reported with lot number, manufacturer, and Vaccines for Children (VFC) program eligibility. We assessed timeliness of reporting by determining the number of days from date of administration to date entered into the CIR. Results HL7 reporting accounted for the largest percentage (46.3%) of the 3.8 million immunizations reported in 2014. Of immunizations reported using HL7, 97.9% included the lot number and 92.6% included the manufacturer, compared with 50.4% and 48.0% for manual entry, and 65.9% and 48.8% for non-standard flat file, respectively. VFC eligibility was 96.9% complete when reported by manual data entry, 95.3% complete for HL7 reporting, and 87.2% complete for non-standard flat file reporting. Of the three reporting methods, HL7 was the most timely: 77.6% of immunizations were reported by HL7 in <1 day, compared with 53.6% of immunizations reported through manual data entry and 18.1% of immunizations reported through non-standard flat file. Conclusion HL7 reporting from EHRs resulted in more complete and timely data in the CIR compared with other reporting methods. Providing resources to facilitate HL7 reporting from EHRs to immunization information systems to increase data quality should be a priority for public health. PMID:27453603

  18. The contribution of Raman spectroscopy to the analytical quality control of cytotoxic drugs in a hospital environment: eliminating the exposure risks for staff members and their work environment.

    PubMed

    Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette

    2014-08-15

    The purpose of the study was to perform a comparative analysis of the technical performance, respective costs and environmental effect of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared to a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performances of the three analytical techniques. Statistical inter-method correlation analysis was performed using non-parametric correlation rank tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In any case, analytical validation parameters of the three techniques were satisfactory, and strong correlations between the two spectroscopic techniques vs. HPLC were found. In addition, Raman spectroscopy was found to be superior as compared to the other techniques for numerous key criteria including a complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior for technical, economic and environmental objectives, as compared with the other invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. A quantitative analysis of qualitative studies in clinical journals for the 2000 publishing year

    PubMed Central

    McKibbon, Kathleen Ann; Gadd, Cynthia S

    2004-01-01

    Background Qualitative studies are becoming more recognized as important to understanding health care with all of its richness and complexities. The purpose of this descriptive survey was to provide a quantitative evaluation of the qualitative studies published in 170 core clinical journals for 2000. Methods All identified studies that used qualitative methods were reviewed to ascertain which clinical journals publish qualitative studies and to extract research methods, content (persons and health care issues studied), and whether mixed methods (quantitative and qualitative methods) were used. Results 60 330 articles were reviewed. 355 reports of original qualitative studies and 12 systematic review articles were identified in 48 journals. Most of the journals were in the discipline of nursing. Only 4 of the most highly cited health care journals, based on ISI Science Citation Index (SCI) Impact Factors, published qualitative studies. 37 of the 355 original reports used both qualitative and quantitative (mixed) methods. Patients and non-health care settings were the most common groups of people studied. Diseases and conditions were cancer, mental health, pregnancy and childbirth, and cerebrovascular disease, with many other diseases and conditions represented. Phenomenology and grounded theory were commonly used; substantial ethnography was also present. No substantial differences were noted for content or methods when articles published in all disciplines were compared with articles published in nursing titles or when studies with mixed methods were compared with studies that included only qualitative methods. Conclusions The clinical literature includes many qualitative studies although they are often published in nursing journals or in journals with low SCI Impact Factors. Many qualitative studies incorporate both qualitative and quantitative methods. PMID:15271221

  20. Flow and Turbulence Modeling and Computation of Shock Buffet Onset for Conventional and Supercritical Airfoils

    NASA Technical Reports Server (NTRS)

    Bartels, Robert E.

    1998-01-01

    Flow and turbulence models applied to the problem of shock buffet onset are studied. The accuracy of the interactive boundary layer and the thin-layer Navier-Stokes equations solved with recent upwind techniques using similar transport field equation turbulence models is assessed for standard steady test cases, including conditions having significant shock separation. The two methods are found to compare well in the shock buffet onset region of a supercritical airfoil that involves strong trailing-edge separation. A computational analysis using the interactive-boundary layer has revealed a Reynolds scaling effect in the shock buffet onset of the supercritical airfoil, which compares well with experiment. The methods are next applied to a conventional airfoil. Steady shock-separated computations of the conventional airfoil with the two methods compare well with experiment. Although the interactive boundary layer computations in the shock buffet region compare well with experiment for the conventional airfoil, the thin-layer Navier-Stokes computations do not. These findings are discussed in connection with possible mechanisms important in the onset of shock buffet and the constraints imposed by current numerical modeling techniques.

  1. Orientation of airborne laser scanning point clouds with multi-view, multi-scale image blocks.

    PubMed

    Rönnholm, Petri; Hyyppä, Hannu; Hyyppä, Juha; Haggrén, Henrik

    2009-01-01

    Comprehensive 3D modeling of our environment requires integration of terrestrial and airborne data, which is collected, preferably, using laser scanning and photogrammetric methods. However, integration of these multi-source data requires accurate relative orientations. In this article, two methods for solving relative orientation problems are presented. The first method includes registration by minimizing the distances between an airborne laser point cloud and a 3D model. The 3D model was derived from photogrammetric measurements and terrestrial laser scanning points. The first method was used as a reference and for validation. Having completed registration in the object space, the relative orientation between images and laser point cloud is known. The second method utilizes an interactive orientation method between a multi-scale image block and a laser point cloud. The multi-scale image block includes both aerial and terrestrial images. Experiments with the multi-scale image block revealed that the accuracy of a relative orientation increased when more images were included in the block. The orientations of the first and second methods were compared. The comparison showed that correct rotations were the most difficult to detect accurately by using the interactive method. Because the interactive method forces laser scanning data to fit with the images, inaccurate rotations cause corresponding shifts to image positions. However, in a test case, in which the orientation differences included only shifts, the interactive method could solve the relative orientation of an aerial image and airborne laser scanning data repeatedly within a couple of centimeters.

  2. Orientation of Airborne Laser Scanning Point Clouds with Multi-View, Multi-Scale Image Blocks

    PubMed Central

    Rönnholm, Petri; Hyyppä, Hannu; Hyyppä, Juha; Haggrén, Henrik

    2009-01-01

    Comprehensive 3D modeling of our environment requires integration of terrestrial and airborne data, which is collected, preferably, using laser scanning and photogrammetric methods. However, integration of these multi-source data requires accurate relative orientations. In this article, two methods for solving relative orientation problems are presented. The first method includes registration by minimizing the distances between an airborne laser point cloud and a 3D model. The 3D model was derived from photogrammetric measurements and terrestrial laser scanning points. The first method was used as a reference and for validation. Having completed registration in the object space, the relative orientation between images and laser point cloud is known. The second method utilizes an interactive orientation method between a multi-scale image block and a laser point cloud. The multi-scale image block includes both aerial and terrestrial images. Experiments with the multi-scale image block revealed that the accuracy of a relative orientation increased when more images were included in the block. The orientations of the first and second methods were compared. The comparison showed that correct rotations were the most difficult to detect accurately by using the interactive method. Because the interactive method forces laser scanning data to fit with the images, inaccurate rotations cause corresponding shifts to image positions. However, in a test case, in which the orientation differences included only shifts, the interactive method could solve the relative orientation of an aerial image and airborne laser scanning data repeatedly within a couple of centimeters. PMID:22454569
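
    The first method's idea, registration by minimizing distances between the laser point cloud and a reference model, can be sketched as an iterative closest-point style loop. The version below (Python, SciPy) estimates only a translation for brevity; the article also solves rotations and uses a photogrammetric 3D model rather than the synthetic point set assumed here.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def register_translation(points, model, n_iter=20):
        # Iteratively shift the laser points so that the mean vector to their
        # nearest model points goes to zero (translation-only closest-point loop).
        shift = np.zeros(3)
        tree = cKDTree(model)
        for _ in range(n_iter):
            _, idx = tree.query(points + shift)
            shift += (model[idx] - (points + shift)).mean(axis=0)
        return shift

    rng = np.random.default_rng(0)
    model = rng.uniform(0.0, 10.0, size=(1000, 3))                       # reference model points
    points = model[rng.choice(1000, 300)] + np.array([0.4, -0.2, 0.1])   # shifted "laser" points
    print(register_translation(points, model))                           # roughly (-0.4, 0.2, -0.1)
    ```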

  3. Comparing methods of measuring geographic patterns in temporal trends: an application to county-level heart disease mortality in the United States, 1973 to 2010.

    PubMed

    Vaughan, Adam S; Kramer, Michael R; Waller, Lance A; Schieb, Linda J; Greer, Sophia; Casper, Michele

    2015-05-01

    To demonstrate the implications of choosing analytical methods for quantifying spatiotemporal trends, we compare the assumptions, implementation, and outcomes of popular methods using county-level heart disease mortality in the United States between 1973 and 2010. We applied four regression-based approaches (joinpoint regression, both aspatial and spatial generalized linear mixed models, and Bayesian space-time model) and compared resulting inferences for geographic patterns of local estimates of annual percent change and associated uncertainty. The average local percent change in heart disease mortality from each method was -4.5%, with the Bayesian model having the smallest range of values. The associated uncertainty in percent change differed markedly across the methods, with the Bayesian space-time model producing the narrowest range of variance (0.0-0.8). The geographic pattern of percent change was consistent across methods with smaller declines in the South Central United States and larger declines in the Northeast and Midwest. However, the geographic patterns of uncertainty differed markedly between methods. The similarity of results, including geographic patterns, for magnitude of percent change across these methods validates the underlying spatial pattern of declines in heart disease mortality. However, marked differences in degree of uncertainty indicate that Bayesian modeling offers substantially more precise estimates. Copyright © 2015 Elsevier Inc. All rights reserved.
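
    All four approaches ultimately summarize each county as an annual percent change (APC). The sketch below (Python) shows the basic quantity via a log-linear least-squares fit; the mixed-model and Bayesian space-time variants in the paper additionally borrow strength across years and neighbouring counties, which is not reproduced here, and the rates are synthetic.

    ```python
    import numpy as np

    def annual_percent_change(years, rates):
        # Fit log(rate) = a + b*year by least squares; APC = 100*(exp(b) - 1).
        slope, _ = np.polyfit(np.asarray(years, dtype=float),
                              np.log(np.asarray(rates, dtype=float)), 1)
        return 100.0 * (np.exp(slope) - 1.0)

    years = np.arange(1973, 1983)
    rates = 520.0 * 0.955 ** np.arange(10)      # synthetic deaths per 100,000, about -4.5 %/yr
    print(round(annual_percent_change(years, rates), 2))
    ```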

  4. Performance of the Lot Quality Assurance Sampling Method Compared to Surveillance for Identifying Inadequately-performing Areas in Matlab, Bangladesh

    PubMed Central

    Hanifi, S.M.A.; Roy, Nikhil; Streatfield, P. Kim

    2007-01-01

    This paper compared the performance of the lot quality assurance sampling (LQAS) method in identifying inadequately-performing health work-areas with that of using health and demographic surveillance system (HDSS) data and examined the feasibility of applying the method by field-level programme supervisors. The study was carried out in Matlab, the field site of ICDDR,B, where a HDSS has been in place for over 30 years. The LQAS method was applied in 57 work-areas of community health workers in ICDDR,B-served areas in Matlab during July-September 2002. The performance of the LQAS method in identifying work-areas with adequate and inadequate coverage of various health services was compared with those of the HDSS. The health service-coverage indicators included coverage of DPT, measles, BCG vaccination, and contraceptive use. It was observed that the difference in the proportion of work-areas identified to be inadequately performing using the LQAS method with less than 30 respondents, and the HDSS was not statistically significant. The consistency between the LQAS method and the HDSS in identifying work-areas was greater for adequately-performing areas than inadequately-performing areas. It was also observed that the field managers could be trained to apply the LQAS method in monitoring their performance in reaching the target population. PMID:17615902

  5. Performance of the lot quality assurance sampling method compared to surveillance for identifying inadequately-performing areas in Matlab, Bangladesh.

    PubMed

    Bhuiya, Abbas; Hanifi, S M A; Roy, Nikhil; Streatfield, P Kim

    2007-03-01

    This paper compared the performance of the lot quality assurance sampling (LQAS) method in identifying inadequately-performing health work-areas with that of using health and demographic surveillance system (HDSS) data and examined the feasibility of applying the method by field-level programme supervisors. The study was carried out in Matlab, the field site of ICDDR,B, where a HDSS has been in place for over 30 years. The LQAS method was applied in 57 work-areas of community health workers in ICDDR,B-served areas in Matlab during July-September 2002. The performance of the LQAS method in identifying work-areas with adequate and inadequate coverage of various health services was compared with those of the HDSS. The health service-coverage indicators included coverage of DPT, measles, BCG vaccination, and contraceptive use. It was observed that the difference in the proportion of work-areas identified to be inadequately performing using the LQAS method with less than 30 respondents, and the HDSS was not statistically significant. The consistency between the LQAS method and the HDSS in identifying work-areas was greater for adequately-performing areas than inadequately-performing areas. It was also observed that the field managers could be trained to apply the LQAS method in monitoring their performance in reaching the target population.
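
    The classification step in LQAS reduces to a binomial decision rule: with a small sample of n respondents, pass a work-area if at least d are covered. The sketch below (Python, SciPy) searches for such a threshold given an upper (target) and lower (worst-acceptable) coverage level; the coverage levels and risk limits shown are illustrative, not the parameters used in the Matlab study.

    ```python
    from scipy.stats import binom

    def lqas_threshold(n, target, worst_acceptable, alpha=0.10, beta=0.10):
        # Find the largest d such that
        #   P(fewer than d covered | true coverage = target)           <= alpha  and
        #   P(at least d covered   | true coverage = worst_acceptable) <= beta.
        for d in range(n, -1, -1):
            risk_fail_good = binom.cdf(d - 1, n, target)
            risk_pass_poor = 1.0 - binom.cdf(d - 1, n, worst_acceptable)
            if risk_fail_good <= alpha and risk_pass_poor <= beta:
                return d, risk_fail_good, risk_pass_poor
        return None

    # e.g. lots of 19 respondents, 80% coverage target, 50% worst acceptable
    print(lqas_threshold(19, 0.80, 0.50))
    ```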

  6. [Introduction and advantage analysis of the stepwise method for the construction of vascular trees].

    PubMed

    Zhang, Yan; Xie, Haiwei; Zhu, Kai

    2010-08-01

    A new method for constructing models of vascular trees is proposed in this paper. With this method, arterial trees in good agreement with the actual structure can be grown. In this process, all vessels in the vascular tree are divided into two groups, conveying vessels and delivering branches, and the two kinds of branches can be built in different ways. First, the distribution rules of the conveying vessels are ascertained from measurement data, and the conveying vessels are then constructed in accordance with these statistical rules and an optimization criterion. Finally, the delivering branches are modeled by constrained constructive optimization (CCO) on the conveying vessel trees that have already been generated. To compare the CCO method and the stepwise method proposed here, two 3D arterial trees of the human tongue, whose vascular tree has a special structure, were grown. Based on corrosion casts of the real arterial tree of the human tongue, data for the two trees constructed by the different methods were compared and analyzed, including the averaged segment diameters at each level and the distribution and diameters of the first-level branches in each direction. The results show that the vascular tree built by the stepwise method is more similar to the true arterial tree of the human tongue than the tree built by the CCO method.

  7. [Study on trace elements of lake sediments by ICP-AES and XRF core scanning].

    PubMed

    Cheng, Ai-Ying; Yu, Jun-Qing; Gao, Chun-Liang; Zhang, Li-Sha; He, Xian-Hu

    2013-07-01

    This is the first study of the sediment of Toson Lake in the Qaidam Basin. Trace elements in the lake sediment, including Cd, Cr, Cu, Zn and Pb, were measured by the ICP-AES method; different dissolution (digestion) procedures were studied and optimized, and an optimum pretreatment system for Toson Lake sediment, an HCl-HNO3-HF-HClO4-H2O2 system in the proportions of 5 : 5 : 5 : 1 : 1, was determined. At the same time, the data were compared with those measured by XRF core scanning, the use of a moisture-content correction method was analyzed, and the influence of moisture content on the scanning method was discussed. The results showed that, compared to background values, the contents of Cd and Zn were somewhat elevated, while the contents of Cr, Cu and Pb were within the background limits. XRF core scanning is controlled to some extent by the sediment elements as well as by the water content of the sediment. The results of the two methods showed a significant positive correlation, with correlation coefficients of 0.673-0.925, indicating good comparability.

  8. Multi-objective optimization for an automated and simultaneous phase and baseline correction of NMR spectral data

    NASA Astrophysics Data System (ADS)

    Sawall, Mathias; von Harbou, Erik; Moog, Annekathrin; Behrens, Richard; Schröder, Henning; Simoneau, Joël; Steimers, Ellen; Neymeyr, Klaus

    2018-04-01

    Spectral data preprocessing is an integral and sometimes inevitable part of chemometric analyses. For Nuclear Magnetic Resonance (NMR) spectra a possible first preprocessing step is a phase correction which is applied to the Fourier transformed free induction decay (FID) signal. This preprocessing step can be followed by a separate baseline correction step. Especially if series of high-resolution spectra are considered, then automated and computationally fast preprocessing routines are desirable. A new method is suggested that applies the phase and the baseline corrections simultaneously in an automated form without manual input, which distinguishes this work from other approaches. The underlying multi-objective optimization or Pareto optimization provides improved results compared to consecutively applied correction steps. The optimization process uses an objective function which applies strong penalty constraints and weaker regularization conditions. The new method includes an approach for the detection of zero baseline regions. The baseline correction uses a modified Whittaker smoother. The functionality of the new method is demonstrated for experimental NMR spectra. The results are verified against gravimetric data. The method is compared to alternative preprocessing tools. Additionally, the simultaneous correction method is compared to a consecutive application of the two correction steps.
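
    As background for the baseline-correction ingredient, a plain (unmodified) Whittaker smoother is sketched below (Python, SciPy sparse). It minimizes ||W(y - z)||² + λ||D_d z||²; the modified smoother, the zero-baseline detection, and the Pareto-optimal coupling with the phase correction described above are not reproduced, and the toy spectrum is illustrative.

    ```python
    import numpy as np
    from scipy import sparse
    from scipy.sparse.linalg import spsolve

    def whittaker_smooth(y, lam=1e6, d=2, weights=None):
        # Solve (W + lam * D^T D) z = W y, where D is the d-th order difference matrix.
        n = len(y)
        D = sparse.eye(n, format="csr")
        for _ in range(d):
            D = D[1:] - D[:-1]
        w = np.ones(n) if weights is None else np.asarray(weights, dtype=float)
        W = sparse.diags(w)
        return spsolve((W + lam * D.T @ D).tocsc(), w * y)

    # Toy spectrum: sloping baseline plus two narrow peaks; small weights under the
    # peaks would let the smooth curve track the baseline only.
    x = np.linspace(0.0, 10.0, 500)
    y = 0.5 * x + np.exp(-((x - 3.0) ** 2) / 0.01) + np.exp(-((x - 7.0) ** 2) / 0.01)
    baseline = whittaker_smooth(y)
    print(baseline[:3])
    ```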

  9. Quantifying the biases in metagenome mining for realistic assessment of microbial ecology of naturally fermented foods.

    PubMed

    Keisam, Santosh; Romi, Wahengbam; Ahmed, Giasuddin; Jeyaram, Kumaraswamy

    2016-09-27

    Cultivation-independent investigation of microbial ecology is biased by the DNA extraction methods used. We aimed to quantify those biases by comparative analysis of the metagenome mined from four diverse naturally fermented foods (bamboo shoot, milk, fish, soybean) using eight different DNA extraction methods with different cell lysis principles. Our findings revealed that the enzymatic lysis yielded higher eubacterial and yeast metagenomic DNA from the food matrices compared to the widely used chemical and mechanical lysis principles. Further analysis of the bacterial community structure by Illumina MiSeq amplicon sequencing revealed a high recovery of lactic acid bacteria by the enzymatic lysis in all food types. However, Bacillaceae, Acetobacteraceae, Clostridiaceae and Proteobacteria were more abundantly recovered when mechanical and chemical lysis principles were applied. The biases generated due to the differential recovery of operational taxonomic units (OTUs) by different DNA extraction methods including DNA and PCR amplicons mix from different methods have been quantitatively demonstrated here. The different methods shared only 29.9-52.0% of the total OTUs recovered. Although similar comparative research has been performed on other ecological niches, this is the first in-depth investigation of quantifying the biases in metagenome mining from naturally fermented foods.

  10. Analysis of volatile organic compounds. [trace amounts of organic volatiles in gas samples

    NASA Technical Reports Server (NTRS)

    Zlatkis, A. (Inventor)

    1977-01-01

    An apparatus and method are described for reproducibly analyzing trace amounts of a large number of organic volatiles existing in a gas sample. Direct injection of the trapped volatiles into a cryogenic precolumn provides a sharply defined plug. Applications of the method include: (1) analyzing the headspace gas of body fluids and comparing a profile of the organic volatiles with standard profiles for the detection and monitoring of disease; (2) analyzing the headspace gas of foods and beverages and comparing the profile with standard profiles to monitor and control flavor and aroma; and (3) analyses for determining the organic pollutants in air or water samples.

  11. On the Finite Element Implementation of the Generalized Method of Cells Micromechanics Constitutive Model

    NASA Technical Reports Server (NTRS)

    Wilt, T. E.

    1995-01-01

    The Generalized Method of Cells (GMC), a micromechanics-based constitutive model, is implemented into the finite element code MARC using the user subroutine HYPELA. Comparisons in terms of transverse deformation response, micro stress and strain distributions, and required CPU time are presented for GMC and finite element models of a fiber/matrix unit cell. GMC is shown to provide comparable predictions of the composite behavior and requires significantly less CPU time as compared to a finite element analysis of the unit cell. Details as to the organization of the HYPELA code are provided, with the actual HYPELA code included in the appendix.

  12. Costs and Efficiency of Online and Offline Recruitment Methods: A Web-Based Cohort Study.

    PubMed

    Christensen, Tina; Riis, Anders H; Hatch, Elizabeth E; Wise, Lauren A; Nielsen, Marie G; Rothman, Kenneth J; Toft Sørensen, Henrik; Mikkelsen, Ellen M

    2017-03-01

    The Internet is widely used to conduct research studies on health issues. Many different methods are used to recruit participants for such studies, but little is known about how various recruitment methods compare in terms of efficiency and costs. The aim of our study was to compare online and offline recruitment methods for Internet-based studies in terms of efficiency (number of recruited participants) and costs per participant. We employed several online and offline recruitment methods to enroll 18- to 45-year-old women in an Internet-based Danish prospective cohort study on fertility. Offline methods included press releases, posters, and flyers. Online methods comprised advertisements placed on five different websites, including Facebook and Netdoktor.dk. We defined seven categories of mutually exclusive recruitment methods and used electronic tracking via unique Uniform Resource Locator (URL) and self-reported data to identify the recruitment method for each participant. For each method, we calculated the average cost per participant and efficiency, that is, the total number of recruited participants. We recruited 8252 study participants. Of these, 534 were excluded as they could not be assigned to a specific recruitment method. The final study population included 7724 participants, of whom 803 (10.4%) were recruited by offline methods, 3985 (51.6%) by online methods, 2382 (30.8%) by online methods not initiated by us, and 554 (7.2%) by other methods. Overall, the average cost per participant was €6.22 for online methods initiated by us versus €9.06 for offline methods. Costs per participant ranged from €2.74 to €105.53 for online methods and from €0 to €67.50 for offline methods. Lowest average costs per participant were for those recruited from Netdoktor.dk (€2.99) and from Facebook (€3.44). In our Internet-based cohort study, online recruitment methods were superior to offline methods in terms of efficiency (total number of participants enrolled). The average cost per recruited participant was also lower for online than for offline methods, although costs varied greatly among both online and offline recruitment methods. We observed a decrease in the efficiency of some online recruitment methods over time, suggesting that it may be optimal to adopt multiple online methods. ©Tina Christensen, Anders H Riis, Elizabeth E Hatch, Lauren A Wise, Marie G Nielsen, Kenneth J Rothman, Henrik Toft Sørensen, Ellen M Mikkelsen. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 01.03.2017.
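
    The reported averages are simple ratios of channel spend to the number recruited; a minimal sketch of that arithmetic follows, with spend figures that are hypothetical placeholders back-calculated only to reproduce the quoted averages:

      # Hypothetical spend per recruitment channel; participant counts follow the abstract.
      channels = {
          "online (initiated by us)": {"spend_eur": 24787.0, "participants": 3985},
          "offline": {"spend_eur": 7275.0, "participants": 803},
      }
      for name, ch in channels.items():
          print(f"{name}: {ch['spend_eur'] / ch['participants']:.2f} EUR per participant")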

  13. Rapid fusion method for the determination of Pu, Np, and Am in large soil samples

    DOE PAGES

    Maxwell, Sherrod L.; Culligan, Brian; Hutchison, Jay B.; ...

    2015-02-14

    A new rapid sodium hydroxide fusion method for the preparation of 10-20 g soil samples has been developed by the Savannah River National Laboratory (SRNL). The method enables lower detection limits for plutonium, neptunium, and americium in environmental soil samples. The method also significantly reduces sample processing time and acid fume generation compared to traditional soil digestion techniques using hydrofluoric acid. Ten gram soil aliquots can be ashed and fused using the new method in 1-2 hours, completely dissolving samples, including refractory particles. Pu, Np and Am are separated using stacked 2 mL cartridges of TEVA and DGA Resin and measured using alpha spectrometry. The method can be adapted for measurement by inductively-coupled plasma mass spectrometry (ICP-MS). Two 10 g soil aliquots of fused soil may be combined prior to chromatographic separations to further improve detection limits. Total sample preparation time, including chromatographic separations and alpha spectrometry source preparation, is less than 8 hours.

  14. Selecting risk factors: a comparison of discriminant analysis, logistic regression and Cox's regression model using data from the Tromsø Heart Study.

    PubMed

    Brenn, T; Arnesen, E

    1985-01-01

    For comparative evaluation, discriminant analysis, logistic regression and Cox's model were used to select risk factors for total and coronary deaths among 6595 men aged 20-49 followed for 9 years. Groups with mortality between 5 and 93 per 1000 were considered. Discriminant analysis selected variable sets only marginally different from the logistic and Cox methods which always selected the same sets. A time-saving option, offered for both the logistic and Cox selection, showed no advantage compared with discriminant analysis. Analysing more than 3800 subjects, the logistic and Cox methods consumed, respectively, 80 and 10 times more computer time than discriminant analysis. When including the same set of variables in non-stepwise analyses, all methods estimated coefficients that in most cases were almost identical. In conclusion, discriminant analysis is advocated for preliminary or stepwise analysis, otherwise Cox's method should be used.
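
    A present-day sketch of the same three-way comparison on a toy cohort (the column names, values, and the use of scikit-learn and lifelines are illustrative assumptions, not the Tromsø data or software):

      import pandas as pd
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.linear_model import LogisticRegression
      from lifelines import CoxPHFitter

      # Hypothetical cohort: three risk factors, follow-up time (years), death indicator.
      df = pd.DataFrame({
          "age":  [34, 45, 29, 52, 41, 38, 60, 47],
          "sbp":  [120, 150, 115, 160, 140, 130, 170, 145],
          "chol": [5.1, 6.8, 4.9, 7.2, 6.0, 5.5, 7.5, 6.3],
          "time": [9.0, 4.2, 9.0, 2.5, 9.0, 7.1, 1.8, 9.0],
          "dead": [0, 1, 0, 1, 0, 1, 1, 0],
      })
      X, y = df[["age", "sbp", "chol"]], df["dead"]

      lda = LinearDiscriminantAnalysis().fit(X, y)          # discriminant analysis
      logit = LogisticRegression(max_iter=1000).fit(X, y)   # logistic regression
      cox = CoxPHFitter().fit(df, duration_col="time", event_col="dead")  # Cox's model

      # The estimated coefficients from the three methods can then be compared side by side.
      print(lda.coef_, logit.coef_, cox.params_.values, sep="\n")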

  15. Comparison between the use of an ultrasonic tip and a microhead handpiece in periradicular surgery: a prospective randomised trial.

    PubMed

    Shearer, Jane; McManners, Joseph

    2009-07-01

    Innovations in periradicular surgery for failed treatment of orthograde root canal disease have been well-documented. We know of no prospective studies that have compared success rates of conventional methods with these presumed advances. In this prospective randomised trial we compare the use of an ultrasonic retrotip with a microhead bur in the preparation of a retrograde cavity. Outcome was assessed clinically by estimation of pain, swelling, and sinus, and radiographically by looking at infill of bone and retrograde root filling 2 weeks and 6 months postoperatively. Both methods used other surgical techniques including microinstruments to place the retrograde root filling. The success rate of the ultrasonic method (26 of 26 patients) was higher than that of the microhead method (19 of 21). A larger study with longer follow-up is required to consolidate this evidence.

  16. Characterization of primary standards for use in the HPLC analysis of the procyanidin content of cocoa and chocolate containing products.

    PubMed

    Hurst, William J; Stanley, Bruce; Glinski, Jan A; Davey, Matthew; Payne, Mark J; Stuart, David A

    2009-10-15

    This report describes the characterization of a series of commercially available procyanidin standards ranging from dimers (DP = 2) to decamers (DP = 10) for the determination of procyanidins from cocoa and chocolate. Using a combination of HPLC with fluorescence detection and MALDI-TOF mass spectrometry, the purity of each standard was determined and these data were used to determine relative response factors. These response factors were compared with other response factors obtained from published methods. Data comparing the procyanidin analysis of a commercially available US dark chocolate calculated using each of the calibration methods indicate divergent results and demonstrate that previous methods may significantly underreport the procyanidins in cocoa-containing products. These results have far-reaching implications because the previous calibration methods have been used to develop data for a variety of scientific reports, including food databases and clinical studies.

  17. A new way of analyzing occlusion 3 dimensionally.

    PubMed

    Hayasaki, Haruaki; Martins, Renato Parsekian; Gandini, Luiz Gonzaga; Saitoh, Issei; Nonaka, Kazuaki

    2005-07-01

    This article introduces a new method for 3-dimensional dental cast analysis, by using a mechanical 3-dimensional digitizer, MicroScribe 3DX (Immersion, San Jose, Calif), and TIGARO software (not yet released, but available from the author at hayasaki@dent.kyushu-u.ac.jp ). By digitizing points on the model, multiple measurements can be made, including tooth dimensions; arch length, width, and perimeter; curve of Spee; overjet and overbite; and anteroposterior discrepancy. The bias of the system can be evaluated by comparing the distance between 2 points as determined by the new system and as measured with digital calipers. Fifteen pairs of models were measured digitally and manually, and the bias was evaluated by comparing the variances of both methods and checking for the type of error obtained by each method. No systematic errors were found. The results showed that the method is accurate, and it can be applied to both clinical practice and research.

  18. Comparing and Contrasting Consensus versus Empirical Domains

    PubMed Central

    Jason, Leonard A.; Kot, Bobby; Sunnquist, Madison; Brown, Abigail; Reed, Jordan; Furst, Jacob; Newton, Julia L.; Strand, Elin Bolle; Vernon, Suzanne D.

    2015-01-01

    Background Since the publication of the CFS case definition [1], a number of other criteria have been proposed, including the Canadian Consensus Criteria [2] and the Myalgic Encephalomyelitis: International Consensus Criteria [3]. Purpose The current study compared these domains that were developed through consensus methods to one obtained through more empirical approaches using factor analysis. Methods Using data mining, we compared and contrasted fundamental features of consensus-based criteria versus empirical latent factors. In general, these approaches found the domain of Fatigue/Post-exertional malaise as best differentiating patients from controls. Results Findings indicated that the Fukuda et al. criteria had the worst sensitivity and specificity. Conclusions These outcomes might help both theorists and researchers better determine which fundamental domains should be used for the case definition. PMID:26977374

  19. Falsification Testing of Instrumental Variables Methods for Comparative Effectiveness Research.

    PubMed

    Pizer, Steven D

    2016-04-01

    To demonstrate how falsification tests can be used to evaluate instrumental variables methods applicable to a wide variety of comparative effectiveness research questions. Brief conceptual review of instrumental variables and falsification testing principles and techniques accompanied by an empirical application. Sample STATA code related to the empirical application is provided in the Appendix. Comparative long-term risks of sulfonylureas and thiazolidinediones for management of type 2 diabetes. Outcomes include mortality and hospitalization for an ambulatory care-sensitive condition. Prescribing pattern variations are used as instrumental variables. Falsification testing is an easily computed and powerful way to evaluate the validity of the key assumption underlying instrumental variables analysis. If falsification tests are used, instrumental variables techniques can help answer a multitude of important clinical questions. © Health Research and Educational Trust.
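
    The appendix code referenced in this record is STATA; as a rough NumPy sketch of the two-stage least squares estimator underlying such analyses (the instrument z stands in for a prescribing-pattern measure, and all data are simulated placeholders):

      import numpy as np

      def two_stage_least_squares(y, d, z):
          # Stage 1: regress treatment d on instrument z; Stage 2: regress outcome y on fitted d.
          Z = np.column_stack([np.ones_like(z), z])
          d_hat = Z @ np.linalg.lstsq(Z, d, rcond=None)[0]
          X_hat = np.column_stack([np.ones_like(d_hat), d_hat])
          beta = np.linalg.lstsq(X_hat, y, rcond=None)[0]
          return beta[1]                     # estimated treatment effect

      rng = np.random.default_rng(0)
      z = rng.normal(size=500)               # instrument (e.g., local prescribing pattern)
      u = rng.normal(size=500)               # unobserved confounder
      d = ((z + u + rng.normal(size=500)) > 0).astype(float)    # treatment received
      y = 0.5 * d + u + rng.normal(size=500)                    # outcome
      print(two_stage_least_squares(y, d, z))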

  20. LANDSAT-4 multispectral scanner (MSS) subsystem radiometric characterization

    NASA Technical Reports Server (NTRS)

    Alford, W. (Editor); Barker, J. (Editor); Clark, B. P.; Dasgupta, R.

    1983-01-01

    The multispectral scanner (MSS) and its spectral characteristics are described, and methods are given for relating video digital levels on computer-compatible tapes to radiance into the sensor. Topics covered include prelaunch calibration procedures and postlaunch radiometric processing. Examples of current data resident on the MSS image processing system are included. The MSS on LANDSAT 4 is compared with the scanners on earlier LANDSAT satellites.
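
    The relation between digital levels and radiance is, per band, a linear rescaling; a minimal sketch with placeholder calibration values (not the actual LANDSAT-4 MSS coefficients) is:

      def dn_to_radiance(dn, l_min, l_max, dn_max):
          # Linear rescaling of a quantized video digital number to at-sensor radiance.
          return l_min + (l_max - l_min) * dn / dn_max

      # Placeholder band calibration, for illustration only.
      print(dn_to_radiance(dn=40, l_min=0.0, l_max=2.48, dn_max=63))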

  1. Use of the landmark method to address immortal person-time bias in comparative effectiveness research: a simulation study.

    PubMed

    Mi, Xiaojuan; Hammill, Bradley G; Curtis, Lesley H; Lai, Edward Chia-Cheng; Setoguchi, Soko

    2016-11-20

    Observational comparative effectiveness and safety studies are often subject to immortal person-time, a period of follow-up during which outcomes cannot occur because of the treatment definition. Common approaches, like excluding immortal time from the analysis or naïvely including immortal time in the analysis, are known to result in biased estimates of treatment effect. Other approaches, such as the Mantel-Byar and landmark methods, have been proposed to handle immortal time. Little is known about the performance of the landmark method in different scenarios. We conducted extensive Monte Carlo simulations to assess the performance of the landmark method compared with other methods in settings that reflect realistic scenarios. We considered four landmark times for the landmark method. We found that the Mantel-Byar method provided unbiased estimates in all scenarios, whereas the exclusion and naïve methods resulted in substantial bias when the hazard of the event was constant or decreased over time. The landmark method performed well in correcting immortal person-time bias in all scenarios when the treatment effect was small, and provided unbiased estimates when there was no treatment effect. The bias associated with the landmark method tended to be small when the treatment rate was higher in the early follow-up period than it was later. These findings were confirmed in a case study of chronic obstructive pulmonary disease. Copyright © 2016 John Wiley & Sons, Ltd.
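
    A minimal sketch of the landmark approach described here, assuming a pandas data frame with hypothetical columns time, event, and treat_time (treatment start, NaN if never treated) and using lifelines for the Cox fit:

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      def landmark_cox(df, landmark):
          # Keep only subjects still at risk at the landmark, classify exposure by whether
          # treatment started before the landmark, then fit a Cox model from the landmark on.
          at_risk = df[df["time"] > landmark].copy()
          at_risk["treated"] = (at_risk["treat_time"] <= landmark).astype(int)
          at_risk["t_from_landmark"] = at_risk["time"] - landmark
          cph = CoxPHFitter()
          cph.fit(at_risk[["t_from_landmark", "event", "treated"]],
                  duration_col="t_from_landmark", event_col="event")
          return cph.hazard_ratios_["treated"]

      # Hypothetical cohort: follow-up time, event flag, treatment start time (NaN = never treated).
      cohort = pd.DataFrame({
          "time":       [1.0, 3.5, 6.0, 8.0, 2.5, 7.5, 9.0, 4.0],
          "event":      [1, 1, 0, 1, 1, 0, 0, 1],
          "treat_time": [np.nan, 1.0, 2.0, 0.5, np.nan, 1.5, np.nan, 3.0],
      })
      print(landmark_cox(cohort, landmark=2.0))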

  2. A comparative evaluation of the effect of internet-based CME delivery format on satisfaction, knowledge and confidence

    PubMed Central

    2010-01-01

    Background Internet-based instruction in continuing medical education (CME) has been associated with favorable outcomes. However, more direct comparative studies of different Internet-based interventions, instructional methods, presentation formats, and approaches to implementation are needed. The purpose of this study was to conduct a comparative evaluation of two Internet-based CME delivery formats and the effect on satisfaction, knowledge and confidence outcomes. Methods Evaluative outcomes of two differing formats of an Internet-based CME course with identical subject matter were compared. A Scheduled Group Learning format involved case-based asynchronous discussions with peers and a facilitator over a scheduled 3-week delivery period. An eCME On Demand format did not include facilitated discussion and was not based on a schedule; participants could start and finish at any time. A retrospective, pre-post evaluation study design comparing identical satisfaction, knowledge and confidence outcome measures was conducted. Results Participants in the Scheduled Group Learning format reported significantly higher mean satisfaction ratings in some areas, performed significantly higher on a post-knowledge assessment and reported significantly higher post-confidence scores than participants in the eCME On Demand format that was not scheduled and did not include facilitated discussion activity. Conclusions The findings support the instructional benefits of a scheduled delivery format and facilitated asynchronous discussion in Internet-based CME. PMID:20113493

  3. Inclusive breakup calculations in angular momentum basis: Application to 7Li+58Ni

    NASA Astrophysics Data System (ADS)

    Lei, Jin

    2018-03-01

    The angular momentum basis method is introduced to solve the inclusive breakup problem within the model proposed by Ichimura, Austern, and Vincent [Phys. Rev. C 32, 431 (1985), 10.1103/PhysRevC.32.431]. This method is based on the geometric transformation between different Jacobi coordinates, in which the particle spins can be included in a natural and efficient way. To test the validity of this partial wave expansion method, a benchmark calculation is done and compared with the one given by Lei and Moro [Phys. Rev. C 92, 044616 (2015), 10.1103/PhysRevC.92.044616]. In addition, using the distorted-wave Born approximation version of the IAV model, applications to 7Li+58Ni reactions at energies around the Coulomb barrier are presented and compared with available data.

  4. Receiver operating characteristic (ROC) curves: review of methods with applications in diagnostic medicine

    NASA Astrophysics Data System (ADS)

    Obuchowski, Nancy A.; Bullen, Jennifer A.

    2018-04-01

    Receiver operating characteristic (ROC) analysis is a tool used to describe the discrimination accuracy of a diagnostic test or prediction model. While sensitivity and specificity are the basic metrics of accuracy, they have many limitations when characterizing test accuracy, particularly when comparing the accuracies of competing tests. In this article we review the basic study design features of ROC studies, illustrate sample size calculations, present statistical methods for measuring and comparing accuracy, and highlight commonly used ROC software. We include descriptions of multi-reader ROC study design and analysis, address frequently seen problems of verification and location bias, discuss clustered data, and provide strategies for testing endpoints in ROC studies. The methods are illustrated with a study of transmission ultrasound for diagnosing breast lesions.
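
    A minimal scikit-learn sketch of the core computation reviewed here, estimating ROC curves and AUCs for two hypothetical tests scored on the same simulated subjects (a formal paired comparison would additionally require a method such as DeLong's test, not shown):

      import numpy as np
      from sklearn.metrics import roc_auc_score, roc_curve

      rng = np.random.default_rng(1)
      truth = rng.integers(0, 2, size=200)               # 1 = diseased, 0 = non-diseased
      test_a = truth + rng.normal(scale=1.0, size=200)   # hypothetical test scores
      test_b = truth + rng.normal(scale=1.5, size=200)

      for name, scores in (("test A", test_a), ("test B", test_b)):
          fpr, tpr, _ = roc_curve(truth, scores)         # operating points for plotting
          print(name, "AUC =", round(roc_auc_score(truth, scores), 3))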

  5. Research of an optimization design method of integral imaging three-dimensional display system

    NASA Astrophysics Data System (ADS)

    Gao, Hui; Yan, Zhiqiang; Wen, Jun; Jiang, Guanwu

    2016-03-01

    Information warfare requires a highly transparent battlefield environment, so true three-dimensional display technology has obvious advantages over traditional display technology in the current field of military science and technology. This paper summarizes the principle, characteristics and development history of integral imaging, reviews the research progress of lens array imaging technology, and focuses on the factors that restrict the development of integral imaging, mainly low spatial resolution, narrow depth range and small viewing angle. A variety of methods for improving resolution, extending depth of field, increasing viewing angle and eliminating artifacts are compared and analyzed with respect to these problems, and the experimental results of the research are discussed by comparing the display performance of the different methods.

  6. Electronic Versus Manual Data Processing: Evaluating the Use of Electronic Health Records in Out-of-Hospital Clinical Research

    PubMed Central

    Newgard, Craig D.; Zive, Dana; Jui, Jonathan; Weathers, Cody; Daya, Mohamud

    2011-01-01

    Objectives To compare case ascertainment, agreement, validity, and missing values for clinical research data obtained, processed, and linked electronically from electronic health records (EHR), compared to “manual” data processing and record abstraction in a cohort of out-of-hospital trauma patients. Methods This was a secondary analysis of two sets of data collected for a prospective, population-based, out-of-hospital trauma cohort evaluated by 10 emergency medical services (EMS) agencies transporting to 16 hospitals, from January 1, 2006 through October 2, 2007. Eighteen clinical, operational, procedural, and outcome variables were collected and processed separately and independently using two parallel data processing strategies, by personnel blinded to patients in the other group. The electronic approach included electronic health record data exports from EMS agencies, reformatting and probabilistic linkage to outcomes from local trauma registries and state discharge databases. The manual data processing approach included chart matching, data abstraction, and data entry by a trained abstractor. Descriptive statistics, measures of agreement, and validity were used to compare the two approaches to data processing. Results During the 21-month period, 418 patients underwent both data processing methods and formed the primary cohort. Agreement was good to excellent (kappa 0.76 to 0.97; intraclass correlation coefficient 0.49 to 0.97), with exact agreement in 67% to 99% of cases, and a median difference of zero for all continuous and ordinal variables. The proportions of missing out-of-hospital values were similar between the two approaches, although electronic processing generated more missing outcomes (87 out of 418, 21%, 95% CI = 17% to 25%) than the manual approach (11 out of 418, 3%, 95% CI = 1% to 5%). Case ascertainment of eligible injured patients was greater using electronic methods (n = 3,008) compared to manual methods (n = 629). Conclusions In this sample of out-of-hospital trauma patients, an all-electronic data processing strategy identified more patients and generated values with good agreement and validity compared to traditional data collection and processing methods. PMID:22320373
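
    A minimal sketch of the kind of agreement statistic used in this comparison, applied to a hypothetical binary variable coded by both processing strategies (scikit-learn assumed):

      from sklearn.metrics import cohen_kappa_score

      # Hypothetical codes for the same ten patients from the two processing strategies.
      electronic = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
      manual = [1, 0, 1, 0, 0, 1, 0, 0, 1, 1]

      kappa = cohen_kappa_score(electronic, manual)
      exact = sum(e == m for e, m in zip(electronic, manual)) / len(manual)
      print(f"kappa = {kappa:.2f}, exact agreement = {exact:.0%}")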

  7. Are QALYs based on time trade-off comparable?--A systematic review of TTO methodologies.

    PubMed

    Arnesen, Trude; Trommald, Mari

    2005-01-01

    A wide range of methods is used to elicit quality-of-life weights of different health states to generate 'Quality-adjusted life years' (QALYs). The comparability between different types of health outcomes at a numerical level is the main advantage of using a 'common currency for health' such as the QALY. It has been warned that results of different methods and perspectives should not be directly compared in QALY league tables. But do we know that QALYs are comparable if they are based on the same method and perspective? The Time trade-off (TTO) consists of a hypothetical trade-off between living shorter and living healthier. We performed a literature review of the TTO methodology used to elicit quality-of-life weights for one's own, current health. Fifty-six journal articles, with quality-of-life weights assigned to 102 diagnostic groups, were included. We found extensive differences in how the TTO question was asked. The time frame varied from 1 month to 30 years, and was not reported for one-fourth of the weights. The samples in which the quality-of-life weights were elicited were generally small, with a median size of 53 respondents. Comprehensive inclusion criteria were given for half the diagnostic groups. Co-morbidity was described in less than one-tenth of the groups of respondents. For two-thirds of the quality-of-life weights, there was no discussion of the influence of other factors, such as age, sex, employment and children. The different methodological approaches did not influence the TTO weights in a predictable or clear pattern. Whether or not it is possible to standardise the TTO method and the sampling procedure, and whether or not the TTO will then give valid quality-of-life weights, remains an open question. This review of the TTO elicited on respondents' own behalf shows that limiting cost-utility analysis to include only quality-of-life weights from one method and one perspective is not enough to ensure that QALYs are comparable. Copyright 2004 John Wiley & Sons, Ltd.

  8. Activity of a long-acting echinocandin, CD101, determined using CLSI and EUCAST reference methods, against Candida and Aspergillus spp., including echinocandin- and azole-resistant isolates.

    PubMed

    Pfaller, Michael A; Messer, Shawn A; Rhomberg, Paul R; Jones, Ronald N; Castanheira, Mariana

    2016-10-01

    The objective of this study was to evaluate the in vitro activity of CD101, a novel echinocandin with a long serum elimination half-life, and comparator (anidulafungin and caspofungin) antifungal agents against a collection of Candida and Aspergillus spp. isolates. CD101 and comparator agents were tested against 106 Candida spp. and 67 Aspergillus spp. isolates, including 27 isolates of Candida harbouring fks hotspot mutations and 12 itraconazole non-WT Aspergillus, using CLSI and EUCAST reference susceptibility broth microdilution (BMD) methods. Against WT and fks mutant Candida albicans, Candida glabrata and Candida tropicalis, the activity of CD101 [MIC90 = 0.06, 0.12 and 0.03 mg/L, respectively (CLSI method values)] was comparable to that of anidulafungin (MIC90 = 0.03, 0.12 and 0.03 mg/L, respectively) and caspofungin (MIC90 = 0.12, 0.25 and 0.12 mg/L, respectively). WT Candida krusei isolates were very susceptible to CD101 (MIC = 0.06 mg/L). CD101 activity (MIC50/90 = 1/2 mg/L) was comparable to that of anidulafungin (MIC50/90 = 2/2 mg/L) against Candida parapsilosis. CD101 (MIC mode = 0.06 mg/L for C. glabrata) was 2- to 4-fold more active against fks hotspot mutants than caspofungin (MIC mode = 0.5 mg/L). CD101 was active against Aspergillus fumigatus, Aspergillus terreus, Aspergillus niger and Aspergillus flavus (MEC90 range = ≤0.008-0.03 mg/L). The essential agreement between CLSI and EUCAST methods for CD101 was 92.0%-100.0% among Candida spp. and 95.0%-100.0% among Aspergillus spp. The activity of CD101 is comparable to that of other members of the echinocandin class for the prevention and treatment of serious fungal infections. Similar results for CD101 activity versus Candida and Aspergillus spp. may be obtained with either CLSI or EUCAST BMD methods. © The Author 2016. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy.

  9. The Quality of Methods Reporting in Parasitology Experiments

    PubMed Central

    Flórez-Vargas, Oscar; Bramhall, Michael; Noyes, Harry; Cruickshank, Sheena; Stevens, Robert; Brass, Andy

    2014-01-01

    There is a growing concern both inside and outside the scientific community over the lack of reproducibility of experiments. The depth and detail of reported methods are critical to the reproducibility of findings, but also for making it possible to compare and integrate data from different studies. In this study, we evaluated in detail the methods reporting in a comprehensive set of trypanosomiasis experiments that should enable valid reproduction, integration and comparison of research findings. We evaluated a subset of other parasitic (Leishmania, Toxoplasma, Plasmodium, Trichuris and Schistosoma) and non-parasitic (Mycobacterium) experimental infections in order to compare the quality of method reporting more generally. A systematic review using PubMed (2000–2012) of all publications describing gene expression in cells and animals infected with Trypanosoma spp was undertaken based on PRISMA guidelines; 23 papers were identified and included. We defined a checklist of essential parameters that should be reported and have scored the number of those parameters that are reported for each publication. Bibliometric parameters (impact factor, citations and h-index) were used to look for association between Journal and Author status and the quality of method reporting. Trichuriasis experiments achieved the highest scores and included the only paper to score 100% in all criteria. The mean of scores achieved by Trypanosoma articles through the checklist was 65.5% (range 32–90%). Bibliometric parameters were not correlated with the quality of method reporting (Spearman's rank correlation coefficient <−0.5; p>0.05). Our results indicate that the quality of methods reporting in experimental parasitology is a cause for concern and it has not improved over time, despite there being evidence that most of the assessed parameters do influence the results. We propose that our set of parameters be used as guidelines to improve the quality of the reporting of experimental infection models as a pre-requisite for integrating and comparing sets of data. PMID:25076044

  10. The quality of methods reporting in parasitology experiments.

    PubMed

    Flórez-Vargas, Oscar; Bramhall, Michael; Noyes, Harry; Cruickshank, Sheena; Stevens, Robert; Brass, Andy

    2014-01-01

    There is a growing concern both inside and outside the scientific community over the lack of reproducibility of experiments. The depth and detail of reported methods are critical to the reproducibility of findings, but also for making it possible to compare and integrate data from different studies. In this study, we evaluated in detail the methods reporting in a comprehensive set of trypanosomiasis experiments that should enable valid reproduction, integration and comparison of research findings. We evaluated a subset of other parasitic (Leishmania, Toxoplasma, Plasmodium, Trichuris and Schistosoma) and non-parasitic (Mycobacterium) experimental infections in order to compare the quality of method reporting more generally. A systematic review using PubMed (2000-2012) of all publications describing gene expression in cells and animals infected with Trypanosoma spp was undertaken based on PRISMA guidelines; 23 papers were identified and included. We defined a checklist of essential parameters that should be reported and have scored the number of those parameters that are reported for each publication. Bibliometric parameters (impact factor, citations and h-index) were used to look for association between Journal and Author status and the quality of method reporting. Trichuriasis experiments achieved the highest scores and included the only paper to score 100% in all criteria. The mean of scores achieved by Trypanosoma articles through the checklist was 65.5% (range 32-90%). Bibliometric parameters were not correlated with the quality of method reporting (Spearman's rank correlation coefficient <-0.5; p>0.05). Our results indicate that the quality of methods reporting in experimental parasitology is a cause for concern and it has not improved over time, despite there being evidence that most of the assessed parameters do influence the results. We propose that our set of parameters be used as guidelines to improve the quality of the reporting of experimental infection models as a pre-requisite for integrating and comparing sets of data.

  11. Comparative effectiveness research on patients with acute ischemic stroke using Markov decision processes

    PubMed Central

    2012-01-01

    Background Several methodological issues with non-randomized comparative clinical studies have been raised, one of which is whether the methods used can adequately identify uncertainties that evolve dynamically with time in real-world systems. The objective of this study is to compare the effectiveness of different combinations of Traditional Chinese Medicine (TCM) treatments and combinations of TCM and Western medicine interventions in patients with acute ischemic stroke (AIS) by using Markov decision process (MDP) theory. MDP theory appears to be a promising new method for use in comparative effectiveness research. Methods The electronic health records (EHR) of patients with AIS hospitalized at the 2nd Affiliated Hospital of Guangzhou University of Chinese Medicine between May 2005 and July 2008 were collected. Each record was portioned into two "state-action-reward" stages divided by three time points: the first, third, and last day of hospital stay. We used the well-developed optimality technique in MDP theory with the finite horizon criterion to make the dynamic comparison of different treatment combinations. Results A total of 1504 records with a primary diagnosis of AIS were identified. Only states with more than 10 (including 10) patients' information were included, which gave 960 records to be enrolled in the MDP model. Optimal combinations were obtained for 30 types of patient condition. Conclusion MDP theory makes it possible to dynamically compare the effectiveness of different combinations of treatments. However, the optimal interventions obtained by the MDP theory here require further validation in clinical practice. Further exploratory studies with MDP theory in other areas in which complex interventions are common would be worthwhile. PMID:22400712
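
    The finite-horizon optimality technique referred to above amounts to backward induction over stages; a minimal sketch on a toy MDP (the states, actions, transition probabilities, and rewards are placeholders, not the clinical model) is:

      import numpy as np

      # Hypothetical 3-state, 2-action MDP solved over a 2-stage horizon.
      n_states, n_actions, horizon = 3, 2, 2
      rng = np.random.default_rng(2)
      P = rng.dirichlet(np.ones(n_states), size=(n_actions, n_states))  # P[a, s, s']
      R = np.array([[1.0, 0.2, 0.0],    # reward of action 0 in each state
                    [0.5, 0.8, 0.1]])   # reward of action 1 in each state

      V = np.zeros(n_states)            # terminal value
      policy = []
      for t in reversed(range(horizon)):
          Q = R + P @ V                     # Q[a, s] = immediate reward + expected future value
          policy.append(Q.argmax(axis=0))   # best action per state at stage t
          V = Q.max(axis=0)
      policy.reverse()
      print("optimal action per state, by stage:", policy)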

  12. Evaluation and Comparison of Multiple Test Methods, Including Real-time PCR, for Legionella Detection in Clinical Specimens

    PubMed Central

    Peci, Adriana; Winter, Anne-Luise; Gubbay, Jonathan B.

    2016-01-01

    Legionella is a Gram-negative bacterium that can cause Pontiac fever, a mild upper respiratory infection, and Legionnaires' disease, a more severe illness. We aimed to compare the performance of urine antigen, culture, and polymerase chain reaction (PCR) test methods and to determine if sputum is an acceptable alternative to the use of more invasive bronchoalveolar lavage (BAL). Data for this study included specimens tested for Legionella at Public Health Ontario Laboratories from 1st January, 2010 to 30th April, 2014, as part of routine clinical testing. We found the sensitivity of the urinary antigen test (UAT) compared to culture to be 87%, specificity 94.7%, positive predictive value (PPV) 63.8%, and negative predictive value (NPV) 98.5%. Sensitivity of UAT compared to PCR was 74.7%, specificity 98.3%, PPV 77.7%, and NPV 98.1%. Out of 146 patients who had a Legionella-positive result by PCR, only 66 (45.2%) also had a positive result by culture. Sensitivity for culture was the same using either sputum or BAL (13.6%); sensitivity for PCR was 10.3% for sputum and 12.8% for BAL. Both sputum and BAL yield similar results regardless of testing method (Fisher exact p-values = 1.0 for each test). In summary, all test methods have inherent weaknesses in identifying Legionella; therefore, more than one testing method should be used. Obtaining a single specimen type from patients with pneumonia limits the ability to diagnose Legionella, particularly when urine is the specimen type submitted. Given ease of collection and similar sensitivity to BAL, clinicians are encouraged to submit sputum in addition to urine when BAL submission is not practical from patients being tested for Legionella. PMID:27630979
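
    The performance figures above follow from standard two-by-two-table arithmetic; a minimal sketch with placeholder counts (not the study's actual cell counts) is:

      def diagnostic_metrics(tp, fp, fn, tn):
          # Sensitivity, specificity, PPV and NPV from a 2x2 table against a reference standard.
          return {
              "sensitivity": tp / (tp + fn),
              "specificity": tn / (tn + fp),
              "PPV": tp / (tp + fp),
              "NPV": tn / (tn + fn),
          }

      # Placeholder counts for a urinary antigen test judged against culture.
      print(diagnostic_metrics(tp=60, fp=34, fn=9, tn=610))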

  13. Cleaning Hospital Room Surfaces to Prevent Health Care-Associated Infections: A Technical Brief.

    PubMed

    Han, Jennifer H; Sullivan, Nancy; Leas, Brian F; Pegues, David A; Kaczmarek, Janice L; Umscheid, Craig A

    2015-10-20

    The cleaning of hard surfaces in hospital rooms is critical for reducing health care-associated infections. This review describes the evidence examining current methods of cleaning, disinfecting, and monitoring cleanliness of patient rooms, as well as contextual factors that may affect implementation and effectiveness. Key informants were interviewed, and a systematic search for publications since 1990 was done with the use of several bibliographic and gray literature resources. Studies examining surface contamination, colonization, or infection with Clostridium difficile, methicillin-resistant Staphylococcus aureus, or vancomycin-resistant enterococci were included. Eighty studies were identified: 76 primary studies and 4 systematic reviews. Forty-nine studies examined cleaning methods, 14 evaluated monitoring strategies, and 17 addressed challenges or facilitators to implementation. Only 5 studies were randomized, controlled trials, and surface contamination was the most commonly assessed outcome. Comparative effectiveness studies of disinfecting methods and monitoring strategies were uncommon. Future research should evaluate and compare newly emerging strategies, such as self-disinfecting coatings for disinfecting and adenosine triphosphate and ultraviolet/fluorescent surface markers for monitoring. Studies should also assess patient-centered outcomes, such as infection, when possible. Other challenges include identifying high-touch surfaces that confer the greatest risk for pathogen transmission; developing standard thresholds for defining cleanliness; and using methods to adjust for confounders, such as hand hygiene, when examining the effect of disinfecting methods.

  14. Cleaning Hospital Room Surfaces to Prevent Health Care–Associated Infections

    PubMed Central

    Han, Jennifer H.; Sullivan, Nancy; Leas, Brian F.; Pegues, David A.; Kaczmarek, Janice L.; Umscheid, Craig A.

    2015-01-01

    The cleaning of hard surfaces in hospital rooms is critical for reducing health care–associated infections. This review describes the evidence examining current methods of cleaning, disinfecting, and monitoring cleanliness of patient rooms, as well as contextual factors that may affect implementation and effectiveness. Key informants were interviewed, and a systematic search for publications since 1990 was done with the use of several bibliographic and gray literature resources. Studies examining surface contamination, colonization, or infection with Clostridium difficile, methicillin-resistant Staphylococcus aureus, or vancomycin-resistant enterococci were included. Eighty studies were identified—76 primary studies and 4 systematic reviews. Forty-nine studies examined cleaning methods, 14 evaluated monitoring strategies, and 17 addressed challenges or facilitators to implementation. Only 5 studies were randomized, controlled trials, and surface contamination was the most commonly assessed outcome. Comparative effectiveness studies of disinfecting methods and monitoring strategies were uncommon. Future research should evaluate and compare newly emerging strategies, such as self-disinfecting coatings for disinfecting and adenosine triphosphate and ultraviolet/fluorescent surface markers for monitoring. Studies should also assess patient-centered outcomes, such as infection, when possible. Other challenges include identifying high-touch surfaces that confer the greatest risk for pathogen transmission; developing standard thresholds for defining cleanliness; and using methods to adjust for confounders, such as hand hygiene, when examining the effect of disinfecting methods. PMID:26258903

  15. Evaluation of Yogurt Microstructure Using Confocal Laser Scanning Microscopy and Image Analysis.

    PubMed

    Skytte, Jacob L; Ghita, Ovidiu; Whelan, Paul F; Andersen, Ulf; Møller, Flemming; Dahl, Anders B; Larsen, Rasmus

    2015-06-01

    The microstructure of protein networks in yogurts defines important physical properties of the yogurt and hereby partly its quality. Imaging this protein network using confocal scanning laser microscopy (CSLM) has shown good results, and CSLM has become a standard measuring technique for fermented dairy products. When studying such networks, hundreds of images can be obtained, and here image analysis methods are essential for using the images in statistical analysis. Previously, methods including gray level co-occurrence matrix analysis and fractal analysis have been used with success. However, a range of other image texture characterization methods exists. These methods describe an image by a frequency distribution of predefined image features (denoted textons). Our contribution is an investigation of the choice of image analysis methods by performing a comparative study of 7 major approaches to image texture description. Here, CSLM images from a yogurt fermentation study are investigated, where production factors including fat content, protein content, heat treatment, and incubation temperature are varied. The descriptors are evaluated through nearest neighbor classification, variance analysis, and cluster analysis. Our investigation suggests that the texton-based descriptors provide a fuller description of the images compared to gray-level co-occurrence matrix descriptors and fractal analysis, while still being as applicable and in some cases as easy to tune. © 2015 Institute of Food Technologists®
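
    One of the descriptor families compared in this record, gray-level co-occurrence matrix (GLCM) statistics, can be sketched with scikit-image on a hypothetical grayscale image (the function names assume scikit-image 0.19 or later, where the spelling is graycomatrix):

      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      # Placeholder image standing in for a quantized CSLM micrograph (64 gray levels).
      image = np.random.default_rng(3).integers(0, 64, size=(128, 128), dtype=np.uint8)

      glcm = graycomatrix(image, distances=[1],
                          angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                          levels=64, symmetric=True, normed=True)
      features = {prop: graycoprops(glcm, prop).mean()
                  for prop in ("contrast", "homogeneity", "energy", "correlation")}
      print(features)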

  16. Mating programs including genomic relationships and dominance effects

    USDA-ARS?s Scientific Manuscript database

    Breed associations, artificial-insemination organizations, and on-farm software providers need new computerized mating programs for genomic selection so that genomic inbreeding could be minimized by comparing genotypes of potential mates. Efficient methods for transferring elements of the genomic re...

  17. A coupling method for a cardiovascular simulation model which includes the Kalman filter.

    PubMed

    Hasegawa, Yuki; Shimayoshi, Takao; Amano, Akira; Matsuda, Tetsuya

    2012-01-01

    Multi-scale models of the cardiovascular system provide new insight that was unavailable with in vivo and in vitro experiments. For the cardiovascular system, multi-scale simulations provide a valuable perspective in analyzing the interaction of three phenomena occurring at different spatial scales: circulatory hemodynamics, ventricular structural dynamics, and myocardial excitation-contraction. In order to simulate these interactions, multiscale cardiovascular simulation systems couple models that simulate different phenomena. However, coupling methods require a significant amount of calculation, since a system of non-linear equations must be solved for each timestep. Therefore, we proposed a coupling method which decreases the amount of calculation by using the Kalman filter. In our method, the Kalman filter calculates approximations for the solution to the system of non-linear equations at each timestep. The approximations are then used as initial values for solving the system of non-linear equations. The proposed method decreases the number of iterations required by 94.0% compared to the conventional strong coupling method. When compared with a smoothing spline predictor, the proposed method required 49.4% fewer iterations.
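
    The general scheme, a Kalman prediction supplying the initial guess for the nonlinear solve at each timestep, can be sketched as follows; the scalar coupling equation, noise settings, and constant-velocity state model are placeholders, not the authors' cardiovascular model:

      import numpy as np
      from scipy.optimize import newton

      def coupling_residual(x, t):
          # Hypothetical nonlinear coupling condition g(x, t) = 0 to be solved at time t.
          return x**3 + 0.5 * x - np.sin(0.3 * t)

      # Constant-velocity Kalman filter tracking the coupling variable and its trend.
      F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition
      H = np.array([[1.0, 0.0]])               # only the value itself is observed
      Q, R = 1e-4 * np.eye(2), np.array([[1e-3]])
      state, P = np.zeros(2), np.eye(2)

      for t in range(50):
          state, P = F @ state, F @ P @ F.T + Q     # predict
          x0 = state[0]                             # prediction = Newton initial guess
          x_solved = newton(coupling_residual, x0, args=(t,))
          y = x_solved - (H @ state)[0]             # treat the converged solution as a measurement
          S = H @ P @ H.T + R
          K = (P @ H.T) / S[0, 0]
          state = state + (K * y).ravel()
          P = (np.eye(2) - K @ H) @ P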

  18. An improved panel method for the solution of three-dimensional leading-edge vortex flows. Volume 1: Theory document

    NASA Technical Reports Server (NTRS)

    Johnson, F. T.; Lu, P.; Tinoco, E. N.

    1980-01-01

    An improved panel method for the solution of three-dimensional flow about wing and wing-body combinations with leading-edge vortex separation is presented. The method employs a three-dimensional inviscid flow model in which the configuration, the rolled-up vortex sheets, and the wake are represented by quadratic doublet distributions. The strength of the singularity distribution as well as the shape and position of the vortex spirals are computed in an iterative fashion starting with an assumed initial sheet geometry. The method calculates forces and moments as well as detailed surface pressure distributions. Improvements include the implementation of improved panel numerics for the purpose of eliminating the highly nonlinear effects of ring vortices around double panel edges, and the development of a least squares procedure for damping vortex sheet geometry update instabilities. A complete description of the method is included. A variety of cases generated by the computer program implementing the method are presented which verify the mathematical assumptions of the method and which compare computed results with experimental data to verify the underlying physical assumptions made by the method.

  19. Efficacy and Safety of Long-Acting Reversible Contraception

    PubMed Central

    Stoddard, Amy; McNicholas, Colleen; Peipert, Jeffrey F.

    2013-01-01

    Long-acting reversible contraception (LARC) includes intrauterine devices (IUDs) and the subdermal implant. These methods are the most effective reversible methods of contraception, and have the additional advantages of being long-lasting, convenient, well liked by users and cost-effective. Compared with other user-dependent methods that increase the risk of noncompliance-related method failure, LARC methods can bring ‘typical use’ failure rates more in line with ‘perfect use’ failure rates. LARC methods are ‘forgettable’; they are not dependent on compliance with a pill-taking regimen, remembering to change a patch or ring, or coming back to the clinician for an injection. LARC method failure rates rival those of tubal sterilization at <1% for IUDs and the subdermal implant. For these reasons, we believe that IUDs and implants should be offered as first-line contraception for most women. This article provides a review of the LARC methods that are currently available in the US, including their effectiveness, advantages, disadvantages and contraindications. Additionally, we dispel myths and misconceptions regarding IUDs, and address the barriers to LARC use. PMID:21668037

  20. Ensemble Methods for Classification of Physical Activities from Wrist Accelerometry.

    PubMed

    Chowdhury, Alok Kumar; Tjondronegoro, Dian; Chandran, Vinod; Trost, Stewart G

    2017-09-01

    To investigate whether the use of ensemble learning algorithms improve physical activity recognition accuracy compared to the single classifier algorithms, and to compare the classification accuracy achieved by three conventional ensemble machine learning methods (bagging, boosting, random forest) and a custom ensemble model comprising four algorithms commonly used for activity recognition (binary decision tree, k nearest neighbor, support vector machine, and neural network). The study used three independent data sets that included wrist-worn accelerometer data. For each data set, a four-step classification framework consisting of data preprocessing, feature extraction, normalization and feature selection, and classifier training and testing was implemented. For the custom ensemble, decisions from the single classifiers were aggregated using three decision fusion methods: weighted majority vote, naïve Bayes combination, and behavior knowledge space combination. Classifiers were cross-validated using leave-one subject out cross-validation and compared on the basis of average F1 scores. In all three data sets, ensemble learning methods consistently outperformed the individual classifiers. Among the conventional ensemble methods, random forest models provided consistently high activity recognition; however, the custom ensemble model using weighted majority voting demonstrated the highest classification accuracy in two of the three data sets. Combining multiple individual classifiers using conventional or custom ensemble learning methods can improve activity recognition accuracy from wrist-worn accelerometer data.
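
    A minimal scikit-learn sketch of the weighted-majority-vote fusion over the four base classifiers named above (the simulated features, class count, and vote weights are placeholders; the paper's accelerometer feature pipeline is not reproduced):

      from sklearn.datasets import make_classification
      from sklearn.ensemble import VotingClassifier
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.neural_network import MLPClassifier
      from sklearn.svm import SVC
      from sklearn.tree import DecisionTreeClassifier

      # Simulated stand-in for extracted accelerometer features and activity labels.
      X, y = make_classification(n_samples=300, n_features=20, n_informative=8,
                                 n_classes=4, random_state=0)

      fused = VotingClassifier(
          estimators=[("tree", DecisionTreeClassifier()),
                      ("knn", KNeighborsClassifier()),
                      ("svm", SVC()),
                      ("mlp", MLPClassifier(max_iter=2000))],
          voting="hard",
          weights=[1, 1, 2, 2],   # hypothetical per-classifier weights
      )
      fused.fit(X, y)
      print("training accuracy:", fused.score(X, y))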

  1. Quality evaluation of cook-chilled chicory stems (Cichorium intybus L., Catalogna group) by conventional and sous vide cooking methods.

    PubMed

    Renna, Massimiliano; Gonnella, Maria; Giannino, Donato; Santamaria, Pietro

    2014-03-15

    Chicory stems, appreciated both raw and cooked, represent a nutritious and refined food. In this study the effects on the quality of stems cooked by conventional (boiling, steaming and microwaving) and innovative (sous vide) methods were analysed. Several physical, chemical and sensory traits were compared using two local varieties (Galatina and Molfettese) of southern Italy (Puglia region). Independently of the variety, the sous vide method did not significantly affect (redness, yellowness and hue angle) or had the least impact on (lightness and total colour difference) quality parameters among the four methods as compared with the raw product. Following sensory analysis, the sous vide product always showed the highest score among the cooking methods. Moreover, this innovative method did not affect total phenol (TP) content and antioxidant activity (AA) compared with uncooked stems of both varieties. Microwaving increased TP content and AA (though associated with higher weight loss), while different responses depending on the chicory variety were observed after boiling and steaming. The results indicate the sous vide technique as optimal to preserve several traits, including organoleptic ones, for the quality of cook-chilled chicory stems. They also provide product-specific information usually required for cooking process strategies in the industrial sector of ready-to-eat vegetables. © 2013 Society of Chemical Industry.

  2. Sample preparation methods for scanning electron microscopy of homogenized Al-Mg-Si billets: A comparative study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Österreicher, Johannes Albert; Kumar, Manoj

    Characterization of Mg-Si precipitates is crucial for optimizing the homogenization heat treatment of Al-Mg-Si alloys. Although sample preparation is key for high quality scanning electron microscopy imaging, most common methods lead to dealloying of Mg-Si precipitates. In this article we systematically evaluate different sample preparation methods: mechanical polishing, etching with various reagents, and electropolishing using different electrolytes. We demonstrate that the use of a nitric acid and methanol electrolyte for electropolishing a homogenized Al-Mg-Si alloy prevents the dissolution of Mg-Si precipitates, resulting in micrographs of higher quality. This preparation method is investigated in depth and the obtained scanning electron microscopy images are compared with transmission electron micrographs: the shape and size of Mg-Si precipitates appear very similar in either method. The scanning electron micrographs allow proper identification and measurement of the Mg-Si phases including needles with lengths of roughly 200 nm. These needles are β″ precipitates as confirmed by high resolution transmission electron microscopy. - Highlights: •Secondary precipitation in homogenized 6xxx Al alloys is crucial for extrudability. •Existing sample preparation methods for SEM are improvable. •Electropolishing with nitric acid/methanol yields superior quality in SEM. •The obtained micrographs are compared to TEM micrographs.

  3. Assessing Interval Estimation Methods for Hill Model ...

    EPA Pesticide Factsheets

    The Hill model of concentration-response is ubiquitous in toxicology, perhaps because its parameters directly relate to biologically significant metrics of toxicity such as efficacy and potency. Point estimates of these parameters obtained through least squares regression or maximum likelihood are commonly used in high-throughput risk assessment, but such estimates typically fail to include reliable information concerning confidence in (or precision of) the estimates. To address this issue, we examined methods for assessing uncertainty in Hill model parameter estimates derived from concentration-response data. In particular, using a sample of ToxCast concentration-response data sets, we applied four methods for obtaining interval estimates that are based on asymptotic theory, bootstrapping (two varieties), and Bayesian parameter estimation, and then compared the results. These interval estimation methods generally did not agree, so we devised a simulation study to assess their relative performance. We generated simulated data by constructing four statistical error models capable of producing concentration-response data sets comparable to those observed in ToxCast. We then applied the four interval estimation methods to the simulated data and compared the actual coverage of the interval estimates to the nominal coverage (e.g., 95%) in order to quantify performance of each of the methods in a variety of cases (i.e., different values of the true Hill model paramet
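
    One of the interval approaches examined here, a nonparametric bootstrap around least-squares Hill fits, can be sketched as follows (the 3-parameter Hill form, simulated data, starting values, and bounds are illustrative assumptions):

      import numpy as np
      from scipy.optimize import curve_fit

      def hill(conc, top, ac50, n):
          # 3-parameter Hill model: response rising from 0 toward `top`.
          return top * conc**n / (ac50**n + conc**n)

      rng = np.random.default_rng(0)
      conc = np.logspace(-2, 2, 9)
      resp = hill(conc, 80.0, 1.5, 1.2) + rng.normal(scale=5.0, size=conc.size)

      boot = []
      while len(boot) < 500:
          idx = rng.integers(0, conc.size, conc.size)      # resample (conc, resp) pairs
          try:
              p, _ = curve_fit(hill, conc[idx], resp[idx], p0=[80.0, 1.0, 1.0],
                               bounds=(0.0, [200.0, 100.0, 10.0]))
          except RuntimeError:
              continue                                     # skip non-converged resamples
          boot.append(p)

      lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
      print("95% bootstrap intervals (top, AC50, n):", list(zip(lo.round(2), hi.round(2))))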

  4. A Model for Understanding the Genetic Basis for Disparity in Prostate Cancer Risk

    DTIC Science & Technology

    2017-10-01

    times greater compared with European American men. The reasons for this disparity are not completely understood. Current tools in hand to study these...from iPSC of Caucasian and African-American foreskin fibroblasts and 3) compare and establish methods to transform differentiated prostate epithelial...NOT include the italicized descriptions of section contents in your submitted reports. 1. INTRODUCTION: Prostate cancer is the most commonly diagnosed

  5. GENOMIC DIVERSITY AND THE MICROENVIRONMENT AS DRIVERS OF PROGRESSION IN DCIS

    DTIC Science & Technology

    2017-10-01

    stains, including quantitative analysis, 7) Identification of upstaged DCIS cases for the radiology aim, 8) Development of image analysis methods for...goals of the project? Aim 1. Determine whether genetic diversity of DCIS is greater in DCIS with adjacent invasive disease compared to DCIS without... compared to DCIS without IDC. Since genomics is not the sole driver of tumor behavior, we will phenotypically characterize DCIS and its

  6. Building the Evidence Base for Decision-making in Cancer Genomic Medicine Using Comparative Effectiveness Research

    PubMed Central

    Goddard, Katrina A.B.; Knaus, William A.; Whitlock, Evelyn; Lyman, Gary H.; Feigelson, Heather Spencer; Schully, Sheri D.; Ramsey, Scott; Tunis, Sean; Freedman, Andrew N.; Khoury, Muin J.; Veenstra, David L.

    2013-01-01

    Background The clinical utility is uncertain for many cancer genomic applications. Comparative effectiveness research (CER) can provide evidence to clarify this uncertainty. Objectives To identify approaches to help stakeholders make evidence-based decisions, and to describe potential challenges and opportunities using CER to produce evidence-based guidance. Methods We identified general CER approaches for genomic applications through literature review, the authors’ experiences, and lessons learned from a recent, seven-site CER initiative in cancer genomic medicine. Case studies illustrate the use of CER approaches. Results Evidence generation and synthesis approaches include comparative observational and randomized trials, patient reported outcomes, decision modeling, and economic analysis. We identified significant challenges to conducting CER in cancer genomics: the rapid pace of innovation, the lack of regulation, the limited evidence for clinical utility, and the beliefs that genomic tests could have personal utility without having clinical utility. Opportunities to capitalize on CER methods in cancer genomics include improvements in the conduct of evidence synthesis, stakeholder engagement, increasing the number of comparative studies, and developing approaches to inform clinical guidelines and research prioritization. Conclusions CER offers a variety of methodological approaches to address stakeholders’ needs. Innovative approaches are needed to ensure an effective translation of genomic discoveries. PMID:22516979

  7. Effects of Environmental Toxicants on Metabolic Activity of Natural Microbial Communities

    PubMed Central

    Barnhart, Carole L. H.; Vestal, J. Robie

    1983-01-01

    Two methods of measuring microbial activity were used to study the effects of toxicants on natural microbial communities. The methods were compared for suitability for toxicity testing, sensitivity, and adaptability to field applications. This study included measurements of the incorporation of 14C-labeled acetate into microbial lipids and microbial glucosidase activity. Activities were measured per unit biomass, determined as lipid phosphate. The effects of various organic and inorganic toxicants on various natural microbial communities were studied. Both methods were useful in detecting toxicity, and their comparative sensitivities varied with the system studied. In one system, the methods showed approximately the same sensitivities in testing the effects of metals, but the acetate incorporation method was more sensitive in detecting the toxicity of organic compounds. The incorporation method was used to study the effects of a point source of pollution on the microbiota of a receiving stream. Toxic doses were found to be two orders of magnitude higher in sediments than in water taken from the same site, indicating chelation or adsorption of the toxicant by the sediment. The microbiota taken from below a point source outfall was 2 to 100 times more resistant to the toxicants tested than was that taken from above the outfall. Downstream filtrates in most cases had an inhibitory effect on the natural microbiota taken from above the pollution source. The microbial methods were compared with commonly used bioassay methods, using higher organisms, and were found to be similar in ability to detect comparative toxicities of compounds, but were less sensitive than methods which use standard media because of the influences of environmental factors. PMID:16346432

  8. Feature-level sentiment analysis by using comparative domain corpora

    NASA Astrophysics Data System (ADS)

    Quan, Changqin; Ren, Fuji

    2016-06-01

    Feature-level sentiment analysis (SA) is able to provide more fine-grained SA on certain opinion targets and has a wider range of applications on E-business. This study proposes an approach based on comparative domain corpora for feature-level SA. The proposed approach makes use of word associations for domain-specific feature extraction. First, we assign a similarity score for each candidate feature to denote its similarity extent to a domain. Then we identify domain features based on their similarity scores on different comparative domain corpora. After that, dependency grammar and a general sentiment lexicon are applied to extract and expand feature-oriented opinion words. Lastly, the semantic orientation of a domain-specific feature is determined based on the feature-oriented opinion lexicons. In evaluation, we compare the proposed method with several state-of-the-art methods (including unsupervised and semi-supervised) using a standard product review test collection. The experimental results demonstrate the effectiveness of using comparative domain corpora.
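
    The record above outlines the pipeline (domain-feature scoring against comparative corpora, then polarity from nearby opinion words). The sketch below is only an illustrative approximation under stated assumptions: it uses relative document frequency as the domain-similarity score and a window-based lexicon lookup for polarity, not the authors' exact word-association measure; the corpora, lexicon, and function names are invented.

```python
# Illustrative sketch (not the authors' exact scoring): rank candidate features by
# how much more strongly they associate with the target domain corpus than with
# comparative domain corpora, then assign a polarity from nearby opinion words.
def doc_freq(corpus, term):
    """Fraction of documents in `corpus` (list of token lists) containing `term`."""
    return sum(term in doc for doc in corpus) / max(len(corpus), 1)

def domain_similarity(term, target_corpus, comparative_corpora):
    """Score = frequency in the target domain relative to comparative domains."""
    target = doc_freq(target_corpus, term)
    others = max(doc_freq(c, term) for c in comparative_corpora) + 1e-9
    return target / others

def feature_polarity(feature, target_corpus, lexicon, window=3):
    """Sum lexicon polarities of opinion words occurring near the feature term."""
    score = 0
    for doc in target_corpus:
        for i, tok in enumerate(doc):
            if tok == feature:
                for w in doc[max(0, i - window): i + window + 1]:
                    score += lexicon.get(w, 0)
    return score

# Toy data: token lists standing in for a camera-review domain vs. a hotel-review domain.
camera = [["battery", "life", "is", "great"], ["poor", "battery", "charging"]]
hotel = [["room", "was", "clean"], ["battery", "died", "in", "remote"]]
lexicon = {"great": 1, "poor": -1, "clean": 1}

for cand in ["battery", "room"]:
    sim = domain_similarity(cand, camera, [hotel])
    print(cand, round(sim, 2), feature_polarity(cand, camera, lexicon))
```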

  9. [Problem-based learning, a comparison in the acquisition of transversal competencies].

    PubMed

    González Pascual, Juan Luis; López Martin, Inmaculada; Toledo Gómez, David

    2009-01-01

    In the European Higher Education Area (EEES, by its Spanish acronym), a change in the pedagogical model has occurred: from teaching centered on the figure of the professor to learning centered on students, from an integral perspective. This learning must bring together the full set of competencies included in the program requirements necessary to obtain a degree. The specific competencies characterize a profession and distinguish one from others. The transversal competencies surpass the limits of one particular discipline and can potentially be developed in all; they are subdivided into three types: instrumental, interpersonal, and systemic. The authors describe and compare the acquisition of transversal competencies associated with student portfolios and Problem-based Learning as pedagogical methods, from the perspective of second-year nursing students at the European University in Madrid during the 2007-8 academic year. To do so, the authors carried out a transversal descriptive study; data were collected with a purpose-made questionnaire developed by the authors and based on the transversal competencies of the Tuning Nursing Project. Variables included age, sex, pedagogical method, and perceived acquisition of the 24 competencies measured on a Likert scale; descriptive and analytical statistics included the Mann-Whitney U test. The authors conclude that the portfolio and Problem-based Learning are useful pedagogical methods for acquiring transversal competencies; these results coincide with those of other studies. Comparing both methods, the authors share the opinion that the Problem-based Learning method could stimulate the search for information better than the portfolio method.

  10. Validation of cone-beam computed tomography and magnetic resonance imaging of the porcine spine: a comparative study with multidetector computed tomography and anatomical specimens.

    PubMed

    de Freitas, Ricardo Miguel Costa; Andrade, Celi Santos; Caldas, José Guilherme Mendes Pereira; Kanas, Alexandre Fligelman; Cabral, Richard Halti; Tsunemi, Miriam Harumi; Rodríguez, Hernán Joel Cervantes; Rabbani, Said Rahnamaye

    2015-05-01

    New spinal interventions or implants have been tested on ex vivo or in vivo porcine spines, as they are readily available and have been accepted as a comparable model to human cadaver spines. Imaging-guided interventional procedures of the spine are mostly based on fluoroscopy or, still, on multidetector computed tomography (MDCT). Cone-beam computed tomography (CBCT) and magnetic resonance imaging (MRI) are also available methods to guide interventional procedures. Although some MDCT data from porcine spines are available in the literature, validation of the measurements on CBCT and MRI is lacking. To describe and compare the anatomical measurements accomplished with MDCT, CBCT, and MRI of lumbar porcine spines to determine if CBCT and MRI are also useful methods for experimental studies. An experimental descriptive-comparative study. Sixteen anatomical measurements of an individual vertebra from six lumbar porcine spines (n=36 vertebrae) were compared with their MDCT, CBCT, and MRI equivalents. Comparisons were made for the absolute values of the parameters. Similarities were found in all imaging methods. Significant correlation (p<.05) was observed with all variables except those that included cartilaginous tissue from the end plates when the anatomical study was compared with the imaging methods. The CBCT and MRI provided imaging measurements of the lumbar porcine spines that were similar to the anatomical and MDCT data, and they can be useful for specific experimental research studies. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Survival and growth of seed trees 20 years after a natural regeneration cut in the piedmont of Georgia

    Treesearch

    Stephen R. Logan; M. Boyd Edwards; Barry D. Shiver

    2005-01-01

    An experiment was installed in 1982 to compare six methods of natural regeneration in the Piedmont of Georgia. These methods include (1) clearcut with seed in place; (2) clearcut with seed in place and preharvest burn; (3) seed tree; (4) seed tree with preharvest burn; (5) shelterwood; and (6) shelterwood with preharvest burn. Because of endangered species regulations...

  12. Personalized Medicine in Veterans with Traumatic Brain Injuries

    DTIC Science & Technology

    2013-05-01

    Pair-Group Method using Arithmetic averages (UPGMA) based on cosine correlation of row mean centered log2 signal values; this was the top 50%-tile...clustering was performed by the UPGMA method using cosine correlation as the similarity metric. For comparative purposes, clustered heat maps included...non-mTBI cases were subjected to unsupervised hierarchical clustering analysis using the UPGMA algorithm with cosine correlation as the similarity
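
    The snippet above describes UPGMA clustering of row-mean-centered log2 signal values with a cosine similarity metric. A minimal SciPy sketch of that combination is shown below; the expression matrix is synthetic, and cosine distance is used as a stand-in for "cosine correlation".

```python
# Minimal sketch: UPGMA (average-linkage) hierarchical clustering of
# row-mean-centered log2 expression values using a cosine distance metric.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

rng = np.random.default_rng(0)
log2_signal = rng.normal(8, 2, size=(20, 6))                       # 20 probes x 6 samples (synthetic)
centered = log2_signal - log2_signal.mean(axis=1, keepdims=True)   # row mean centering

# Cluster the samples (columns) by the distance between their column profiles.
dist = pdist(centered.T, metric="cosine")
Z = linkage(dist, method="average")                                # "average" linkage == UPGMA
print(dendrogram(Z, no_plot=True)["leaves"])                       # sample ordering from the tree
```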

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seth, Arpan; Klise, Katherine A.; Siirola, John D.

    In the event of contamination in a water distribution network (WDN), source identification (SI) methods that analyze sensor data can be used to identify the source location(s). Knowledge of the source location and characteristics is important to inform contamination control and cleanup operations. Various SI strategies that have been developed by researchers differ in their underlying assumptions and solution techniques. The following manuscript presents a systematic procedure for testing and evaluating SI methods. The performance of these SI methods is affected by various factors including the size of the WDN model, measurement error, modeling error, time and number of contaminant injections, and time and number of measurements. This paper includes test cases that vary these factors and evaluates three SI methods on the basis of accuracy and specificity. The tests are used to review and compare these different SI methods, highlighting their strengths in handling various identification scenarios. These SI methods and a testing framework that includes the test cases and analysis tools presented in this paper have been integrated into EPA’s Water Security Toolkit (WST), a suite of software tools to help researchers and others in the water industry evaluate and plan various response strategies in case of a contamination incident. Lastly, a set of recommendations is made for users to consider when working with different categories of SI methods.

  14. Comparative evaluation of different methods for calculation of cerebral blood flow (CBF) in nonanesthetized rabbits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Angelini, G.; Lanza, E.; Rozza Dionigi, A.

    1983-05-01

    The measurement of cerebral blood flow (CBF) by the extracranial detection of the radioactivity of 133Xe injected into an internal carotid artery has proved to be of considerable value for the investigation of cerebral circulation in conscious rabbits. Methods are described for calculating CBF from the 133Xe clearance curves, and include exponential analysis (two-component model), the initial slope method, and the stochastic method. The different methods of curve analysis were compared in order to evaluate their fit with the theoretical model. The initial slope and stochastic methods, compared with the biexponential model, underestimate the CBF by 35% and 46%, respectively. Furthermore, the validity of recording the clearance curve for 10 min was tested by comparing these CBF values with those obtained from the whole curve. CBF values calculated with the shortened procedure are overestimated by 17%. A correlation exists between the "10 min" CBF values and the CBF calculated from the whole curve; in spite of that, the values are not accurate for limited animal populations or for single animals. The extent of the two main compartments into which the CBF is divided was also measured. There is no correlation between CBF values and the extent of the relative compartment. This fact suggests that these two parameters correspond to different biological entities.
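
    The three curve analyses named above can be illustrated on a synthetic clearance curve. The sketch below is an assumption-laden stand-in, not the authors' implementation: it fits a two-component exponential with SciPy and computes initial-slope and height-over-area (stochastic) indices as rate constants only; converting these to absolute CBF would additionally require tissue-blood partition coefficients.

```python
# Sketch of the three analyses on a synthetic 133Xe clearance curve:
# (1) biexponential (two-component) fit, (2) initial-slope index,
# (3) stochastic (height-over-area) index. Rate constants only.
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a1, k1, a2, k2):
    return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

t = np.linspace(0, 10, 300)                                # minutes
true = biexp(t, 70, 0.9, 30, 0.15)                         # fast + slow compartments
counts = true + np.random.default_rng(1).normal(0, 1, t.size)

p, _ = curve_fit(biexp, t, counts, p0=(50, 1.0, 20, 0.1))
a1, k1, a2, k2 = p
flow_biexp = (a1 * k1 + a2 * k2) / (a1 + a2)               # flow-weighted mean rate constant

early = t < 1.0                                            # initial slope: fit log-counts over the first minute
slope = np.polyfit(t[early], np.log(counts[early]), 1)[0]
flow_islope = -slope

area = np.sum((counts[1:] + counts[:-1]) / 2 * np.diff(t)) # trapezoidal area under the curve
flow_stoch = counts[0] / area                              # height over area

print(round(flow_biexp, 3), round(flow_islope, 3), round(flow_stoch, 3))
```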

  15. Gradient-based Electrical Properties Tomography (gEPT): a Robust Method for Mapping Electrical Properties of Biological Tissues In Vivo Using Magnetic Resonance Imaging

    PubMed Central

    Liu, Jiaen; Zhang, Xiaotong; Schmitter, Sebastian; Van de Moortele, Pierre-Francois; He, Bin

    2014-01-01

    Purpose To develop high-resolution electrical properties tomography (EPT) methods and investigate a gradient-based EPT (gEPT) approach which aims to reconstruct the electrical properties (EP), including conductivity and permittivity, of an imaged sample from experimentally measured B1 maps with improved boundary reconstruction and robustness against measurement noise. Theory and Methods Using a multi-channel transmit/receive stripline head coil, with acquired B1 maps for each coil element, by assuming negligible Bz component compared to transverse B1 components, a theory describing the relationship between B1 field, EP value and their spatial gradient has been proposed. The final EP images were obtained through spatial integration over the reconstructed EP gradient. Numerical simulation, physical phantom and in vivo human experiments at 7 T have been conducted to evaluate the performance of the proposed methods. Results Reconstruction results were compared with target EP values in both simulations and phantom experiments. Human experimental results were compared with EP values in literature. Satisfactory agreement was observed with improved boundary reconstruction. Importantly, the proposed gEPT method proved to be more robust against noise when compared to previously described non-gradient-based EPT approaches. Conclusion The proposed gEPT approach holds promises to improve EP mapping quality by recovering the boundary information and enhancing robustness against noise. PMID:25213371
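
    For orientation only, the relation below is the conventional (non-gradient) EPT reconstruction derived from the homogeneous Helmholtz equation, which gradient-based approaches such as gEPT extend; the notation and sign convention (an assumed e^{jωt} time dependence) are not taken from the record, and this is not the paper's gradient formulation.

```latex
% Conventional homogeneous-Helmholtz EPT relation (context only, not gEPT itself).
% Assuming locally constant electrical properties:
\[
  \nabla^{2} B_{1}^{+} + \omega^{2}\mu_{0}\,\varepsilon_{c}\,B_{1}^{+} = 0,
  \qquad
  \varepsilon_{c} \;=\; \varepsilon - \frac{j\sigma}{\omega}
                  \;=\; -\,\frac{\nabla^{2} B_{1}^{+}}{\omega^{2}\mu_{0}\,B_{1}^{+}},
\]
% so permittivity and conductivity follow from the real and imaginary parts of the
% right-hand side. gEPT instead reconstructs the spatial gradient of the electrical
% properties from multi-channel B1 maps and integrates it, which is what restores
% boundary detail and improves robustness to noise.
```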

  16. Introducing gender equity to adolescent school children: A mixed methods' study.

    PubMed

    Syed, Saba

    2017-01-01

    Over the past decade, gender equality and women's empowerment have been explicitly recognized as key not only to the health of nations but also to social and economic development. The aim of the present study was to assess the effectiveness of a mixed-methods participatory group education approach to introduce gender equity to adolescent school children. It also assessed baseline and postintervention knowledge, attitudes, and practices regarding gender equity and sexual and reproductive health among adolescent students in government-aided schools, and finally compared pre- and post-intervention gender equitable (GE) attitudes among the study participants. A government-aided school was selected by nonprobabilistic intentional sampling. On 5 predesignated days, willing students took part in the intervention, which consisted of a pretest, a group education-based participatory mixed-methods intervention, and a posttest assessment. A total of 186 students participated in the study. Girls had better baseline GE scores than boys and also improved more over their baseline scores following the intervention. The present mixed-methods approach to introducing gender equity to adolescent school children through group education proved effective in initiating dialog and sensitizing adolescents on gender equity and violence within a school setting.

  17. Examining personalized feedback interventions for gambling disorders: A systematic review

    PubMed Central

    Marchica, Loredana; Derevensky, Jeffrey L.

    2016-01-01

    Background and aims Personalized feedback interventions (PFI) have shown success as a low-cost, scalable intervention for reducing problematic and excessive consumption of alcohol. Recently, researchers have begun to apply PFI as an intervention method for problematic gambling behaviors. A systematic review of the literature on PFI as an intervention/prevention method for gambling behaviors was performed. Methods Studies were included if they met the following criteria: the design included both a PFI group and a comparison group, and the interventions focused on gambling prevention and/or reduction. Six relevant studies were found meeting all criteria. Results Results revealed that PFI treatment groups showed decreases in a variety of gambling behaviors as compared to control groups, and that perceived norms on gambling behaviors significantly decreased after the interventions as compared to control groups. Conclusions Overall, the research suggests that while PFI applied to gambling is still in its infancy, problematic gamblers appear to benefit from programs incorporating PFIs. Further, PFI may also be used as a promising source of preventative measures for individuals displaying at-risk gambling behaviors. While evidence is still limited and additional research on PFI for gambling problems is needed, the preliminary positive results, along with the structure of PFI as a scalable and relatively inexpensive intervention method, provide promising support for future studies. PMID:28092190

  18. CALIBRATION OF SEMI-ANALYTIC MODELS OF GALAXY FORMATION USING PARTICLE SWARM OPTIMIZATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruiz, Andrés N.; Domínguez, Mariano J.; Yaryura, Yamila

    2015-03-10

    We present a fast and accurate method to select an optimal set of parameters in semi-analytic models of galaxy formation and evolution (SAMs). Our approach compares the results of a model against a set of observables applying a stochastic technique called Particle Swarm Optimization (PSO), a self-learning algorithm for localizing regions of maximum likelihood in multidimensional spaces that outperforms traditional sampling methods in terms of computational cost. We apply the PSO technique to the SAG semi-analytic model combined with merger trees extracted from a standard Lambda Cold Dark Matter N-body simulation. The calibration is performed using a combination of observed galaxy properties as constraints, including the local stellar mass function and the black hole to bulge mass relation. We test the ability of the PSO algorithm to find the best set of free parameters of the model by comparing the results with those obtained using a MCMC exploration. Both methods find the same maximum likelihood region; however, the PSO method requires one order of magnitude fewer evaluations. This new approach allows a fast estimation of the best-fitting parameter set in multidimensional spaces, providing a practical tool to test the consequences of including other astrophysical processes in SAMs.
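
    The PSO idea described above is easy to sketch. The snippet below is a minimal, generic PSO loop under stated assumptions: the objective is a toy quadratic standing in for the model-versus-observables likelihood, and the inertia/cognitive/social weights are arbitrary illustrative values, not the calibration used with SAG.

```python
# Minimal particle swarm optimization (PSO) sketch for parameter calibration;
# the objective here is a stand-in (a simple quadratic), not a real SAM likelihood.
import numpy as np

def neg_log_like(theta):
    """Toy objective: squared distance of parameters from a hidden 'true' vector."""
    return np.sum((theta - np.array([0.3, -1.2, 2.0])) ** 2)

rng = np.random.default_rng(42)
n_particles, n_dim, n_iter = 30, 3, 200
w, c1, c2 = 0.7, 1.5, 1.5                       # inertia, cognitive, social weights

x = rng.uniform(-5, 5, (n_particles, n_dim))    # particle positions
v = np.zeros_like(x)                            # particle velocities
pbest, pbest_f = x.copy(), np.array([neg_log_like(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, n_dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    f = np.array([neg_log_like(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print(np.round(gbest, 3))                       # should approach [0.3, -1.2, 2.0]
```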

  19. Evaluation of Patient Handoff Methods on an Inpatient Teaching Service

    PubMed Central

    Craig, Steven R.; Smith, Hayden L.; Downen, A. Matthew; Yost, W. John

    2012-01-01

    Background The patient handoff process can be a highly variable and unstructured period at risk for communication errors. The morning sign-in process used by resident physicians at teaching hospitals typically involves less rigorous handoff protocols than the resident evening sign-out process. Little research has been conducted on best practices for handoffs during morning sign-in exchanges between resident physicians. Research must evaluate optimal protocols for the resident morning sign-in process. Methods Three morning handoff protocols consisting of written, electronic, and face-to-face methods were implemented over 3 study phases during an academic year. Study participants included all interns covering the internal medicine inpatient teaching service at a tertiary hospital. Study measures entailed intern survey-based interviews analyzed for failures in handoff protocols with or without missed pertinent information. Descriptive and comparative analyses examined study phase differences. Results A scheduled face-to-face handoff process had the fewest protocol deviations and demonstrated best communication of essential patient care information between cross-covering teams compared to written and electronic sign-in protocols. Conclusion Intern patient handoffs were more reliable when the sign-in protocol included scheduled face-to-face meetings. This method provided the best communication of patient care information and allowed for open exchanges of information. PMID:23267259

  20. Modeling nonlinear ultrasound propagation in heterogeneous media with power law absorption using a k-space pseudospectral method.

    PubMed

    Treeby, Bradley E; Jaros, Jiri; Rendell, Alistair P; Cox, B T

    2012-06-01

    The simulation of nonlinear ultrasound propagation through tissue realistic media has a wide range of practical applications. However, this is a computationally difficult problem due to the large size of the computational domain compared to the acoustic wavelength. Here, the k-space pseudospectral method is used to reduce the number of grid points required per wavelength for accurate simulations. The model is based on coupled first-order acoustic equations valid for nonlinear wave propagation in heterogeneous media with power law absorption. These are derived from the equations of fluid mechanics and include a pressure-density relation that incorporates the effects of nonlinearity, power law absorption, and medium heterogeneities. The additional terms accounting for convective nonlinearity and power law absorption are expressed as spatial gradients making them efficient to numerically encode. The governing equations are then discretized using a k-space pseudospectral technique in which the spatial gradients are computed using the Fourier-collocation method. This increases the accuracy of the gradient calculation and thus relaxes the requirement for dense computational grids compared to conventional finite difference methods. The accuracy and utility of the developed model is demonstrated via several numerical experiments, including the 3D simulation of the beam pattern from a clinical ultrasound probe.
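
    The core ingredient named above, the Fourier-collocation computation of spatial gradients, can be shown in a few lines. The sketch below is only that ingredient under stated assumptions: a 1-D field and the plain spectral derivative; the full k-space method's temporal (sinc) correction, nonlinearity, and power law absorption terms are omitted.

```python
# Sketch of the Fourier-collocation gradient at the heart of k-space pseudospectral
# schemes: spatial derivatives are computed by multiplying the FFT of the field by i*k.
import numpy as np

N, dx = 256, 1e-4                                # grid points, spacing [m]
x = np.arange(N) * dx
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)          # angular wavenumbers

p = np.exp(-((x - x.mean()) / (10 * dx)) ** 2)   # smooth pressure pulse
dp_dx = np.real(np.fft.ifft(1j * k * np.fft.fft(p)))

# Compare against a second-order finite difference of the same field.
dp_fd = np.gradient(p, dx)
print(np.max(np.abs(dp_dx - dp_fd)))             # spectral vs. finite-difference gradient
```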

  1. Study of the integration of wind tunnel and computational methods for aerodynamic configurations

    NASA Technical Reports Server (NTRS)

    Browne, Lindsey E.; Ashby, Dale L.

    1989-01-01

    A study was conducted to determine the effectiveness of using a low-order panel code to estimate wind tunnel wall corrections. The corrections were found by two computations. The first computation included the test model and the surrounding wind tunnel walls, while in the second computation the wind tunnel walls were removed. The difference between the force and moment coefficients obtained by comparing these two cases allowed the determination of the wall corrections. The technique was verified by matching the test-section wall-pressure signature from a wind tunnel test with the signature predicted by the panel code. To prove the viability of the technique, two cases were considered. The first was a two-dimensional high-lift wing with a flap that was tested in the 7- by 10-foot wind tunnel at NASA Ames Research Center. The second was a 1/32-scale model of the F/A-18 aircraft which was tested in the low-speed wind tunnel at San Diego State University. The panel code used was PMARC (Panel Method Ames Research Center). Results of this study indicate that the proposed wind tunnel wall correction method is comparable to other methods and that it also inherently includes the corrections due to model blockage and wing lift.
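
    The correction logic described in the record reduces to a coefficient difference between the two panel-code runs. The numbers in the sketch below are invented placeholders, not values from the study.

```python
# Sketch of the correction arithmetic: run the panel code twice (with and without
# tunnel walls), take the coefficient difference, and subtract it from the measured
# wind tunnel value. All numbers are made up for illustration.
CL_panel_walls = 1.52      # panel-code lift coefficient, walls modeled
CL_panel_free = 1.44       # panel-code lift coefficient, walls removed
CL_measured = 1.50         # wind tunnel measurement

delta_wall = CL_panel_walls - CL_panel_free    # wall (plus blockage/lift) interference
CL_corrected = CL_measured - delta_wall
print(round(delta_wall, 3), round(CL_corrected, 3))
```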

  2. Molecular epidemiology of mastitis pathogens of dairy cattle and comparative relevance to humans.

    PubMed

    Zadoks, Ruth N; Middleton, John R; McDougall, Scott; Katholm, Jorgen; Schukken, Ynte H

    2011-12-01

    Mastitis, inflammation of the mammary gland, can be caused by a wide range of organisms, including gram-negative and gram-positive bacteria, mycoplasmas and algae. Many microbial species that are common causes of bovine mastitis, such as Escherichia coli, Klebsiella pneumoniae, Streptococcus agalactiae and Staphylococcus aureus also occur as commensals or pathogens of humans whereas other causative species, such as Streptococcus uberis, Streptococcus dysgalactiae subsp. dysgalactiae or Staphylococcus chromogenes, are almost exclusively found in animals. A wide range of molecular typing methods have been used in the past two decades to investigate the epidemiology of bovine mastitis at the subspecies level. These include comparative typing methods that are based on electrophoretic banding patterns, library typing methods that are based on the sequence of selected genes, virulence gene arrays and whole genome sequencing projects. The strain distribution of mastitis pathogens has been investigated within individual animals and across animals, herds, countries and host species, with consideration of the mammary gland, other animal or human body sites, and environmental sources. Molecular epidemiological studies have contributed considerably to our understanding of sources, transmission routes, and prognosis for many bovine mastitis pathogens and to our understanding of mechanisms of host-adaptation and disease causation. In this review, we summarize knowledge gleaned from two decades of molecular epidemiological studies of mastitis pathogens in dairy cattle and discuss aspects of comparative relevance to human medicine.

  3. Registration area and accuracy when integrating laser-scanned and maxillofacial cone-beam computed tomography images.

    PubMed

    Sun, LiJun; Hwang, Hyeon-Shik; Lee, Kyung-Min

    2018-03-01

    The purpose of this study was to examine changes in registration accuracy after including occlusal surface and incisal edge areas in addition to the buccal surface when integrating laser-scanned and maxillofacial cone-beam computed tomography (CBCT) dental images. CBCT scans and maxillary dental casts were obtained from 30 patients. Three methods were used to integrate the images: R1, only the buccal and labial surfaces were used; R2, the incisal edges of the anterior teeth and the buccal and distal marginal ridges of the second molars were used; and R3, labial surfaces, including incisal edges of anterior teeth, and buccal surfaces, including buccal and distal marginal ridges of the second molars, were used. Differences between the 2 images were evaluated by color-mapping methods and average surface distances by measuring the 3-dimensional Euclidean distances between the surface points on the 2 images. The R1 method showed more discrepancies between the laser-scanned and CBCT images than did the other methods. The R2 method did not show a significant difference in registration accuracy compared with the R3 method. The results of this study indicate that accuracy when integrating laser-scanned dental images into maxillofacial CBCT images can be increased by including occlusal surface and incisal edge areas as registration areas. Copyright © 2017 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
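
    The "average surface distance" measure used above, the mean 3-D Euclidean distance from each point of one surface to its nearest neighbour on the other, can be sketched directly. The point clouds below are synthetic and assume the two surfaces have already been registered.

```python
# Sketch of the surface-distance measure: for each vertex of the laser-scanned
# surface, find the nearest point on the CBCT-derived surface and average the
# 3-D Euclidean distances. Point clouds here are synthetic.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
scan = rng.random((5000, 3))                    # registered laser-scan vertices (synthetic)
cbct = scan + rng.normal(0, 0.002, scan.shape)  # CBCT surface points, slightly offset

d, _ = cKDTree(cbct).query(scan)                # nearest-neighbour distances
print(f"mean surface distance: {d.mean():.4f}, max: {d.max():.4f}")
```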

  4. Comparison of Three Biomass Sampling Techniques on Submersed Aquatic Plants in a Northern Tier Lake

    DTIC Science & Technology

    2010-07-01

    distribution in 3 out of 14 species when comparing the box-core sampler and the rake method. These included forked duckweed (Lemna trisulca L, p...each site did not exhibit differences. These included coontail (p=0.2949), muskgrass (p=0.2746), American elodea (p=0.7622), forked duckweed (p...collected by the PVC-core sampler. These included coontail (p=0.000), chara (p=0.0219), American elodea (p=0.0061), forked duckweed (p=0.0000), najas (p

  5. Prioritization of candidate disease genes by topological similarity between disease and protein diffusion profiles.

    PubMed

    Zhu, Jie; Qin, Yufang; Liu, Taigang; Wang, Jun; Zheng, Xiaoqi

    2013-01-01

    Identification of gene-phenotype relationships is a fundamental challenge in human health research. Based on the observation that genes causing the same or similar phenotypes tend to correlate with each other in the protein-protein interaction network, many network-based approaches have been proposed based on different underlying models. A recent comparative study showed that diffusion-based methods achieve state-of-the-art predictive performance. In this paper, a new diffusion-based method was proposed to prioritize candidate disease genes. The diffusion profile of a disease was defined as the stationary distribution of candidate genes given a random walk with restart in which similarities between phenotypes are incorporated. Then, candidate disease genes are prioritized by comparing their diffusion profiles with that of the disease. Finally, the effectiveness of our method was demonstrated through leave-one-out cross-validation against control genes from artificial linkage intervals and randomly chosen genes. Comparative study showed that our method achieves improved performance compared to some classical diffusion-based methods. To further illustrate our method, we used our algorithm to predict new causative genes for 16 multifactorial diseases, including prostate cancer and Alzheimer's disease, and the top predictions were in good agreement with literature reports. Our study indicates that integration of multiple information sources, especially the phenotype similarity profile data, and the introduction of a global similarity measure between disease and gene diffusion profiles are helpful for prioritizing candidate disease genes. Programs and data are available upon request.
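
    The random-walk-with-restart diffusion profile at the heart of the record can be sketched on a toy network. The 5-node graph, seed genes, restart probability, and the correlation-based profile comparison below are all illustrative assumptions, not the paper's data or exact similarity measure.

```python
# Sketch of the diffusion-profile idea: a random walk with restart (RWR) on a
# protein network gives a stationary distribution for a seed set; candidate genes
# are then ranked by how similar their own profiles are to the disease profile.
import numpy as np

A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 0, 0],
              [0, 1, 0, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
W = A / A.sum(axis=0, keepdims=True)             # column-normalised transition matrix

def rwr(seed, restart=0.3, tol=1e-10):
    """Stationary distribution of a random walk with restart from `seed`."""
    p = seed / seed.sum()
    e = p.copy()
    for _ in range(10_000):
        p_new = (1 - restart) * W @ p + restart * e
        if np.abs(p_new - p).sum() < tol:
            break
        p = p_new
    return p

disease_profile = rwr(np.array([1.0, 0, 1.0, 0, 0]))   # known disease genes as seeds
for gene in range(5):
    candidate_profile = rwr(np.eye(5)[gene])
    sim = np.corrcoef(disease_profile, candidate_profile)[0, 1]
    print(f"gene {gene}: similarity {sim:.3f}")
```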

  6. Multimethod Investigation of Interpersonal Functioning in Borderline Personality Disorder

    PubMed Central

    Stepp, Stephanie D.; Hallquist, Michael N.; Morse, Jennifer Q.; Pilkonis, Paul A.

    2011-01-01

    Even though interpersonal functioning is of great clinical importance for patients with borderline personality disorder (BPD), the comparative validity of different assessment methods for interpersonal dysfunction has not yet been tested. This study examined multiple methods of assessing interpersonal functioning, including self- and other-reports, clinical ratings, electronic diaries, and social cognitions in three groups of psychiatric patients (N=138): patients with (1) BPD, (2) another personality disorder, and (3) Axis I psychopathology only. Using dominance analysis, we examined the predictive validity of each method in detecting changes in symptom distress and social functioning six months later. Across multiple methods, the BPD group often reported higher interpersonal dysfunction scores compared to other groups. Predictive validity results demonstrated that self-report and electronic diary ratings were the most important predictors of distress and social functioning. Our findings suggest that self-report scores and electronic diary ratings have high clinical utility, as these methods appear most sensitive to change. PMID:21808661

  7. Semi-supervised vibration-based classification and condition monitoring of compressors

    NASA Astrophysics Data System (ADS)

    Potočnik, Primož; Govekar, Edvard

    2017-09-01

    Semi-supervised vibration-based classification and condition monitoring of the reciprocating compressors installed in refrigeration appliances is proposed in this paper. The method addresses the problem of industrial condition monitoring where prior class definitions are often not available or difficult to obtain from local experts. The proposed method combines feature extraction, principal component analysis, and statistical analysis for the extraction of initial class representatives, and compares the capability of various classification methods, including discriminant analysis (DA), neural networks (NN), support vector machines (SVM), and extreme learning machines (ELM). The use of the method is demonstrated on a case study which was based on industrially acquired vibration measurements of reciprocating compressors during the production of refrigeration appliances. The paper presents a comparative qualitative analysis of the applied classifiers, confirming the good performance of several nonlinear classifiers. If the model parameters are properly selected, then very good classification performance can be obtained from NN trained by Bayesian regularization, SVM and ELM classifiers. The method can be effectively applied for the industrial condition monitoring of compressors.
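
    The comparative-classification step described above (feature extraction, PCA, then several classifiers) maps onto a standard scikit-learn pipeline. The sketch below uses synthetic data in place of vibration features, compares only DA, NN, and SVM (ELM has no standard scikit-learn implementation), and makes no claim about the authors' parameter choices.

```python
# Sketch of the comparative-classification step: reduce features with PCA and
# compare several classifiers by cross-validation on synthetic stand-in data.
from sklearn.datasets import make_classification
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=400, n_features=40, n_informative=8,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

classifiers = {
    "DA":  LinearDiscriminantAnalysis(),
    "NN":  MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    "SVM": SVC(kernel="rbf", C=1.0),
}
for name, clf in classifiers.items():
    pipe = make_pipeline(StandardScaler(), PCA(n_components=10), clf)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```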

  8. Scoring from Contests

    PubMed Central

    Penn, Elizabeth Maggie

    2014-01-01

    This article presents a new model for scoring alternatives from “contest” outcomes. The model is a generalization of the method of paired comparison to accommodate comparisons between arbitrarily sized sets of alternatives in which outcomes are any division of a fixed prize. Our approach is also applicable to contests between varying quantities of alternatives. We prove that under a reasonable condition on the comparability of alternatives, there exists a unique collection of scores that produces accurate estimates of the overall performance of each alternative and satisfies a well-known axiom regarding choice probabilities. We apply the method to several problems in which varying choice sets and continuous outcomes may create problems for standard scoring methods. These problems include measuring centrality in network data and the scoring of political candidates via a “feeling thermometer.” In the latter case, we also use the method to uncover and solve a potential difficulty with common methods of rescaling thermometer data to account for issues of interpersonal comparability. PMID:24748759
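
    For context, the sketch below shows only the classical two-alternative paired-comparison special case (a Bradley-Terry model fitted with the standard minorization-maximization update); the article's model generalizes this to arbitrarily sized contestant sets and continuously divided prizes, which is not reproduced here. The win counts are toy data.

```python
# Classical paired-comparison scoring (Bradley-Terry) via the standard MM update;
# the article's contest model generalizes this setting.
import numpy as np

# wins[i, j] = number of contests in which alternative i beat alternative j (toy data)
wins = np.array([[0, 6, 4],
                 [2, 0, 5],
                 [1, 3, 0]], dtype=float)
n = wins + wins.T                        # contests played between each pair
scores = np.ones(3)

for _ in range(1000):
    new = np.zeros_like(scores)
    for i in range(3):
        denom = sum(n[i, j] / (scores[i] + scores[j]) for j in range(3) if j != i)
        new[i] = wins[i].sum() / denom
    scores = new / new.sum()             # normalise so the scores sum to one
print(np.round(scores, 3))
```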

  9. Systematic Comparison of Label-Free, Metabolic Labeling, and Isobaric Chemical Labeling for Quantitative Proteomics on LTQ Orbitrap Velos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Zhou; Adams, Rachel M; Chourey, Karuna

    2012-01-01

    A variety of quantitative proteomics methods have been developed, including label-free, metabolic labeling, and isobaric chemical labeling using iTRAQ or TMT. Here, these methods were compared in terms of the depth of proteome coverage, quantification accuracy, precision, and reproducibility using a high-performance hybrid mass spectrometer, LTQ Orbitrap Velos. Our results show that (1) the spectral counting method provides the deepest proteome coverage for identification, but its quantification performance is worse than labeling-based approaches, especially the quantification reproducibility; (2) metabolic labeling and isobaric chemical labeling are capable of accurate, precise, and reproducible quantification and provide deep proteome coverage for quantification. Isobaric chemical labeling surpasses metabolic labeling in terms of quantification precision and reproducibility; (3) iTRAQ and TMT perform similarly in all aspects compared in the current study using a CID-HCD dual scan configuration. Based on the unique advantages of each method, we provide guidance for selection of the appropriate method for a quantitative proteomics study.

  10. An Investigation of Two Finite Element Modeling Solutions for Biomechanical Simulation Using a Case Study of a Mandibular Bone.

    PubMed

    Liu, Yun-Feng; Fan, Ying-Ying; Dong, Hui-Yue; Zhang, Jian-Xing

    2017-12-01

    The method used in biomechanical modeling for finite element method (FEM) analysis needs to deliver accurate results. There are currently two solutions used in FEM modeling of biomedical models of human bone from computerized tomography (CT) images: one is based on a triangular mesh, and the other, more popular in practice, is based on a parametric surface model. The outline and modeling procedures for the two solutions are compared and analyzed. Using a mandibular bone as an example, several key modeling steps are then discussed in detail, and the FEM calculation is carried out. Numerical results from the models derived by the two methods, including stress, strain, and displacement, are compared and evaluated with respect to accuracy and validity. Moreover, a comprehensive comparison of the two solutions is presented. The parametric-surface-based method is more helpful when using powerful design tools in computer-aided design (CAD) software, but the triangular-mesh-based method is more robust and efficient.

  11. Point cloud registration from local feature correspondences-Evaluation on challenging datasets.

    PubMed

    Petricek, Tomas; Svoboda, Tomas

    2017-01-01

    Registration of laser scans, or point clouds in general, is a crucial step of localization and mapping with mobile robots or in object modeling pipelines. A coarse alignment of the point clouds is generally needed before applying local methods such as the Iterative Closest Point (ICP) algorithm. We propose a feature-based approach to point cloud registration and evaluate the proposed method and its individual components on challenging real-world datasets. For a moderate overlap between the laser scans, the method provides a superior registration accuracy compared to state-of-the-art methods including Generalized ICP, 3D Normal-Distribution Transform, Fast Point-Feature Histograms, and 4-Points Congruent Sets. Compared to the surface normals, the points as the underlying features yield higher performance in both keypoint detection and establishing local reference frames. Moreover, sign disambiguation of the basis vectors proves to be an important aspect in creating repeatable local reference frames. A novel method for sign disambiguation is proposed which yields highly repeatable reference frames.

  12. Quantification and characterisation of fatty acid methyl esters in microalgae: Comparison of pretreatment and purification methods.

    PubMed

    Lage, Sandra; Gentili, Francesco G

    2018-06-01

    A systematic qualitative and quantitative analysis of fatty acid methyl esters (FAMEs) is crucial for microalgae species selection for biodiesel production. The aim of this study is to identify the best method to assess microalgae FAMEs composition and content. A single-step method was tested with and without purification steps, that is, separation of lipid classes by thin-layer chromatography (TLC) or solid-phase extraction (SPE). The efficiency of a direct transesterification method was also evaluated. Additionally, the FAMEs yields and profiles of microalgae samples subjected to different pretreatments (boiling in isopropanol, freezing, oven-drying, and freeze-drying) were compared. The application of a purification step after lipid extraction proved to be essential for an accurate FAMEs characterisation. The purification methods, which included TLC and SPE, provided superior results compared to not purifying the samples. Freeze-dried microalgae produced the lowest FAMEs yield. However, FAMEs profiles were generally equivalent among the pretreatments. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Comparing different methods for fast screening of microbiological quality of beach sand aimed at rapid-response remediation.

    PubMed

    Testolin, Renan C; Almeida, Tito C M; Polette, Marcus; Branco, Joaquim O; Fischer, Larissa L; Niero, Guilherme; Poyer-Radetski, Gabriel; Silva, Valéria C; Somensi, Cleder A; Corrêa, Albertina X R; Corrêa, Rogério; Rörig, Leonardo R; Itokazu, Ana Gabriela; Férard, Jean-François; Cotelle, Sylvie; Radetski, Claudemir M

    2017-05-15

    There is scientific evidence that beach sands are a significant contributor to the pathogen load to which visitors are exposed. To develop beach quality guidelines all beach zones must be included in microbiological evaluations, but monitoring methods for beach sand quality are relatively longstanding, expensive, laborious and require moderate laboratory infrastructure. This paper aimed to evaluate microbial activity in different beach zones by applying and comparing a classical membrane filtration (MF) method with two colorimetric screening methods based on fluorescein (FDA) and tetrazolium (TTC) salt biotransformation, in order to assess a new rapid and low-cost method for beach sand microbiological contamination assessment. The colorimetric results can help beach managers to evaluate rapidly and at low cost the microbiological quality of different beach zones in order to decide whether remedial actions need to be adopted to prevent exposure of the public to microbes due to beach sand and/or water contamination. Copyright © 2017. Published by Elsevier Ltd.

  14. Ship Detection Based on Multiple Features in Random Forest Model for Hyperspectral Images

    NASA Astrophysics Data System (ADS)

    Li, N.; Ding, L.; Zhao, H.; Shi, J.; Wang, D.; Gong, X.

    2018-04-01

    A novel ship detection method is proposed that aims to make full use of both the spatial and spectral information in hyperspectral images. Firstly, a band with a high signal-to-noise ratio in the near-infrared or short-wave infrared range is used to segment land and sea with the Otsu threshold segmentation method. Secondly, multiple features, including spectral and texture features, are extracted from the hyperspectral images: principal component analysis (PCA) is used to extract spectral features, and the Grey Level Co-occurrence Matrix (GLCM) is used to extract texture features. Finally, a Random Forest (RF) model is used to detect ships based on the extracted features. To illustrate the effectiveness of the method, we carry out experiments on EO-1 data, comparing a single feature against different combinations of multiple features. Compared with the traditional single-feature method and a Support Vector Machine (SVM) model, the proposed method stably detects ships against complex backgrounds and effectively improves detection accuracy.
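
    The pipeline above (Otsu sea/land mask, PCA spectral features, texture features, Random Forest) can be sketched end to end. The version below is only a schematic under stated assumptions: the hyperspectral cube and labels are synthetic, the last band is assumed to be the high-SNR band, and a simple local-variance measure stands in for GLCM texture statistics.

```python
# Schematic pipeline: Otsu sea/land masking on a high-SNR band, PCA spectral
# features, a local-variance texture stand-in for GLCM, and a Random Forest.
import numpy as np
from skimage.filters import threshold_otsu
from scipy.ndimage import uniform_filter
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)
cube = rng.random((64, 64, 30))                     # synthetic hyperspectral cube (rows, cols, bands)
nir = cube[..., -1]                                 # assume the last band is the high-SNR NIR band
sea_mask = nir < threshold_otsu(nir)                # Otsu split into sea / land

spectral = PCA(n_components=5).fit_transform(cube.reshape(-1, 30))
texture = (uniform_filter(nir ** 2, 5) - uniform_filter(nir, 5) ** 2).ravel()
features = np.column_stack([spectral, texture])

labels = rng.integers(0, 2, features.shape[0])       # placeholder ship / background labels
rf = RandomForestClassifier(n_estimators=100, random_state=0)
rf.fit(features[sea_mask.ravel()], labels[sea_mask.ravel()])
print(rf.score(features[sea_mask.ravel()], labels[sea_mask.ravel()]))
```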

  15. A Comparison of seismic instrument noise coherence analysis techniques

    USGS Publications Warehouse

    Ringler, A.T.; Hutt, C.R.; Evans, J.R.; Sandoval, L.D.

    2011-01-01

    The self-noise of a seismic instrument is a fundamental characteristic used to evaluate the quality of the instrument. It is important to be able to measure this self-noise robustly, to understand how differences among test configurations affect the tests, and to understand how different processing techniques and isolation methods (from nonseismic sources) can contribute to differences in results. We compare two popular coherence methods used for calculating incoherent noise, which is widely used as an estimate of instrument self-noise (incoherent noise and self-noise are not strictly identical but in observatory practice are approximately equivalent; Holcomb, 1989; Sleeman et al., 2006). Beyond directly comparing these two coherence methods on similar models of seismometers, we compare how small changes in test conditions can contribute to incoherent-noise estimates. These conditions include timing errors, signal-to-noise ratio changes (ratios between background noise and instrument incoherent noise), relative sensor locations, misalignment errors, processing techniques, and different configurations of sensor types.
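
    A minimal sketch of a two-sensor coherence-based incoherent-noise estimate (in the spirit of Holcomb, 1989) is shown below: power not coherent between two co-located sensors is attributed to instrument self-noise. The signals are synthetic, the formula shown is the generic two-sensor estimate rather than either specific method compared in the paper, and real use would require instrument-response correction and careful windowing.

```python
# Two-sensor incoherent-noise sketch: estimate the power in sensor 1 that is
# not coherent with a co-located sensor 2, using Welch spectra and cross-spectra.
import numpy as np
from scipy.signal import welch, csd

fs, n = 100.0, 2 ** 16
rng = np.random.default_rng(11)
ground = np.cumsum(rng.normal(size=n))              # common "seismic" signal
s1 = ground + 0.5 * rng.normal(size=n)              # sensor 1 = signal + its own noise
s2 = ground + 0.5 * rng.normal(size=n)              # sensor 2 = signal + its own noise

f, p11 = welch(s1, fs=fs, nperseg=4096)
_, p22 = welch(s2, fs=fs, nperseg=4096)
_, p12 = csd(s1, s2, fs=fs, nperseg=4096)

coh2 = np.abs(p12) ** 2 / (p11 * p22)               # magnitude-squared coherence
noise1 = p11 * (1.0 - coh2)                         # incoherent power attributed to sensor 1
print(noise1[:5])
```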

  16. An investigation of the impact of using different methods for network meta-analysis: a protocol for an empirical evaluation.

    PubMed

    Karahalios, Amalia Emily; Salanti, Georgia; Turner, Simon L; Herbison, G Peter; White, Ian R; Veroniki, Areti Angeliki; Nikolakopoulou, Adriani; Mckenzie, Joanne E

    2017-06-24

    Network meta-analysis, a method to synthesise evidence from multiple treatments, has increased in popularity in the past decade. Two broad approaches are available to synthesise data across networks, namely, arm- and contrast-synthesis models, with a range of models that can be fitted within each. There has been recent debate about the validity of the arm-synthesis models, but to date, there has been limited empirical evaluation comparing results using the methods applied to a large number of networks. We aim to address this gap through the re-analysis of a large cohort of published networks of interventions using a range of network meta-analysis methods. We will include a subset of networks from a database of network meta-analyses of randomised trials that have been identified and curated from the published literature. The subset of networks will include those where the primary outcome is binary, the number of events and participants are reported for each direct comparison, and there is no evidence of inconsistency in the network. We will re-analyse the networks using three contrast-synthesis methods and two arm-synthesis methods. We will compare the estimated treatment effects, their standard errors, treatment hierarchy based on the surface under the cumulative ranking (SUCRA) curve, the SUCRA value, and the between-trial heterogeneity variance across the network meta-analysis methods. We will investigate whether differences in the results are affected by network characteristics and baseline risk. The results of this study will inform whether, in practice, the choice of network meta-analysis method matters, and if it does, in what situations differences in the results between methods might arise. The results from this research might also inform future simulation studies.

  17. Meta-analysis of haplotype-association studies: comparison of methods and empirical evaluation of the literature

    PubMed Central

    2011-01-01

    Background Meta-analysis is a popular methodology in several fields of medical research, including genetic association studies. However, the methods used for meta-analysis of association studies that report haplotypes have not been studied in detail. In this work, methods for performing meta-analysis of haplotype association studies are summarized, compared and presented in a unified framework along with an empirical evaluation of the literature. Results We present multivariate methods that use summary-based data as well as methods that use binary and count data in a generalized linear mixed model framework (logistic regression, multinomial regression and Poisson regression). The methods presented here avoid the inflation of the type I error rate that could result from the traditional approach of comparing a haplotype against the remaining ones, and they can be fitted using standard software. Moreover, formal global tests are presented for assessing the statistical significance of the overall association. Although the methods presented here assume that the haplotypes are directly observed, they can be easily extended to allow for such uncertainty by weighting the haplotypes by their probability. Conclusions An empirical evaluation of the published literature and a comparison against meta-analyses that use single nucleotide polymorphisms suggest that studies reporting meta-analyses of haplotypes include approximately half as many studies and produce significant results twice as often. We show that this excess of statistically significant results stems from the sub-optimal method of analysis used and, in approximately half of the cases, the statistical significance is refuted if the data are properly re-analyzed. Illustrative examples of code are given in Stata and it is anticipated that the methods developed in this work will be widely applied in the meta-analysis of haplotype association studies. PMID:21247440

  18. Does counselling improve uptake of long-term and permanent contraceptive methods in a high HIV-prevalence setting?

    PubMed Central

    Dudley, Lilian; Makumucha, Courage; Dlamini, Phatisizwe; Moyo, Sihle; Bhembe, Sibongiseni

    2015-01-01

    Abstract Background Studies have shown a reduced uptake of contraceptive methods in HIV-positive women of childbearing age, mainly because of unmet needs that may be a result of poor promotion of available methods of contraception, especially long-term and permanent methods (LTPM). Aim To compare the uptake of contraceptive methods, and particularly LTPM, by HIV-positive and HIV-negative post-partum mothers, and to assess the effects of counselling on contraceptive choices. Setting Three government district hospitals in Swaziland. Methods Interviews were conducted using a structured questionnaire, before and after counselling HIV-negative and HIV-positive post-partum women in LTPM use, unintended pregnancy rates, future fertility and reasons for contraceptive choices. Results A total of 711 women, of whom half were HIV-positive, participated in the study. Most (72.3% HIV-negative and 84% HIV-positive) were on modern methods of contraception, with the majority using 2-monthly and 3-monthly injectables. Intended use of any contraceptive increased to 99% after counselling. LTPM use was 7.0% in HIV-negative mothers and 15.3% in HIV-positive mothers before counselling, compared with 41.3% and 42.4% in HIV-negative and HIV-positive mothers, respectively, after counselling. Pregnancy intentions and counselling on future fertility were significantly associated with current use of contraception, whilst current LTPM use and level of education were significantly associated with LTPM post-counselling. Conclusion Counselling on all methods including LTPM reduced unmet needs in contraception in HIV-positive and HIV-negative mothers and could improve contraceptive uptake and reduce unintended pregnancies. Health workers do not always remember to include LTPM when they counsel clients, which could result in a low uptake of these methods. Further experimental studies should be conducted to validate these results. PMID:26842525

  19. Modified Peritoneal Dialysis Catheter Insertion: Comparison with a Conventional Method

    PubMed Central

    Lee, Yong Kyu; Yang, Pil-Sung; Park, Kyoung Sook; Choi, Kyu Hun

    2015-01-01

    Purpose The conventional trocar and cannula method of peritoneal dialysis (PD) catheter insertion has limitations in the clinical setting. The aim of this study was to compare a modified method for percutaneous PD catheter insertion with the conventional method, and demonstrate the advantages of the modified method. Materials and Methods Patients at a single center who had percutaneous PD catheters inserted by nephrologists from January 2006 until September 2012, using either a modified method (group M) or the conventional trocar and cannula method (group C), were retrospectively analyzed, in terms of baseline characteristics, complications experienced up to 3 months after the procedure, and the suitability of the procedure for patients. Results Group M included 82 subjects, while group C included 66 cases. The overall early complication rate in group M (1.2%) was significantly lower than that in group C (19.7%) (p<0.001). The catheter revision rate during the timeframe for early complications was significantly lower in group M (0%) than in group C (6.1%) (p=0.024). Procedure time (1 h 3 min±16 min vs. 1 h 36 min±19 min, p<0.01), immediate post-procedural pain (2.43±1.80 vs. 3.14±2.07, p<0.05), and post-procedure days until ambulation (3.95±1.13 days vs. 6.17±1.34 days, p<0.01) were all significantly lower in group M than in group C. There was no significant difference in total hospitalization period (14.71±7.05 days vs. 13.86±3.7 days). Conclusion Our modified PD catheter insertion method shows advantages in early complication rate, early revision rate, and patient convenience. PMID:26069120

  20. Geometrically derived difference formulae for the numerical integration of trajectory problems

    NASA Technical Reports Server (NTRS)

    Mcleod, R. J. Y.; Sanz-Serna, J. M.

    1981-01-01

    The term 'trajectory problem' is taken to include problems that can arise, for instance, in connection with contour plotting, or in the application of continuation methods, or during phase-plane analysis. Geometrical techniques are used to construct difference methods for these problems to produce in turn explicit and implicit circularly exact formulae. Based on these formulae, a predictor-corrector method is derived which, when compared with a closely related standard method, shows improved performance. It is found that this latter method produces spurious limit cycles, and this behavior is partly analyzed. Finally, a simple variable-step algorithm is constructed and tested.
